On Friday, 25 June 2021 14:58:03 (CEST) Simon Avery wrote:

> No - the error happened today during the major update.


I can't reproduce your issue. For me the package installs without that warning.


Is there anything particular about that system? The complaint looks like a problem with the keystore.


Can anyone reproduce that problem?
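
In case it helps anyone debugging this, here is roughly how I would inspect the keystore and retry the import by hand (only a sketch; the keyring path is the one from the failing %post scriptlet quoted below):

```shell
# List the GPG public keys that rpm currently knows about
rpm -q gpg-pubkey --qf '%{NAME}-%{VERSION}-%{RELEASE}\t%{SUMMARY}\n'

# Inspect the keyring file itself without importing anything
gpg --show-keys /usr/lib/uyuni/uyuni-build-keys.gpg

# Retry the import manually and watch for errors
rpm --import /usr/lib/uyuni/uyuni-build-keys.gpg && echo "import OK"
```

If the manual import fails as well, that points at a damaged rpm database or keystore on that particular system rather than at the package.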


> Additionally, when I try to register or re-register clients (CentOS 7, for
> example), it fails and salt/minion logs this, which feels related:

>

> Can I add this key manually?


I think your problem is not the GPG key, but the SSL certificate installed at the Proxy. Did you follow the instructions to deploy the proxy, or are you using your own SSL certificates?
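
As a quick check (a sketch only, using the hostname from your log; adjust paths to your setup), you can compare what the server actually presents with what curl trusts:

```shell
# Show subject, issuer and validity of the certificate the server presents
openssl s_client -connect ata-oxy-uyuni01.atass.com:443 -showcerts </dev/null 2>/dev/null \
  | openssl x509 -noout -subject -issuer -dates

# Verify against the system CA bundle; exit code 60 is the same
# "issuer is not recognized" failure from your salt log
curl -sS -o /dev/null https://ata-oxy-uyuni01.atass.com/pub/; echo "curl exit: $?"

# Verify against the CA certificate Uyuni usually deploys on clients
# (the usual location; adjust if yours differs)
curl -sS -o /dev/null --cacert /usr/share/rhn/RHN-ORG-TRUSTED-SSL-CERT \
  https://ata-oxy-uyuni01.atass.com/pub/; echo "curl exit: $?"
```

If the last call succeeds while the first one fails, the chain on the Proxy itself is fine and the CA is simply missing from the client's trust store.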


>

>

> 2021-06-25 13:43:32,162 [salt.loaded.int.module.cmdmod:842 ][ERROR   ][10644] retcode: 1
> 2021-06-25 13:43:32,162 [salt.state       :323 ][ERROR   ][10644] {u'pid': 10737, u'retcode': 1, u'stderr': u'curl: (60) Peer\'s Certificate issuer is not recognized.\nMore details here: http://curl.haxx.se/docs/sslcerts.html\n\ncurl performs SSL certificate verification by default, using a "bundle"\n of Certificate Authority (CA) public keys (CA certs). If the default\n bundle file isn\'t adequate, you can specify an alternate file\n using the --cacert option.\nIf this HTTPS server uses a certificate signed by a CA represented in\n the bundle, the certificate verification probably failed due to a\n problem with the certificate (it might be expired, or the name might\n not match the domain name in the URL).\nIf you\'d like to turn off curl\'s verification of the certificate, use\n the -k (or --insecure) option.\nerror: https://ata-oxy-uyuni01.atass.com/pub/res-gpg-pubkey-0182b964.key: import read failed(2).', u'stdout': u''}
> 2021-06-25 13:43:32,590 [salt.loaded.int.module.cmdmod:836 ][ERROR   ][10644] Command 'rpm' failed with return code: 1
> 2021-06-25 13:43:32,590 [salt.loaded.int.module.cmdmod:840 ][ERROR   ][10644] stderr: curl: (60) Peer's Certificate issuer is not recognized. More details here: http://curl.haxx.se/docs/sslcerts.html
>
> curl performs SSL certificate verification by default, using a "bundle"
>  of Certificate Authority (CA) public keys (CA certs). If the default
>  bundle file isn't adequate, you can specify an alternate file
>  using the --cacert option.
> If this HTTPS server uses a certificate signed by a CA represented in
>  the bundle, the certificate verification probably failed due to a
>  problem with the certificate (it might be expired, or the name might
>  not match the domain name in the URL).
> If you'd like to turn off curl's verification of the certificate, use
>  the -k (or --insecure) option.
> error: https://ata-oxy-uyuni01.atass.com/pub/sle12-gpg-pubkey-39db7c82.key: import read failed(2).
> 2021-06-25 13:43:32,591 [salt.loaded.int.module.cmdmod:842 ][ERROR   ][10644] retcode: 1
> 2021-06-25 13:43:32,591 [salt.state       :323 ][ERROR   ][10644] {u'pid': 10775, u'retcode': 1, u'stderr': u'curl: (60) Peer\'s Certificate issuer is not recognized.\nMore details here: http://curl.haxx.se/docs/sslcerts.html\n\ncurl performs SSL certificate verification by default, using a "bundle"\n of Certificate Authority (CA) public keys (CA certs). If the default\n bundle file isn\'t adequate, you can specify an alternate file\n using the --cacert option.\nIf this HTTPS server uses a certificate signed by a CA represented in\n the bundle, the certificate verification probably failed due to a\n problem with the certificate (it might be expired, or the name might\n not match the domain name in the URL).\nIf you\'d like to turn off curl\'s verification of the certificate, use\n the -k (or --insecure) option.\nerror: https://ata-oxy-uyuni01.atass.com/pub/sle12-gpg-pubkey-39db7c82.key: import read failed(2).', u'stdout': u''}

>

>

>

>

> -----Original Message-----

> From: Julio Gonzalez <jgonzalez@suse.com>

> Sent: 25 June 2021 11:52

> To: uyuni-users@opensuse.org; users@lists.uyuni-project.org; Simon Avery

> <Simon.Avery@atass-sports.co.uk> Subject: [EXTERNAL EMAIL] Re: Unable to

> restart Uyuni following Uyuni upgrade 2021-06

> On Friday, 25 June 2021 11:28:06 (CEST) Simon Avery wrote:

> > I rolled right back from yesterday to a 2021-05 state, so none of

> > yesterday's attempts affected it - so please disregard any of that.

>

> So the error happened yesterday, and not with the migration you performed

> today after the rollback, right?

>

> I can confirm that I could not reproduce this problem. In my case
> uyuni-build-keys installs without that warning.

> > This error was at the end of `

> > /usr/lib/susemanager/bin/server-migrator.sh`

> >

> > Once rebooted, I then ran `

> > /usr/lib/susemanager/bin/pg-migrate-12-to-13.sh`

> >

> >  which completed normally, and Uyuni started up.

> >

> > S

> >

> > -----Original Message-----

> > From: Julio Gonzalez <jgonzalez@suse.com>

> > Sent: 25 June 2021 10:26

> > To: uyuni-users@opensuse.org; users@lists.uyuni-project.org; Simon

> > Avery <Simon.Avery@atass-sports.co.uk> Subject: [EXTERNAL EMAIL] Re:

> > Unable to restart Uyuni following Uyuni upgrade 2021-06
> >
> > On Friday, 25 June 2021 11:02:20 (CEST) Simon Avery wrote:

> > > At the end of the migration script, it showed this, prompting me to

> > > review scrollback.

> > >

> > > Migration went wrong. Please fix the issues and try again.

> >

> > So my guess is that anyway you went ahead and executed the PostgreSQL

> > migration script, as the log had a warning but not an error?

> >

> > I didn't really see this error on the migration test I did. I will
> > repeat it now, just in case.

> >

> > > The only error I could see was:

> > >

> > > (1396/1527) Installing: uyuni-build-keys-2021.06-3.2.uyuni1.noarch

> > > ...........................................................................................................[done]
> > > Additional rpm output:

> > > importing Uyuni build key to rpm keyring... gpg: public key of

> > > ultimately trusted key 8EFD162952047CD0 not found importing the key

> > > from the file /usr/lib/uyuni/uyuni-build-keys.gpg returned an error.

> > > This should not happen. It may not be possible to properly verify

> > > the authenticity of rpm packages from SUSE sources.

> > > The keyring containing the SUSE rpm package signing key can be found

> > > in the root directory of the first CD (DVD) of your SUSE product.

> > > warning: %post(uyuni-build-keys-2021.06-3.2.uyuni1.noarch) scriptlet

> > > failed,

> > >

> > >

> > >

> > >

> > >

> > >

> > >

> > > -----Original Message-----

> > > From: Julio Gonzalez <jgonzalez@suse.com>

> > > Sent: 25 June 2021 09:36

> > > To: uyuni-users@opensuse.org; users@lists.uyuni-project.org

> > > Cc: Simon Avery <Simon.Avery@atass-sports.co.uk>

> > > Subject: [EXTERNAL EMAIL] Re: Unable to restart Uyuni following

> > > Uyuni upgrade 2021-06
> > >
> > > On Friday, 25 June 2021 10:30:06 (CEST) Simon Avery wrote:

> > > > On retrying with the Major upgrade path, things were much more

> > > > positive (and

> > > > simpler) and the update completed.

> > > >

> > > > One warning about the master GPG key, but otherwise things look

> > > > good

> > > > - and I can see that modules.yaml is now populating in the repo.

> > >

> > > What do you mean? What master GPG key :-?

> > >

> > > > Hopefully that will fix my CentOS/Rocky 8.4 issues.

> > > >

> > > > Thanks

> > > >

> > > > -----Original Message-----

> > > > From: Simon Avery

> > > > Sent: 24 June 2021 14:26

> > > > To: 'Julio Gonzalez' <jgonzalez@suse.com>;

> > > > uyuni-users@opensuse.org; users@lists.uyuni-project.org Cc: Joseph

> > > > Cayouette <JCayouette@suse.com>

> > > > Subject: RE: Unable to restart Uyuni following Uyuni upgrade

> > > > 2021-06

> > > >

> > > > Thanks, Julio.

> > > >

> > > > It looks like my mistake was following the minor-upgrade steps,

> > > > then going off piste with manual steps.

> > > >

> > > > I've now rolled back to the starting place, and will have another

> > > > stab at it tomorrow using the link you provided.

> > > >

> > > > S

> > > >

> > > > -----Original Message-----

> > > > From: Julio Gonzalez <jgonzalez@suse.com>

> > > > Sent: 24 June 2021 13:31

> > > > To: uyuni-users@opensuse.org; users@lists.uyuni-project.org

> > > > Cc: Simon Avery <Simon.Avery@atass-sports.co.uk>; Joseph Cayouette

> > > > <JCayouette@suse.com> Subject: [EXTERNAL EMAIL] Re: Unable to

> > > > restart Uyuni following Uyuni upgrade 2021-06
> > > >
> > > > On Thursday, 24 June 2021 14:20:41 (CEST) Simon Avery wrote:

> > > > > Hi Folk,

> > > > >

> > > > > Thanks for the new release - but I'm having significant issues

> > > > > (around four hours so far) in updating this and currently have a

> > > > > broken system.

> > > > >

> > > > > Please advise!

> > > > >

> > > > > 'spacewalk-server status' reports all is fine, however the web UI
> > > > > times out (at other stages, it was returning a 404 for all URLs)

> > > > >

> > > > > Steps I have done:

> > > > >   *   Changed repos and upgraded base OS to Leap 15.3
> > > > >   *   Upgraded database as per
> > > > >       https://www.uyuni-project.org/uyuni-docs/en/uyuni/upgrade/db-migration-13.html
> > > > >   *   Changed the Uyuni repo.
> > > > >   *   Normal Uyuni upgrade.

> > > >

> > > > Wrong steps.

> > > >

> > > > https://www.uyuni-project.org/doc/2021.06/release-notes-uyuni-server.html

> > > >

> > > > > Upgrade notes

> > > > > WARNING: Check "Update from previous versions of Uyuni Server"

> > > > > section below

> > > >

> > > > for details, as this release updates the base OS from openSUSE

> > > > Leap

> > > > 15.2 to openSUSE Leap 15.3, and there are special steps required.

> > > > You need at least Uyuni 2020.07 already installed to perform the

> > > > upgrade.

> > > >

> > > > Then if you go to that section...

> > > >

> > > > > Update from previous versions of Uyuni Server

> > > > > WARNING: Make sure you check the documentation this time.

> > > > > Because of the

> > > >

> > > > change from openSUSE Leap 15.2 to openSUSE Leap 15.3, some special

> > > > steps are required! WARNING: This applies not only when updating

> > > > from 2021.05, but also when updating from any version after

> > > > 2020.07 (included). Updating from

> > > > 2020.06 and older is not supported anymore.

> > > >

> > > > > See the "Upgrade Guide" for detailed instructions on how to upgrade.

> > > > > You

> > > >

> > > > will need to follow the "Upgrade the Server" > "Server - Major

> > > > Upgrade"

> > > > section.

> > > >

> > > > > All connected clients will continue to run and are manageable

> > > > > unchanged

> > > >

> > > > And the doc:

> > > > https://www.uyuni-project.org/uyuni-docs/en/uyuni/upgrade/server-major-upgrade-uyuni.html

> > > >

> > > > *Maybe* you can fix the issues by doing the procedure again,

> > > > EXCEPT the call to /usr/lib/susemanager/bin/pg-migrate-12-to-13.sh

> > > > (step 3) as you already did that.

> > > >

> > > > That should work, but if you have a backup, it's better if you

> > > > restore it and start the upgrade again.

> > > >

> > > > TBH, I wonder if we should not just remove the "Upgrade the Database"
> > > > section and integrate it with the "Server - Major Upgrade" section.

> > > >

> > > > Joseph any opinion?

> > > >

> > > > > All package issues seemingly resolved, apart from this:

> > > > >

> > > > > Problem: the to be installed patch:SUSE-2020-3767-1.noarch
> > > > > conflicts with 'apache-commons-el < 1.0-3.3.1' provided by the
> > > > > installed apache-commons-el-1.0-bp153.2.24.noarch
> > > > > Solution 1: Following actions will be done:

> > > > >   deinstallation of apache-commons-el-1.0-bp153.2.24.noarch

> > > > >   deinstallation of spacewalk-java-4.2.23-1.7.uyuni1.noarch

> > > > >   deinstallation of spacewalk-common-4.2.3-1.6.uyuni1.noarch

> > > > >   deinstallation of spacewalk-postgresql-4.2.3-1.6.uyuni1.noarch

> > > > >   deinstallation of patterns-uyuni_server-2021.06-2.3.uyuni1.x86_64

> > > > >   deinstallation of susemanager-4.2.19-1.2.uyuni1.x86_64

> > > > >   deinstallation of

> > > > >   supportutils-plugin-susemanager-4.2.2-2.4.uyuni1.noarch

> > > > >   deinstallation of

> > > > >

> > > > > uyuni-cluster-provider-caasp-4.2.3-1.4.uyuni1.noarch

> > > > > Solution 2: do not install patch:SUSE-2020-3767-1.noarch

> > > > >

> > > > > (Did try #1, then reinstalling uyuni-server patterns, but it

> > > > > just reappears. ISTR there was a breaking issue with

> > > > > apache-commons-el in a previous update)

> > > > >

> > > > > The main issue appears to be related to this, in catalina's logs:

> > > > > (The dir is present and has an index.jsp file and dirs.)

> > > > >

> > > > > 24-Jun-2021 11:24:31.328 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/9.0.36]
> > > > > 24-Jun-2021 11:24:31.357 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/srv/tomcat/webapps/rhn]
> > > > > 24-Jun-2021 11:24:33.950 SEVERE [main] org.apache.catalina.startup.HostConfig.deployDirectory Error deploying web application directory [/srv/tomcat/webapps/rhn]
> > > > > java.lang.IllegalStateException: Error starting child
> > > > >         at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:720)
> > > > >         at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:690)
> > > > >         at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:705)
> > > > >         at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1132)
> > > > >         at org.apache.catalina.startup.HostConfig$DeployDirectory.run(HostConfig.java:1865)
> > > > >         at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
> > > > >         at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
> > > > >         at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
> > > > >         at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:118)
> > > > >         at org.apache.catalina.startup.HostConfig.deployDirectories(HostConfig.java:1044)
> > > > >         at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:429)
> > > > >         at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1575)
> > > > >         at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:309)
> > > > >         at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:123)
> > > > >         at org.apache.catalina.util.LifecycleBase.setStateInternal(LifecycleBase.java:423)
> > > > >         at org.apache.catalina.util.LifecycleBase.setState(LifecycleBase.java:366)
> > > > >         at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:936)
> > > > >         at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:841)
> > > > >         at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
> > > > >         at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1384)
> > > > >         at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1374)
> > > > >         at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
> > > > >         at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
> > > > >         at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:140)
> > > > >         at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:909)
> > > > >         at org.apache.catalina.core.StandardEngine.startInternal(StandardEngine.java:262)
> > > > >         at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
> > > > >         at org.apache.catalina.core.StandardService.startInternal(StandardService.java:421)
> > > > >         at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
> > > > >         at org.apache.catalina.core.StandardServer.startInternal(StandardServer.java:930)
> > > > >         at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
> > > > >         at org.apache.catalina.startup.Catalina.start(Catalina.java:633)
> > > > >         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > > >         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > > >         at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > > >         at java.base/java.lang.reflect.Method.invoke(Method.java:566)
> > > > >         at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:343)
> > > > >         at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:474)
> > > > > Caused by: org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Catalina].StandardHost[localhost].StandardContext[/rhn]]
> > > > >         at org.apache.catalina.util.LifecycleBase.handleSubClassException(LifecycleBase.java:440)
> > > > >         at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:198)
> > > > >         at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:717)
> > > > >         ... 37 more
> > > > > Caused by: java.lang.NullPointerException
> > > > >         at org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:382)
> > > > >         at org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:195)
> > > > >         at org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1971)
> > > > >         at org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1129)
> > > > >         at org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:775)
> > > > >         at org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:301)
> > > > >         at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:123)
> > > > >         at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5044)
> > > > >         at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
> > > > >         ... 38 more
> > > > > 24-Jun-2021 11:24:33.958 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory [/srv/tomcat/webapps/rhn] has finished in [2,601] ms
> > > > > 24-Jun-2021 11:24:33.990 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["ajp-nio-127.0.0.1-8009"]
> > > > > 24-Jun-2021 11:24:34.297 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["ajp-nio-0:0:0:0:0:0:0:1-8009"]
> > > > > 24-Jun-2021 11:24:34.509 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["http-nio-127.0.0.1-8080"]
> > > > > 24-Jun-2021 11:24:34.564 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in [3,354] milliseconds

> > > > >

> > > > >

> > > > >

> > > > > Simon Avery

> > > > > Linux Systems Administrator

> > > >

> > > > --

> > > > Julio González Gil

> > > > Release Engineer, SUSE Manager and Uyuni jgonzalez@suse.com

> > >




--

Julio González Gil

Release Engineer, SUSE Manager and Uyuni

jgonzalez@suse.com