PR: jichenjc: cherry pick support mounting btrfs volumes by the cinder-csi-plugin (#1941) to 1.24
Result: success
Tests: 0 failed / 0 succeeded
Started: 2022-09-05 06:12
Elapsed: 39m26s
Revision:
Refs: 1944
Uploader: crier

No Test Failures!


Error lines from build-log.txt

... skipping 303 lines ...
PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
ok: [34.150.188.9]

TASK [install-golang : Check if go has satisfied requirement] ******************
fatal: [34.150.188.9]: FAILED! => {
    "changed": true,
    "cmd": "/usr/local/go/bin/go version |grep -Eo '([0-9]+\\.)+[0-9]+'",
    "delta": "0:00:00.003350",
    "end": "2022-09-05 06:17:41.906163",
    "rc": 1,
    "start": "2022-09-05 06:17:41.902813"
... skipping 2929 lines ...
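The failed task above checks for an existing Go toolchain by piping the output of "go version" through grep and extracting a dotted version number; grep exits with status 1 when nothing matches (for instance when /usr/local/go/bin/go is absent or prints nothing), which is what the rc: 1 and the FAILED status reflect, and lets the install-golang role decide whether Go needs to be installed. A minimal stand-alone sketch of that kind of check, with an assumed required version used purely for illustration:

#!/usr/bin/env bash
# Rough stand-alone equivalent of the "Check if go has satisfied requirement"
# task. REQUIRED_GO is an illustrative assumption, not a value from this job.
REQUIRED_GO="1.18"

# Extract the installed version (e.g. "1.17.5"); grep exits 1 when there is
# no match, which mirrors the rc: 1 seen in the task output above.
installed="$(/usr/local/go/bin/go version 2>/dev/null | grep -Eo '([0-9]+\.)+[0-9]+')" || {
    echo "go not found or version not parseable" >&2
    exit 1
}

# sort -V orders versions; if the required version sorts first (or equal),
# the installed toolchain satisfies the requirement.
if [ "$(printf '%s\n' "$REQUIRED_GO" "$installed" | sort -V | head -n1)" = "$REQUIRED_GO" ]; then
    echo "go $installed satisfies >= $REQUIRED_GO"
else
    echo "go $installed is older than $REQUIRED_GO; install/upgrade needed" >&2
    exit 1
fi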
Enabling module setenvif.
Enabling module filter.
Enabling module deflate.
Enabling module status.
Enabling module reqtimeout.
Enabling conf charset.
Enabling conf localized-error-pages.

Enabling conf other-vhosts-access-log.
Enabling conf security.
Enabling conf serve-cgi-bin.
Enabling site 000-default.
info: Executing deferred 'a2enconf javascript-common' for package javascript-common
Enabling conf javascript-common.
... skipping 145 lines ...
Using python 3.8 to install setuptools!=24.0.0,!=34.0.0,!=34.0.1,!=34.0.2,!=34.0.3,!=34.1.0,!=34.1.1,!=34.2.0,!=34.3.0,!=34.3.1,!=34.3.2,!=36.2.0,!=48.0.0,!=49.0.0
+ inc/python:pip_install:193               :   sudo -H LC_ALL=en_US.UTF-8 SETUPTOOLS_USE_DISTUTILS=stdlib http_proxy= https_proxy= no_proxy= PIP_FIND_LINKS= SETUPTOOLS_SYS_PATH_TECHNIQUE=rewrite python3.8 -m pip install -c /opt/stack/requirements/upper-constraints.txt --ignore-installed 'setuptools!=24.0.0,!=34.0.0,!=34.0.1,!=34.0.2,!=34.0.3,!=34.1.0,!=34.1.1,!=34.2.0,!=34.3.0,!=34.3.1,!=34.3.2,!=36.2.0,!=48.0.0,!=49.0.0'
Collecting setuptools!=24.0.0,!=34.0.0,!=34.0.1,!=34.0.2,!=34.0.3,!=34.1.0,!=34.1.1,!=34.2.0,!=34.3.0,!=34.3.1,!=34.3.2,!=36.2.0,!=48.0.0,!=49.0.0
  Downloading setuptools-57.4.0-py3-none-any.whl (819 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 819.0/819.0 kB 57.1 MB/s eta 0:00:00
Installing collected packages: setuptools
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
launchpadlib 1.10.13 requires testresources, which is not installed.
Successfully installed setuptools-57.4.0
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
+ inc/python:pip_install:200               :   result=0
+ inc/python:pip_install:202               :   time_stop pip_install
+ functions-common:time_stop:2342          :   local name
... skipping 137 lines ...
  Building wheel for pyperclip (setup.py): started
  Building wheel for pyperclip (setup.py): finished with status 'done'
  Created wheel for pyperclip: filename=pyperclip-1.8.2-py3-none-any.whl size=11136 sha256=66ff36ba724194435d1aae2a3678f15ba20e7176f371aaf25f57b28eb5c4b1ec
  Stored in directory: /root/.cache/pip/wheels/7f/1a/65/84ff8c386bec21fca6d220ea1f5498a0367883a78dd5ba6122
Successfully built future PrettyTable pyperclip
Installing collected packages: wcwidth, voluptuous, pyperclip, PrettyTable, extras, six, PyYAML, pyparsing, pbr, future, colorama, autopage, attrs, stevedore, cmd2, cliff, fixtures, testtools, python-subunit, stestr, os-testr
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
launchpadlib 1.10.13 requires testresources, which is not installed.
Successfully installed PrettyTable-0.7.2 PyYAML-5.4.1 attrs-21.2.0 autopage-0.4.0 cliff-3.9.0 cmd2-2.1.2 colorama-0.4.4 extras-1.0.0 fixtures-3.0.0 future-0.18.2 os-testr-2.0.1 pbr-5.6.0 pyparsing-2.4.7 pyperclip-1.8.2 python-subunit-1.4.0 six-1.16.0 stestr-3.2.0 stevedore-3.4.1 testtools-2.5.0 voluptuous-0.12.1 wcwidth-0.2.5
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
+ inc/python:pip_install:200               :   result=0
+ inc/python:pip_install:202               :   time_stop pip_install
+ functions-common:time_stop:2342          :   local name
... skipping 110 lines ...
Collecting pyperclip===1.8.2
  Downloading pyperclip-1.8.2.tar.gz (20 kB)
Collecting colorama===0.4.4
  Downloading colorama-0.4.4-py2.py3-none-any.whl (16 kB)
Building wheels for collected packages: prettytable, pyperclip, openstack-requirements
  Building wheel for prettytable (setup.py): started
  Building wheel for prettytable (setup.py): finished with status 'error'
  ERROR: Command errored out with exit status 1:
   command: /opt/stack/requirements/.venv/bin/python3 -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-kxvcatb3/prettytable/setup.py'"'"'; __file__='"'"'/tmp/pip-install-kxvcatb3/prettytable/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-4i_d8ot_
       cwd: /tmp/pip-install-kxvcatb3/prettytable/
  Complete output (6 lines):
  usage: setup.py [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] ...]
     or: setup.py --help [cmd1 cmd2 ...]
     or: setup.py --help-commands
     or: setup.py cmd --help
  
  error: invalid command 'bdist_wheel'
  ----------------------------------------
  ERROR: Failed building wheel for prettytable
  Running setup.py clean for prettytable
  Building wheel for pyperclip (setup.py): started
  Building wheel for pyperclip (setup.py): finished with status 'error'
  ERROR: Command errored out with exit status 1:
   command: /opt/stack/requirements/.venv/bin/python3 -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-kxvcatb3/pyperclip/setup.py'"'"'; __file__='"'"'/tmp/pip-install-kxvcatb3/pyperclip/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-r0_8ttgk
       cwd: /tmp/pip-install-kxvcatb3/pyperclip/
  Complete output (6 lines):
  usage: setup.py [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] ...]
     or: setup.py --help [cmd1 cmd2 ...]
     or: setup.py --help-commands
     or: setup.py cmd --help
  
  error: invalid command 'bdist_wheel'
  ----------------------------------------
  ERROR: Failed building wheel for pyperclip
  Running setup.py clean for pyperclip
  Building wheel for openstack-requirements (setup.py): started
  Building wheel for openstack-requirements (setup.py): finished with status 'error'
  ERROR: Command errored out with exit status 1:
   command: /opt/stack/requirements/.venv/bin/python3 -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-req-build-cqbs6der/setup.py'"'"'; __file__='"'"'/tmp/pip-req-build-cqbs6der/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-5l6pkpab
       cwd: /tmp/pip-req-build-cqbs6der/
  Complete output (6 lines):
  usage: setup.py [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] ...]
     or: setup.py --help [cmd1 cmd2 ...]
     or: setup.py --help-commands
     or: setup.py cmd --help
  
  error: invalid command 'bdist_wheel'
  ----------------------------------------
  ERROR: Failed building wheel for openstack-requirements
  Running setup.py clean for openstack-requirements
Failed to build prettytable pyperclip openstack-requirements
Installing collected packages: certifi, pbr, attrs, extras, testtools, six, fixtures, wcwidth, prettytable, urllib3, PyYAML, charset-normalizer, idna, requests, autopage, pyperclip, colorama, cmd2, stevedore, pyparsing, cliff, packaging, Parsley, beagle, openstack-requirements
    Running setup.py install for prettytable: started
    Running setup.py install for prettytable: finished with status 'done'
    Running setup.py install for pyperclip: started
    Running setup.py install for pyperclip: finished with status 'done'
    Running setup.py install for openstack-requirements: started
... skipping 690 lines ...
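The "invalid command 'bdist_wheel'" errors above mean the wheel package is missing from the /opt/stack/requirements/.venv virtualenv, so setuptools has no bdist_wheel command and pip falls back to the legacy "setup.py install" path (which is why the same packages still finish with status 'done' immediately afterwards). A hedged sketch of the usual remedy, reusing the virtualenv path shown in the log:

# Add the wheel package to the affected virtualenv so setup.py gains the
# bdist_wheel command; the interpreter path is taken from the log above.
/opt/stack/requirements/.venv/bin/python3 -m pip install wheel

# Subsequent pip install / pip wheel runs in this environment should then
# build .whl files instead of falling back to "setup.py install".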
update-alternatives: using /var/lib/mecab/dic/ipadic-utf8 to provide /var/lib/mecab/dic/debian (mecab-dictionary) in auto mode
Setting up libhtml-parser-perl (3.72-5) ...
Setting up libhttp-message-perl (6.22-1) ...
Setting up mysql-server-8.0 (8.0.30-0ubuntu0.20.04.2) ...
update-alternatives: using /etc/mysql/mysql.cnf to provide /etc/mysql/my.cnf (my.cnf) in auto mode
Renaming removed key_buffer and myisam-recover options (if present)
mysqld will log errors to /var/log/mysql/error.log

mysqld is running as pid 31014
Created symlink /etc/systemd/system/multi-user.target.wants/mysql.service → /lib/systemd/system/mysql.service.
Setting up libcgi-pm-perl (4.46-1) ...
Setting up libhtml-template-perl (2.97-1) ...
Setting up mysql-server (8.0.30-0ubuntu0.20.04.2) ...
Setting up libcgi-fast-perl (1:2.15-1) ...
... skipping 110 lines ...
Fetched 59.0 kB in 0s (954 kB/s)
Selecting previously unselected package radvd.
(Reading database ... 
(Reading database ... 5%
(Reading database ... 10%
(Reading database ... 15%
(Reading database ... 20%
(Reading database ... 25%
(Reading database ... 30%
(Reading database ... 35%
(Reading database ... 40%
(Reading database ... 45%
(Reading database ... 50%
(Reading database ... 55%
(Reading database ... 60%
(Reading database ... 65%
(Reading database ... 70%
(Reading database ... 75%
(Reading database ... 80%
(Reading database ... 85%
(Reading database ... 90%
(Reading database ... 95%
(Reading database ... 100%
(Reading database ... 101911 files and directories currently installed.)
Preparing to unpack .../radvd_1%3a2.17-2_amd64.deb ...
Unpacking radvd (1:2.17-2) ...
Setting up radvd (1:2.17-2) ...
Job for radvd.service failed because the control process exited with error code.

See "systemctl status radvd.service" and "journalctl -xe" for details.
invoke-rc.d: initscript radvd, action "start" failed.

● radvd.service - Router advertisement daemon for IPv6
     Loaded: loaded (/lib/systemd/system/radvd.service; disabled; vendor preset: enabled)
     Active: failed (Result: exit-code) since Mon 2022-09-05 06:21:45 UTC; 7ms ago

       Docs: man:radvd(8)
    Process: 32169 ExecStartPre=/usr/sbin/radvd --logmethod stderr_clean --configtest (code=exited, status=1/FAILURE)

Sep 05 06:21:45 devstack systemd[1]: Starting Router advertisement daemon for IPv6...
Sep 05 06:21:45 devstack radvd[32169]: can't open /etc/radvd.conf: No such file or directory
Sep 05 06:21:45 devstack radvd[32169]: Insecure file permissions, but continuing anyway
Sep 05 06:21:45 devstack radvd[32169]: exiting, failed to read config file

Sep 05 06:21:45 devstack systemd[1]: radvd.service: Control process exited, code=exited, status=1/FAILURE
Sep 05 06:21:45 devstack systemd[1]: radvd.service: Failed with result 'exit-code'.

Sep 05 06:21:45 devstack systemd[1]: Failed to start Router advertisement daemon for IPv6.
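The radvd failure above is a configuration problem rather than a packaging one: the journal shows the pre-start check (radvd --logmethod stderr_clean --configtest) exiting with status 1 because /etc/radvd.conf does not exist yet at package install time. A minimal sketch of bringing the daemon up by hand, assuming a placeholder interface (eth0) and documentation prefix (2001:db8::/64) that are not taken from this job:

# Write a minimal radvd configuration; interface name and prefix are
# illustrative placeholders only.
sudo tee /etc/radvd.conf > /dev/null <<'EOF'
interface eth0 {
    AdvSendAdvert on;
    prefix 2001:db8::/64 {
    };
};
EOF

# Validate the config the same way the unit's ExecStartPre does, then start.
sudo radvd --logmethod stderr_clean --configtest
sudo systemctl restart radvd.service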

Processing triggers for man-db (2.9.1-1) ...
Processing triggers for systemd (245.4-4ubuntu3.17) ...
+ functions-common:apt_get:1137            :   result=0
+ functions-common:apt_get:1140            :   time_stop apt-get
+ functions-common:time_stop:2342          :   local name
+ functions-common:time_stop:2343          :   local end_time
... skipping 646 lines ...
  Building wheel for wrapt (setup.py): finished with status 'done'
  Created wheel for wrapt: filename=wrapt-1.12.1-cp38-cp38-linux_x86_64.whl size=78580 sha256=3b904e5248c739e4b0f59971a0af53c9838c10d91a83353c1ae6e66fbbe3b366
  Stored in directory: /root/.cache/pip/wheels/5f/fd/9e/b6cf5890494cb8ef0b5eaff72e5d55a70fb56316007d6dfe73
Successfully built wrapt
Installing collected packages: wrapt, wcwidth, pytz, pyperclip, PrettyTable, netifaces, netaddr, msgpack, iso8601, certifi, urllib3, six, PyYAML, pyparsing, pbr, idna, colorama, charset-normalizer, autopage, attrs, stevedore, requests, packaging, oslo.i18n, os-service-types, debtcollector, cmd2, oslo.utils, keystoneauth1, cliff, oslo.serialization, python-barbicanclient
  Running setup.py develop for python-barbicanclient
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
launchpadlib 1.10.13 requires testresources, which is not installed.
Successfully installed PrettyTable-0.7.2 PyYAML-5.4.1 attrs-21.2.0 autopage-0.4.0 certifi-2021.5.30 charset-normalizer-2.0.4 cliff-3.9.0 cmd2-2.1.2 colorama-0.4.4 debtcollector-2.3.0 idna-3.2 iso8601-0.1.16 keystoneauth1-4.4.0 msgpack-1.0.2 netaddr-0.8.0 netifaces-0.11.0 os-service-types-1.7.0 oslo.i18n-5.1.0 oslo.serialization-4.2.0 oslo.utils-4.10.2 packaging-21.0 pbr-5.6.0 pyparsing-2.4.7 pyperclip-1.8.2 python-barbicanclient-5.3.1.dev4 pytz-2021.1 requests-2.26.0 six-1.16.0 stevedore-3.4.1 urllib3-1.26.6 wcwidth-0.2.5 wrapt-1.12.1
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
+ inc/python:pip_install:200               :   result=0
+ inc/python:pip_install:202               :   time_stop pip_install
+ functions-common:time_stop:2342          :   local name
... skipping 43 lines ...
  Building wheel for etcd3 (setup.py): started
  Building wheel for etcd3 (setup.py): finished with status 'done'
  Created wheel for etcd3: filename=etcd3-0.12.0-py2.py3-none-any.whl size=42886 sha256=b01db0db8a3220d60afbcc9f55b8aa3d56b32c8bed3f3ba116c28740c07bf757
  Stored in directory: /root/.cache/pip/wheels/01/af/36/31f9f42a3b49c1998e9bdaec85a28e4c9a24c0d8a4f81293d7
Successfully built etcd3
Installing collected packages: six, tenacity, protobuf, grpcio, etcd3
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
launchpadlib 1.10.13 requires testresources, which is not installed.
Successfully installed etcd3-0.12.0 grpcio-1.39.0 protobuf-3.17.3 six-1.16.0 tenacity-6.3.1
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
+ inc/python:pip_install:200               :   result=0
+ inc/python:pip_install:202               :   time_stop pip_install
+ functions-common:time_stop:2342          :   local name
... skipping 30 lines ...
  Using cached urllib3-1.26.6-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.5.30-py2.py3-none-any.whl (145 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.4-py3-none-any.whl (36 kB)
Installing collected packages: certifi, urllib3, six, pbr, idna, charset-normalizer, requests, futurist, etcd3gw
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
launchpadlib 1.10.13 requires testresources, which is not installed.
Successfully installed certifi-2021.5.30 charset-normalizer-2.0.4 etcd3gw-1.0.0 futurist-2.4.0 idna-3.2 pbr-5.6.0 requests-2.26.0 six-1.16.0 urllib3-1.26.6
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
+ inc/python:pip_install:200               :   result=0
+ inc/python:pip_install:202               :   time_stop pip_install
+ functions-common:time_stop:2342          :   local name
... skipping 302 lines ...
  Building wheel for pyinotify (setup.py): started
  Building wheel for pyinotify (setup.py): finished with status 'done'
  Created wheel for pyinotify: filename=pyinotify-0.9.6-py3-none-any.whl size=25350 sha256=f930e9f0abeba7f806e1041d77baef7b39ced0e49ff74bc6b9e2eaff7a1088ec
  Stored in directory: /root/.cache/pip/wheels/9d/a0/4b/1a80814e4ad0b035c07831ea1b06b691046198492bbc5769b6
Successfully built pyinotify
Installing collected packages: wrapt, rfc3986, pytz, pyinotify, netifaces, netaddr, msgpack, iso8601, certifi, WebOb, urllib3, six, PyYAML, pyparsing, pbr, idna, decorator, charset-normalizer, stevedore, requests, python-dateutil, packaging, oslo.i18n, os-service-types, debtcollector, oslo.utils, oslo.context, oslo.config, keystoneauth1, dogpile.cache, oslo.serialization, python-keystoneclient, pycadf, oslo.log, oslo.cache, keystonemiddleware
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
launchpadlib 1.10.13 requires testresources, which is not installed.
Successfully installed PyYAML-5.4.1 WebOb-1.8.7 certifi-2021.5.30 charset-normalizer-2.0.4 debtcollector-2.3.0 decorator-5.0.9 dogpile.cache-1.1.5 idna-3.2 iso8601-0.1.16 keystoneauth1-4.4.0 keystonemiddleware-9.3.0 msgpack-1.0.2 netaddr-0.8.0 netifaces-0.11.0 os-service-types-1.7.0 oslo.cache-2.8.2 oslo.config-8.7.1 oslo.context-3.3.2 oslo.i18n-5.1.0 oslo.log-4.6.0 oslo.serialization-4.2.0 oslo.utils-4.10.2 packaging-21.0 pbr-5.6.0 pycadf-3.1.1 pyinotify-0.9.6 pyparsing-2.4.7 python-dateutil-2.8.2 python-keystoneclient-4.3.0 pytz-2021.1 requests-2.26.0 rfc3986-1.5.0 six-1.16.0 stevedore-3.4.1 urllib3-1.26.6 wrapt-1.12.1
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
+ inc/python:pip_install:200               :   result=0
+ inc/python:pip_install:202               :   time_stop pip_install
+ functions-common:time_stop:2342          :   local name
... skipping 28 lines ...
+ inc/python:pip_install:193               :   sudo -H LC_ALL=en_US.UTF-8 SETUPTOOLS_USE_DISTUTILS=stdlib http_proxy= https_proxy= no_proxy= PIP_FIND_LINKS= SETUPTOOLS_SYS_PATH_TECHNIQUE=rewrite python3.8 -m pip install -c /opt/stack/requirements/upper-constraints.txt --ignore-installed python-memcached
Collecting python-memcached
  Downloading python_memcached-1.59-py2.py3-none-any.whl (16 kB)
Collecting six>=1.4.0
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Installing collected packages: six, python-memcached
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
launchpadlib 1.10.13 requires testresources, which is not installed.
Successfully installed python-memcached-1.59 six-1.16.0
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
+ inc/python:pip_install:200               :   result=0
+ inc/python:pip_install:202               :   time_stop pip_install
+ functions-common:time_stop:2342          :   local name
... skipping 445 lines ...
+ functions:setup_systemd_logging:695      :   local pidstr=
+ functions:setup_systemd_logging:696      :   [[ False == \T\r\u\e ]]
+ functions:setup_systemd_logging:700      :   pidstr='(pid=%(process)d) '
+ functions:setup_systemd_logging:702      :   iniset /etc/keystone/keystone.conf DEFAULT logging_debug_format_suffix '{{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}}'
+ functions:setup_systemd_logging:704      :   iniset /etc/keystone/keystone.conf DEFAULT logging_context_format_string '%(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s'
+ functions:setup_systemd_logging:705      :   iniset /etc/keystone/keystone.conf DEFAULT logging_default_format_string '%(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s'
+ functions:setup_systemd_logging:706      :   iniset /etc/keystone/keystone.conf DEFAULT logging_exception_prefix 'ERROR %(name)s %(instance)s'
+ lib/keystone:configure_keystone:257      :   iniset /etc/keystone/keystone.conf DEFAULT debug True
+ lib/keystone:configure_keystone:259      :   '[' uwsgi == mod_wsgi ']'
+ lib/keystone:configure_keystone:263      :   write_uwsgi_config /etc/keystone/keystone-uwsgi-public.ini /usr/local/bin/keystone-wsgi-public /identity
+ lib/apache:write_uwsgi_config:254        :   local file=/etc/keystone/keystone-uwsgi-public.ini
+ lib/apache:write_uwsgi_config:255        :   local wsgi=/usr/local/bin/keystone-wsgi-public
+ lib/apache:write_uwsgi_config:256        :   local url=/identity
... skipping 775 lines ...
+ functions:setup_systemd_logging:695      :   local pidstr=
+ functions:setup_systemd_logging:696      :   [[ False == \T\r\u\e ]]
+ functions:setup_systemd_logging:700      :   pidstr='(pid=%(process)d) '
+ functions:setup_systemd_logging:702      :   iniset /etc/glance/glance-api.conf DEFAULT logging_debug_format_suffix '{{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}}'
+ functions:setup_systemd_logging:704      :   iniset /etc/glance/glance-api.conf DEFAULT logging_context_format_string '%(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s'
+ functions:setup_systemd_logging:705      :   iniset /etc/glance/glance-api.conf DEFAULT logging_default_format_string '%(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s'
+ functions:setup_systemd_logging:706      :   iniset /etc/glance/glance-api.conf DEFAULT logging_exception_prefix 'ERROR %(name)s %(instance)s'
+ lib/glance:configure_glance:397          :   cp -p /opt/stack/glance/etc/glance-api-paste.ini /etc/glance/glance-api-paste.ini
+ lib/glance:configure_glance:400          :   iniset /etc/glance/glance-cache.conf DEFAULT debug True
+ lib/glance:configure_glance:401          :   iniset /etc/glance/glance-cache.conf DEFAULT use_syslog False
+ lib/glance:configure_glance:402          :   iniset /etc/glance/glance-cache.conf DEFAULT image_cache_dir /opt/stack/data/glance/cache/
+ lib/glance:configure_glance:403          :   iniset /etc/glance/glance-cache.conf DEFAULT auth_url http://10.0.2.15/identity
+ lib/glance:configure_glance:404          :   iniset /etc/glance/glance-cache.conf DEFAULT admin_tenant_name service
... skipping 825 lines ...
+ functions:setup_systemd_logging:695      :   local pidstr=
+ functions:setup_systemd_logging:696      :   [[ False == \T\r\u\e ]]
+ functions:setup_systemd_logging:700      :   pidstr='(pid=%(process)d) '
+ functions:setup_systemd_logging:702      :   iniset /etc/cinder/cinder.conf DEFAULT logging_debug_format_suffix '{{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}}'
+ functions:setup_systemd_logging:704      :   iniset /etc/cinder/cinder.conf DEFAULT logging_context_format_string '%(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s'
+ functions:setup_systemd_logging:705      :   iniset /etc/cinder/cinder.conf DEFAULT logging_default_format_string '%(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s'
+ functions:setup_systemd_logging:706      :   iniset /etc/cinder/cinder.conf DEFAULT logging_exception_prefix 'ERROR %(name)s %(instance)s'
+ lib/cinder:configure_cinder:351          :   write_uwsgi_config /etc/cinder/cinder-api-uwsgi.ini /usr/local/bin/cinder-wsgi /volume
+ lib/apache:write_uwsgi_config:254        :   local file=/etc/cinder/cinder-api-uwsgi.ini
+ lib/apache:write_uwsgi_config:255        :   local wsgi=/usr/local/bin/cinder-wsgi
+ lib/apache:write_uwsgi_config:256        :   local url=/volume
+ lib/apache:write_uwsgi_config:257        :   local http=
+ lib/apache:write_uwsgi_config:258        :   local name=
... skipping 2053 lines ...
+ functions:setup_systemd_logging:695      :   local pidstr=
+ functions:setup_systemd_logging:696      :   [[ False == \T\r\u\e ]]
+ functions:setup_systemd_logging:700      :   pidstr='(pid=%(process)d) '
+ functions:setup_systemd_logging:702      :   iniset /etc/nova/nova.conf DEFAULT logging_debug_format_suffix '{{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}}'
+ functions:setup_systemd_logging:704      :   iniset /etc/nova/nova.conf DEFAULT logging_context_format_string '%(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s'
+ functions:setup_systemd_logging:705      :   iniset /etc/nova/nova.conf DEFAULT logging_default_format_string '%(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s'
+ functions:setup_systemd_logging:706      :   iniset /etc/nova/nova.conf DEFAULT logging_exception_prefix 'ERROR %(name)s %(instance)s'
+ lib/nova:create_nova_conf:497            :   iniset /etc/nova/nova.conf upgrade_levels compute auto
+ lib/nova:create_nova_conf:499            :   write_uwsgi_config /etc/nova/nova-api-uwsgi.ini /usr/local/bin/nova-api-wsgi /compute
+ lib/apache:write_uwsgi_config:254        :   local file=/etc/nova/nova-api-uwsgi.ini
+ lib/apache:write_uwsgi_config:255        :   local wsgi=/usr/local/bin/nova-api-wsgi
+ lib/apache:write_uwsgi_config:256        :   local url=/compute
+ lib/apache:write_uwsgi_config:257        :   local http=
... skipping 194 lines ...
+ functions:setup_systemd_logging:695      :   local pidstr=
+ functions:setup_systemd_logging:696      :   [[ False == \T\r\u\e ]]
+ functions:setup_systemd_logging:700      :   pidstr='(pid=%(process)d) '
+ functions:setup_systemd_logging:702      :   iniset /etc/nova/nova_cell1.conf DEFAULT logging_debug_format_suffix '{{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}}'
+ functions:setup_systemd_logging:704      :   iniset /etc/nova/nova_cell1.conf DEFAULT logging_context_format_string '%(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s'
+ functions:setup_systemd_logging:705      :   iniset /etc/nova/nova_cell1.conf DEFAULT logging_default_format_string '%(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s'
+ functions:setup_systemd_logging:706      :   iniset /etc/nova/nova_cell1.conf DEFAULT logging_exception_prefix 'ERROR %(name)s %(instance)s'
+ lib/nova:create_nova_conf:563            :   [[ superconductor == \s\i\n\g\l\e\c\o\n\d\u\c\t\o\r ]]
++ lib/nova:create_nova_conf:566            :   seq 1 1
+ lib/nova:create_nova_conf:566            :   for i in $(seq 1 $NOVA_NUM_CELLS)
+ lib/nova:create_nova_conf:567            :   local conf
+ lib/nova:create_nova_conf:568            :   local offset
++ lib/nova:create_nova_conf:569            :   conductor_conf 1
... skipping 649 lines ...
+ functions:setup_systemd_logging:695      :   local pidstr=
+ functions:setup_systemd_logging:696      :   [[ False == \T\r\u\e ]]
+ functions:setup_systemd_logging:700      :   pidstr='(pid=%(process)d) '
+ functions:setup_systemd_logging:702      :   iniset /etc/placement/placement.conf DEFAULT logging_debug_format_suffix '{{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}}'
+ functions:setup_systemd_logging:704      :   iniset /etc/placement/placement.conf DEFAULT logging_context_format_string '%(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s'
+ functions:setup_systemd_logging:705      :   iniset /etc/placement/placement.conf DEFAULT logging_default_format_string '%(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s'
+ functions:setup_systemd_logging:706      :   iniset /etc/placement/placement.conf DEFAULT logging_exception_prefix 'ERROR %(name)s %(instance)s'
+ lib/placement:configure_placement:109    :   [[ uwsgi == \u\w\s\g\i ]]
+ lib/placement:configure_placement:110    :   write_uwsgi_config /etc/placement/placement-uwsgi.ini /usr/local/bin/placement-api /placement
+ lib/apache:write_uwsgi_config:254        :   local file=/etc/placement/placement-uwsgi.ini
+ lib/apache:write_uwsgi_config:255        :   local wsgi=/usr/local/bin/placement-api
+ lib/apache:write_uwsgi_config:256        :   local url=/placement
+ lib/apache:write_uwsgi_config:257        :   local http=
... skipping 2683 lines ...
+ functions-common:is_opensuse:476         :   [[ -z Ubuntu ]]
+ functions-common:is_opensuse:480         :   [[ Ubuntu =~ (openSUSE) ]]
+ functions-common:is_suse:469             :   is_suse_linux_enterprise
+ functions-common:is_suse_linux_enterprise:488 :   [[ -z Ubuntu ]]
+ functions-common:is_suse_linux_enterprise:492 :   [[ Ubuntu =~ (^SUSE) ]]
+ lib/databases/mysql:configure_database_mysql:95 :   sudo mysqladmin -u root password password
mysqladmin: connect to server at 'localhost' failed
error: 'Access denied for user 'root'@'localhost' (using password: NO)'
+ lib/databases/mysql:configure_database_mysql:95 :   true
+ lib/databases/mysql:configure_database_mysql:99 :   is_ubuntu
+ functions-common:is_ubuntu:500           :   [[ -z deb ]]
+ functions-common:is_ubuntu:503           :   '[' deb = deb ']'
+ lib/databases/mysql:configure_database_mysql:99 :   '[' mysql == mariadb ']'
+ lib/databases/mysql:configure_database_mysql:102 :   local 'cmd_args=-uroot -ppassword -h127.0.0.1 '
... skipping 5849 lines ...
++ functions-common:get_or_create_user:847  :   local email=
+++ functions-common:get_or_create_user:861  :   oscwrap user create cinder --password '' --domain= --or-show -f value -c id
/usr/lib/python3/dist-packages/secretstorage/dhcrypto.py:15: CryptographyDeprecationWarning: int_from_bytes is deprecated, use int.from_bytes instead
  from cryptography.utils import int_from_bytes
/usr/lib/python3/dist-packages/secretstorage/util.py:19: CryptographyDeprecationWarning: int_from_bytes is deprecated, use int.from_bytes instead
  from cryptography.utils import int_from_bytes
No password was supplied, authentication will fail when a user does not have a password.
+++ functions-common:oscwrap:2382            :   return 0
++ functions-common:get_or_create_user:859  :   user_id=f4cd939210f2434c9f33ede7dbe0d408
++ functions-common:get_or_create_user:860  :   echo f4cd939210f2434c9f33ede7dbe0d408
+ lib/cinder:configure_cinder_internal_tenant:614 :   iniset /etc/cinder/cinder.conf DEFAULT cinder_internal_tenant_user_id f4cd939210f2434c9f33ede7dbe0d408
[8932 Async create_cinder_accounts:63533]: finished create_cinder_accounts with result 0 in 17 seconds
+ inc/async:async_wait:197                 :   return 0
... skipping 228 lines ...
+ functions:setup_systemd_logging:695      :   local pidstr=
+ functions:setup_systemd_logging:696      :   [[ False == \T\r\u\e ]]
+ functions:setup_systemd_logging:700      :   pidstr='(pid=%(process)d) '
+ functions:setup_systemd_logging:702      :   iniset /etc/neutron/neutron.conf DEFAULT logging_debug_format_suffix '{{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}}'
+ functions:setup_systemd_logging:704      :   iniset /etc/neutron/neutron.conf DEFAULT logging_context_format_string '%(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s'
+ functions:setup_systemd_logging:705      :   iniset /etc/neutron/neutron.conf DEFAULT logging_default_format_string '%(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s'
+ functions:setup_systemd_logging:706      :   iniset /etc/neutron/neutron.conf DEFAULT logging_exception_prefix 'ERROR %(name)s %(instance)s'
+ lib/neutron-legacy:_configure_neutron_common:800 :   is_service_enabled tls-proxy
+ functions-common:is_service_enabled:1998 :   return 1
+ lib/neutron-legacy:_configure_neutron_common:806 :   _neutron_setup_rootwrap
+ lib/neutron-legacy:_neutron_setup_rootwrap:949 :   [[ True == \F\a\l\s\e ]]
+ lib/neutron-legacy:_neutron_setup_rootwrap:953 :   Q_CONF_ROOTWRAP_D=/etc/neutron/rootwrap.d
+ lib/neutron-legacy:_neutron_setup_rootwrap:954 :   [[ -d /etc/neutron/rootwrap.d ]]
... skipping 177 lines ...
+ lib/neutron_plugins/openvswitch_agent:neutron_plugin_configure_plugin_agent:36 :   _neutron_ovs_base_setup_bridge br-int
+ lib/neutron_plugins/ovs_base:_neutron_ovs_base_setup_bridge:32 :   local bridge=br-int
+ lib/neutron_plugins/ovs_base:_neutron_ovs_base_setup_bridge:33 :   neutron-ovs-cleanup --config-file /etc/neutron/neutron.conf
INFO neutron.common.config [-] Logging enabled!
INFO neutron.common.config [-] /usr/local/bin/neutron-ovs-cleanup version 19.4.1.dev21
DEBUG neutron.common.config [-] command line: /usr/local/bin/neutron-ovs-cleanup --config-file /etc/neutron/neutron.conf {{(pid=68385) setup_logging /opt/stack/neutron/neutron/common/config.py:112}}
ERROR ovsdbapp.backend.ovs_idl.idlutils [-] Unable to open stream to tcp:127.0.0.1:6640 to retrieve schema: Connection refused
INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', '/usr/local/bin/neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--privsep_context', 'neutron.privileged.ovs_vsctl_cmd', '--privsep_sock_path', '/tmp/tmpjqge11ie/privsep.sock']
INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpjqge11ie/privsep.sock {{(pid=68385) __init__ /usr/local/lib/python3.8/dist-packages/oslo_privsep/daemon.py:363}}
INFO oslo.privsep.daemon [-] privsep daemon starting
INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
... skipping 1305 lines ...
++ functions:setup_systemd_logging:695      :   local pidstr=
++ functions:setup_systemd_logging:696      :   [[ False == \T\r\u\e ]]
++ functions:setup_systemd_logging:700      :   pidstr='(pid=%(process)d) '
++ functions:setup_systemd_logging:702      :   iniset /etc/octavia/octavia.conf DEFAULT logging_debug_format_suffix '{{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}}'
++ functions:setup_systemd_logging:704      :   iniset /etc/octavia/octavia.conf DEFAULT logging_context_format_string '%(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s'
++ functions:setup_systemd_logging:705      :   iniset /etc/octavia/octavia.conf DEFAULT logging_default_format_string '%(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s'
++ functions:setup_systemd_logging:706      :   iniset /etc/octavia/octavia.conf DEFAULT logging_exception_prefix 'ERROR %(name)s %(instance)s'
++ /opt/stack/octavia/devstack/plugin.sh:octavia_configure:302 :   iniset /etc/octavia/octavia.conf DEFAULT debug True
+++ /opt/stack/octavia/devstack/plugin.sh:octavia_configure:305 :   ipv6_unquote 10.0.2.15
+++ functions-common:ipv6_unquote:2140       :   echo 10.0.2.15
+++ functions-common:ipv6_unquote:2140       :   tr -d '[]'
++ /opt/stack/octavia/devstack/plugin.sh:octavia_configure:305 :   iniset /etc/octavia/octavia.conf api_settings bind_host 10.0.2.15
++ /opt/stack/octavia/devstack/plugin.sh:octavia_configure:306 :   iniset /etc/octavia/octavia.conf api_settings api_handler queue_producer
... skipping 120 lines ...
INFO alembic.runtime.migration [-] Running upgrade 9bf4d21caaea -> 4d9cf7d32f2, Insert headers
INFO alembic.runtime.migration [-] Running upgrade 4d9cf7d32f2 -> 36b94648fef8, add timestamp
INFO alembic.runtime.migration [-] Running upgrade 36b94648fef8 -> 62816c232310, Fix migration for MySQL 5.7
INFO alembic.runtime.migration [-] Running upgrade 62816c232310 -> 4a6ec0ab7284, Remove FK constraints on listener_statistics because it will be cross-DB
INFO alembic.runtime.migration [-] Running upgrade 4a6ec0ab7284 -> 82b9402e71fd, Update vip address size
INFO alembic.runtime.migration [-] Running upgrade 82b9402e71fd -> 9b5473976d6d, Add provisioning_status to objects
INFO alembic.runtime.migration [-] Running upgrade 9b5473976d6d -> c11292016060, adding request error number to listener_statistics table
INFO alembic.runtime.migration [-] Running upgrade c11292016060 -> 44a2414dd683, adding name column to member and health monitor
INFO alembic.runtime.migration [-] Running upgrade 44a2414dd683 -> 3f8ff3be828e, create quotas table
INFO alembic.runtime.migration [-] Running upgrade 3f8ff3be828e -> 443fe6676637, Add a column network_id in table vip
INFO alembic.runtime.migration [-] Running upgrade 443fe6676637 -> fc5582da7d8a, create_amphora_build_rate_limit_tables
INFO alembic.runtime.migration [-] Running upgrade fc5582da7d8a -> fac584114642, Add ID column to Healthmonitor table
INFO alembic.runtime.migration [-] Running upgrade fac584114642 -> 034b2dc2f3e0, modernize_l7policy_fields
... skipping 2866 lines ...
++ lib/neutron_plugins/services/l3:_neutron_get_ext_gw_interface:143 :   sudo ovs-vsctl set Bridge br-ex other_config:disable-in-band=true
++ lib/neutron_plugins/services/l3:_neutron_get_ext_gw_interface:145 :   echo br-ex
+ lib/neutron_plugins/services/l3:_neutron_configure_router_v4:337 :   ext_gw_interface=br-ex
+ lib/neutron_plugins/services/l3:_neutron_configure_router_v4:349 :   [[ br-ex != \n\o\n\e ]]
+ lib/neutron_plugins/services/l3:_neutron_configure_router_v4:350 :   local cidr_len=24
+ lib/neutron_plugins/services/l3:_neutron_configure_router_v4:351 :   local 'testcmd=ip -o link | grep -q br-ex'
+ lib/neutron_plugins/services/l3:_neutron_configure_router_v4:352 :   test_with_retry 'ip -o link | grep -q br-ex' 'br-ex creation failed'
+ functions-common:test_with_retry:2280    :   local 'testcmd=ip -o link | grep -q br-ex'
+ functions-common:test_with_retry:2281    :   local 'failmsg=br-ex creation failed'
+ functions-common:test_with_retry:2282    :   local until=10
+ functions-common:test_with_retry:2283    :   local sleep=0.5
+ functions-common:test_with_retry:2285    :   time_start test_with_retry
+ functions-common:time_start:2328         :   local name=test_with_retry
+ functions-common:time_start:2329         :   local start_time=
+ functions-common:time_start:2330         :   [[ -n '' ]]
... skipping 2315 lines ...
/usr/lib/python3/dist-packages/secretstorage/dhcrypto.py:15: CryptographyDeprecationWarning: int_from_bytes is deprecated, use int.from_bytes instead
  from cryptography.utils import int_from_bytes
/usr/lib/python3/dist-packages/secretstorage/util.py:19: CryptographyDeprecationWarning: int_from_bytes is deprecated, use int.from_bytes instead
  from cryptography.utils import int_from_bytes
+++ functions-common:oscwrap:2382            :   return 0
++ /opt/stack/octavia/devstack/plugin.sh:build_octavia_worker_image:89 :   TOKEN=gAAAAABjFZgjOzORikvhIeB12ngpFEOBw_IZzEjvccUaO_F8MoWJhWU1Y9paIQr5UUtt-nHeuxK5zS-e7g-z88FtSz2DBpsmATKRPso7qN_jEHRiW6If5jLzkuxRtpgg-_fBAbeaVOE5-_so5PsfBt6vUEgWt_GipeU2sBXK5tZUYbBdMcDdrnE
++ /opt/stack/octavia/devstack/plugin.sh:build_octavia_worker_image:90 :   die_if_not_set 90 TOKEN 'Keystone failed to get token.'
++ functions-common:die_if_not_set:216      :   local exitcode=0
++ /opt/stack/octavia/devstack/plugin.sh:build_octavia_worker_image:92 :   octavia_dib_tracing_arg=
++ /opt/stack/octavia/devstack/plugin.sh:build_octavia_worker_image:93 :   '[' 1 '!=' 0 ']'
++ /opt/stack/octavia/devstack/plugin.sh:build_octavia_worker_image:94 :   octavia_dib_tracing_arg=-x
++ /opt/stack/octavia/devstack/plugin.sh:build_octavia_worker_image:96 :   [[ -n '' ]]
++ /opt/stack/octavia/devstack/plugin.sh:build_octavia_worker_image:99 :   [[ -n '' ]]
... skipping 1365 lines ...
+ functions:wait_for_compute:493           :   [[ 0 != 0 ]]
+ functions:wait_for_compute:497           :   return 0
+ ./stack.sh:main:1442                     :   service_check
+ functions-common:service_check:1635      :   local service
+ functions-common:service_check:1636      :   for service in ${ENABLED_SERVICES//,/ }
+ functions-common:service_check:1638      :   sudo systemctl is-enabled devstack@rabbit.service
Failed to get unit file state for devstack@rabbit.service: No such file or directory
+ functions-common:service_check:1636      :   for service in ${ENABLED_SERVICES//,/ }
+ functions-common:service_check:1638      :   sudo systemctl is-enabled devstack@mysql.service
Failed to get unit file state for devstack@mysql.service: No such file or directory
+ functions-common:service_check:1636      :   for service in ${ENABLED_SERVICES//,/ }
+ functions-common:service_check:1638      :   sudo systemctl is-enabled devstack@key.service
Failed to get unit file state for devstack@key.service: No such file or directory
+ functions-common:service_check:1636      :   for service in ${ENABLED_SERVICES//,/ }
+ functions-common:service_check:1638      :   sudo systemctl is-enabled devstack@n-api.service
enabled
+ functions-common:service_check:1642      :   sudo systemctl status devstack@n-api.service --no-pager
● devstack@n-api.service - Devstack devstack@n-api.service
     Loaded: loaded (/etc/systemd/system/devstack@n-api.service; enabled; vendor preset: enabled)
... skipping 6 lines ...
             ├─79471 nova-apiuWSGI master
             ├─79472 nova-apiuWSGI worker 1
             └─79473 nova-apiuWSGI worker 2

Sep 05 06:33:50 devstack devstack@n-api.service[79473]: [pid: 79473|app: 0|req: 24/47] 10.0.2.15 () {56 vars in 1201 bytes} [Mon Sep  5 06:33:50 2022] GET /compute/v2.1/flavors/auto/os-flavor-access => generated 21 bytes in 9 msecs (HTTP/1.1 200) 9 headers in 356 bytes (1 switches on core 0)
Sep 05 06:34:18 devstack devstack@n-api.service[79472]: DEBUG nova.api.openstack.wsgi [None req-fd940f09-2eb8-4528-a50d-322459803218 admin admin] Calling method '<bound method ServiceController.index of <nova.api.openstack.compute.services.ServiceController object at 0x7f47a1c9d820>>' {{(pid=79472) _process_stack /opt/stack/nova/nova/api/openstack/wsgi.py:513}}
Sep 05 06:34:18 devstack devstack@n-api.service[79472]: /usr/local/lib/python3.8/dist-packages/oslo_policy/policy.py:1075: UserWarning: Policy os_compute_api:os-services:list failed scope check. The token used to make the request was project scoped but the policy requires ['system'] scope. This behavior may change in the future where using the intended scope is required
Sep 05 06:34:18 devstack devstack@n-api.service[79472]:   warnings.warn(msg)
Sep 05 06:34:18 devstack devstack@n-api.service[79472]: DEBUG oslo_concurrency.lockutils [None req-fd940f09-2eb8-4528-a50d-322459803218 admin admin] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=79472) inner /usr/local/lib/python3.8/dist-packages/oslo_concurrency/lockutils.py:355}}
Sep 05 06:34:18 devstack devstack@n-api.service[79472]: DEBUG oslo_concurrency.lockutils [None req-fd940f09-2eb8-4528-a50d-322459803218 admin admin] Lock "00000000-0000-0000-0000-000000000000" released by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=79472) inner /usr/local/lib/python3.8/dist-packages/oslo_concurrency/lockutils.py:367}}
Sep 05 06:34:18 devstack devstack@n-api.service[79472]: DEBUG oslo_concurrency.lockutils [None req-fd940f09-2eb8-4528-a50d-322459803218 admin admin] Lock "fd0ac89b-f661-4008-b8ef-f84695c7c7a0" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=79472) inner /usr/local/lib/python3.8/dist-packages/oslo_concurrency/lockutils.py:355}}
Sep 05 06:34:18 devstack devstack@n-api.service[79472]: DEBUG oslo_concurrency.lockutils [None req-fd940f09-2eb8-4528-a50d-322459803218 admin admin] Lock "fd0ac89b-f661-4008-b8ef-f84695c7c7a0" released by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=79472) inner /usr/local/lib/python3.8/dist-packages/oslo_concurrency/lockutils.py:367}}
Sep 05 06:34:18 devstack devstack@n-api.service[79472]: INFO nova.api.openstack.requestlog [None req-fd940f09-2eb8-4528-a50d-322459803218 admin admin] 10.0.2.15 "GET /compute/v2.1/os-services?host=devstack&binary=nova-compute" status: 200 len: 192 microversion: 2.1 time: 0.047525
... skipping 20 lines ...
Sep 05 06:33:35 devstack nova-compute[86807]: DEBUG nova.compute.provider_tree [None req-bdd02d8c-0e87-4ca2-a841-2cd852b568fb None None] Inventory has not changed in ProviderTree for provider: 3a3a1575-e870-42ec-b73b-404dc6d4951e {{(pid=86807) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:179}}
Sep 05 06:33:35 devstack nova-compute[86807]: DEBUG nova.scheduler.client.report [None req-bdd02d8c-0e87-4ca2-a841-2cd852b568fb None None] Inventory has not changed for provider 3a3a1575-e870-42ec-b73b-404dc6d4951e based on inventory data: {'MEMORY_MB': {'total': 32109, 'reserved': 512, 'min_unit': 1, 'max_unit': 32109, 'step_size': 1, 'allocation_ratio': 1.5}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 28, 'reserved': 0, 'min_unit': 1, 'max_unit': 28, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=86807) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:894}}
Sep 05 06:33:35 devstack nova-compute[86807]: DEBUG nova.compute.resource_tracker [None req-bdd02d8c-0e87-4ca2-a841-2cd852b568fb None None] Compute_service record updated for devstack:devstack {{(pid=86807) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1003}}
Sep 05 06:33:35 devstack nova-compute[86807]: DEBUG oslo_concurrency.lockutils [None req-bdd02d8c-0e87-4ca2-a841-2cd852b568fb None None] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.089s {{(pid=86807) inner /usr/local/lib/python3.8/dist-packages/oslo_concurrency/lockutils.py:367}}
+ functions-common:service_check:1636      :   for service in ${ENABLED_SERVICES//,/ }
+ functions-common:service_check:1638      :   sudo systemctl is-enabled devstack@n-cond.service
Failed to get unit file state for devstack@n-cond.service: No such file or directory
+ functions-common:service_check:1636      :   for service in ${ENABLED_SERVICES//,/ }
+ functions-common:service_check:1638      :   sudo systemctl is-enabled devstack@n-sch.service
enabled
+ functions-common:service_check:1642      :   sudo systemctl status devstack@n-sch.service --no-pager
● devstack@n-sch.service - Devstack devstack@n-sch.service
     Loaded: loaded (/etc/systemd/system/devstack@n-sch.service; enabled; vendor preset: enabled)
... skipping 68 lines ...
Sep 05 06:32:39 devstack devstack@placement-api.service[80708]: [pid: 80708|app: 0|req: 5/10] 10.0.2.15 () {64 vars in 1463 bytes} [Mon Sep  5 06:32:39 2022] PUT /placement/resource_providers/3a3a1575-e870-42ec-b73b-404dc6d4951e/traits => generated 1324 bytes in 51 msecs (HTTP/1.1 200) 6 headers in 225 bytes (1 switches on core 0)
Sep 05 06:33:35 devstack devstack@placement-api.service[80707]: DEBUG placement.requestlog [req-298e5488-7282-4874-8056-fc4312bb1dd4 req-9af5abcb-61df-4f7b-9e8a-458ebd92243c None None] Starting request: 10.0.2.15 "GET /placement/resource_providers/3a3a1575-e870-42ec-b73b-404dc6d4951e/allocations" {{(pid=80707) __call__ /opt/stack/placement/placement/requestlog.py:55}}
Sep 05 06:33:35 devstack devstack@placement-api.service[80707]: INFO placement.requestlog [req-298e5488-7282-4874-8056-fc4312bb1dd4 req-9af5abcb-61df-4f7b-9e8a-458ebd92243c service placement] 10.0.2.15 "GET /placement/resource_providers/3a3a1575-e870-42ec-b73b-404dc6d4951e/allocations" status: 200 len: 54 microversion: 1.0
Sep 05 06:33:35 devstack devstack@placement-api.service[80707]: [pid: 80707|app: 0|req: 6/11] 10.0.2.15 () {58 vars in 1381 bytes} [Mon Sep  5 06:33:35 2022] GET /placement/resource_providers/3a3a1575-e870-42ec-b73b-404dc6d4951e/allocations => generated 54 bytes in 12 msecs (HTTP/1.1 200) 6 headers in 223 bytes (1 switches on core 0)
+ functions-common:service_check:1636      :   for service in ${ENABLED_SERVICES//,/ }
+ functions-common:service_check:1638      :   sudo systemctl is-enabled devstack@placement-client.service
Failed to get unit file state for devstack@placement-client.service: No such file or directory
+ functions-common:service_check:1636      :   for service in ${ENABLED_SERVICES//,/ }
+ functions-common:service_check:1638      :   sudo systemctl is-enabled devstack@g-api.service
enabled
+ functions-common:service_check:1642      :   sudo systemctl status devstack@g-api.service --no-pager
● devstack@g-api.service - Devstack devstack@g-api.service
     Loaded: loaded (/etc/systemd/system/devstack@g-api.service; enabled; vendor preset: enabled)
... skipping 16 lines ...
Sep 05 06:33:54 devstack devstack@g-api.service[89288]: DEBUG glance.api.middleware.version_negotiation [None req-d07ed26b-481f-4844-a3c8-469206cab750 admin admin] Using url versioning {{(pid=89288) process_request /opt/stack/glance/glance/api/middleware/version_negotiation.py:57}}
Sep 05 06:33:54 devstack devstack@g-api.service[89288]: DEBUG glance.api.middleware.version_negotiation [None req-d07ed26b-481f-4844-a3c8-469206cab750 admin admin] Matched version: v2 {{(pid=89288) process_request /opt/stack/glance/glance/api/middleware/version_negotiation.py:69}}
Sep 05 06:33:54 devstack devstack@g-api.service[89288]: DEBUG glance.api.middleware.version_negotiation [None req-d07ed26b-481f-4844-a3c8-469206cab750 admin admin] new path /v2/images/359d2f1a-0013-4dca-8b63-90ae9bfb3888 {{(pid=89288) process_request /opt/stack/glance/glance/api/middleware/version_negotiation.py:70}}
Sep 05 06:33:54 devstack devstack@g-api.service[89288]: [pid: 89288|app: 0|req: 10/18] 127.0.0.1 () {36 vars in 783 bytes} [Mon Sep  5 06:33:54 2022] GET /v2/images/359d2f1a-0013-4dca-8b63-90ae9bfb3888 => generated 1006 bytes in 32 msecs (HTTP/1.1 200) 4 headers in 158 bytes (1 switches on core 0)
+ functions-common:service_check:1636      :   for service in ${ENABLED_SERVICES//,/ }
+ functions-common:service_check:1638      :   sudo systemctl is-enabled devstack@g-reg.service
Failed to get unit file state for devstack@g-reg.service: No such file or directory
+ functions-common:service_check:1636      :   for service in ${ENABLED_SERVICES//,/ }
+ functions-common:service_check:1638      :   sudo systemctl is-enabled devstack@cinder.service
Failed to get unit file state for devstack@cinder.service: No such file or directory
+ functions-common:service_check:1636      :   for service in ${ENABLED_SERVICES//,/ }
+ functions-common:service_check:1638      :   sudo systemctl is-enabled devstack@c-api.service
enabled
+ functions-common:service_check:1642      :   sudo systemctl status devstack@c-api.service --no-pager
● devstack@c-api.service - Devstack devstack@c-api.service
     Loaded: loaded (/etc/systemd/system/devstack@c-api.service; enabled; vendor preset: enabled)
... skipping 80 lines ...
             ├─80200 neutron-server: api worker (/usr/bin/python3.8 /usr/local/bin/neutron-server --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/ml2/ml2_conf.ini)
             ├─80201 neutron-server: rpc worker (/usr/bin/python3.8 /usr/local/bin/neutron-server --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/ml2/ml2_conf.ini)
             └─80202 neutron-server: rpc worker (/usr/bin/python3.8 /usr/local/bin/neutron-server --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/ml2/ml2_conf.ini)

Sep 05 06:33:51 devstack neutron-server[80201]: DEBUG neutron.plugins.ml2.plugin [None req-80d757d6-5fd1-45cf-8b91-74c38c700bed None None] Current status of the port 7fa68daf-a3f3-4d7c-87e8-e1e3fa95d75e is: ACTIVE; New status is: ACTIVE {{(pid=80201) _update_individual_port_db_status /opt/stack/neutron/neutron/plugins/ml2/plugin.py:2250}}
Sep 05 06:33:51 devstack neutron-server[80199]: DEBUG neutron.wsgi [-] (80199) accepted ('10.0.2.15', 60886) {{(pid=80199) server /usr/local/lib/python3.8/dist-packages/eventlet/wsgi.py:982}}
Sep 05 06:33:51 devstack neutron-server[80199]: INFO neutron.api.v2.resource [None req-a9a56b14-2509-47b4-b9ad-37598155de68 admin admin] show failed (client error): The resource could not be found.
Sep 05 06:33:51 devstack neutron-server[80199]: INFO neutron.wsgi [None req-a9a56b14-2509-47b4-b9ad-37598155de68 admin admin] 10.0.2.15 "GET /v2.0/security-groups/lb-mgmt-sec-grp HTTP/1.1" status: 404  len: 331 time: 0.0271063
Sep 05 06:33:52 devstack neutron-server[80199]: INFO neutron.wsgi [None req-c6255d20-e41e-4241-ac71-95df75656d2e admin admin] 10.0.2.15 "GET /v2.0/security-groups?name=lb-mgmt-sec-grp HTTP/1.1" status: 200  len: 3532 time: 0.0194645
Sep 05 06:34:08 devstack neutron-server[80201]: DEBUG neutron_lib.callbacks.manager [None req-59e37392-11dc-4bc4-8e44-391c2095ca23 None None] Notify callbacks ['neutron.services.segments.db._update_segment_host_mapping_for_agent-8760377083832', 'neutron.plugins.ml2.plugin.Ml2Plugin._retry_binding_revived_agents-1669544'] for agent, after_update {{(pid=80201) _notify_loop /usr/local/lib/python3.8/dist-packages/neutron_lib/callbacks/manager.py:192}}
Sep 05 06:34:09 devstack neutron-server[80201]: DEBUG neutron_lib.callbacks.manager [None req-dfbbc9f4-b99c-460a-b25e-f5a787a49e9c None None] Notify callbacks ['neutron.services.segments.db._update_segment_host_mapping_for_agent-8760377083832', 'neutron.plugins.ml2.plugin.Ml2Plugin._retry_binding_revived_agents-1669544'] for agent, after_update {{(pid=80201) _notify_loop /usr/local/lib/python3.8/dist-packages/neutron_lib/callbacks/manager.py:192}}
Sep 05 06:34:09 devstack neutron-server[80201]: DEBUG neutron_lib.callbacks.manager [None req-de6d43d9-2a1c-4a1a-a478-fd9ebebb7aee None None] Notify callbacks ['neutron.services.segments.db._update_segment_host_mapping_for_agent-8760377083832', 'neutron.plugins.ml2.plugin.Ml2Plugin._retry_binding_revived_agents-1669544'] for agent, after_update {{(pid=80201) _notify_loop /usr/local/lib/python3.8/dist-packages/neutron_lib/callbacks/manager.py:192}}
Sep 05 06:34:13 devstack neutron-server[80201]: DEBUG neutron_lib.callbacks.manager [None req-80d757d6-5fd1-45cf-8b91-74c38c700bed None None] Notify callbacks ['neutron.services.segments.db._update_segment_host_mapping_for_agent-8760377083832', 'neutron.plugins.ml2.plugin.Ml2Plugin._retry_binding_revived_agents-1669544'] for agent, after_update {{(pid=80201) _notify_loop /usr/local/lib/python3.8/dist-packages/neutron_lib/callbacks/manager.py:192}}
... skipping 41 lines ...
             ├─83096 /usr/bin/python3.8 /usr/local/bin/privsep-helper --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/dhcp_agent.ini --privsep_context neutron.privileged.link_cmd --privsep_sock_path /tmp/tmp2f9j9kpx/privsep.sock
             ├─83223 sudo /usr/local/bin/neutron-rootwrap-daemon /etc/neutron/rootwrap.conf
             ├─83224 /usr/bin/python3.8 /usr/local/bin/neutron-rootwrap-daemon /etc/neutron/rootwrap.conf
             ├─83234 dnsmasq --no-hosts  --pid-file=/opt/stack/data/neutron/dhcp/e82d08bb-a45a-40cb-827a-47817ae5eeb6/pid --dhcp-hostsfile=/opt/stack/data/neutron/dhcp/e82d08bb-a45a-40cb-827a-47817ae5eeb6/host --addn-hosts=/opt/stack/data/neutron/dhcp/e82d08bb-a45a-40cb-827a-47817ae5eeb6/addn_hosts --dhcp-optsfile=/opt/stack/data/neutron/dhcp/e82d08bb-a45a-40cb-827a-47817ae5eeb6/opts --dhcp-leasefile=/opt/stack/data/neutron/dhcp/e82d08bb-a45a-40cb-827a-47817ae5eeb6/leases --dhcp-match=set:ipxe,175 --dhcp-userclass=set:ipxe6,iPXE --local-service --bind-dynamic --dhcp-range=set:subnet-3e8883b7-3e13-443e-8fde-130228fd23c9,10.1.0.0,static,255.255.255.192,86400s --dhcp-option-force=option:mtu,1380 --dhcp-lease-max=64 --conf-file=/dev/null --domain=openstacklocal
             └─92154 dnsmasq --no-hosts  --pid-file=/opt/stack/data/neutron/dhcp/e6a9ade1-2da8-46b5-ae62-333e60b2d570/pid --dhcp-hostsfile=/opt/stack/data/neutron/dhcp/e6a9ade1-2da8-46b5-ae62-333e60b2d570/host --addn-hosts=/opt/stack/data/neutron/dhcp/e6a9ade1-2da8-46b5-ae62-333e60b2d570/addn_hosts --dhcp-optsfile=/opt/stack/data/neutron/dhcp/e6a9ade1-2da8-46b5-ae62-333e60b2d570/opts --dhcp-leasefile=/opt/stack/data/neutron/dhcp/e6a9ade1-2da8-46b5-ae62-333e60b2d570/leases --dhcp-match=set:ipxe,175 --dhcp-userclass=set:ipxe6,iPXE --local-service --bind-dynamic --dhcp-range=set:subnet-c3fe6159-4fc3-4bc6-878e-ce4c9840ab26,10.51.0.0,static,255.255.0.0,86400s --dhcp-option-force=option:mtu,1380 --dhcp-lease-max=65536 --conf-file=/dev/null --domain=openstacklocal

Sep 05 06:33:46 devstack neutron-dhcp-agent[81587]: DEBUG neutron.agent.linux.utils [-] Unable to access /opt/stack/data/neutron/external/pids/e6a9ade1-2da8-46b5-ae62-333e60b2d570.pid.haproxy; Error: [Errno 2] No such file or directory: '/opt/stack/data/neutron/external/pids/e6a9ade1-2da8-46b5-ae62-333e60b2d570.pid.haproxy' {{(pid=81587) get_value_from_file /opt/stack/neutron/neutron/agent/linux/utils.py:252}}
Sep 05 06:33:46 devstack neutron-dhcp-agent[81587]: DEBUG oslo.privsep.daemon [-] privsep: reply[77aef7fd-05a3-4a66-8a06-bed82c6411ad]: (4, None) {{(pid=83005) _call_back /usr/local/lib/python3.8/dist-packages/oslo_privsep/daemon.py:502}}
Sep 05 06:33:46 devstack neutron-dhcp-agent[81587]: DEBUG oslo.privsep.daemon [-] privsep: reply[410b5ab1-f40b-47b5-b3bd-476c5d6ff09d]: (4, None) {{(pid=83005) _call_back /usr/local/lib/python3.8/dist-packages/oslo_privsep/daemon.py:502}}
Sep 05 06:33:46 devstack neutron-dhcp-agent[81587]: INFO neutron.agent.dhcp.agent [None req-9d879b8c-b0cc-4f9b-8fbf-4a71a012c44a None None] DHCP configuration for ports {'7fa68daf-a3f3-4d7c-87e8-e1e3fa95d75e'} is completed
Sep 05 06:33:54 devstack dnsmasq-dhcp[92154]: DHCPDISCOVER(tape06c52c6-42) fa:16:3e:96:26:51
Sep 05 06:33:54 devstack dnsmasq-dhcp[92154]: DHCPOFFER(tape06c52c6-42) 10.51.0.41 fa:16:3e:96:26:51
Sep 05 06:33:54 devstack dnsmasq-dhcp[92154]: DHCPREQUEST(tape06c52c6-42) 10.51.0.41 fa:16:3e:96:26:51
... skipping 53 lines ...
Sep 05 06:32:09 devstack neutron-metadata-agent[82397]: DEBUG oslo_service.service [None req-de6d43d9-2a1c-4a1a-a478-fd9ebebb7aee None None] oslo_messaging_notifications.retry = -1 {{(pid=82397) log_opt_values /usr/local/lib/python3.8/dist-packages/oslo_config/cfg.py:2609}}
Sep 05 06:32:09 devstack neutron-metadata-agent[82397]: DEBUG oslo_service.service [None req-de6d43d9-2a1c-4a1a-a478-fd9ebebb7aee None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=82397) log_opt_values /usr/local/lib/python3.8/dist-packages/oslo_config/cfg.py:2609}}
Sep 05 06:32:09 devstack neutron-metadata-agent[82397]: DEBUG oslo_service.service [None req-de6d43d9-2a1c-4a1a-a478-fd9ebebb7aee None None] oslo_messaging_notifications.transport_url = **** {{(pid=82397) log_opt_values /usr/local/lib/python3.8/dist-packages/oslo_config/cfg.py:2609}}
Sep 05 06:32:09 devstack neutron-metadata-agent[82397]: DEBUG oslo_service.service [None req-de6d43d9-2a1c-4a1a-a478-fd9ebebb7aee None None] ******************************************************************************** {{(pid=82397) log_opt_values /usr/local/lib/python3.8/dist-packages/oslo_config/cfg.py:2613}}
+ functions-common:service_check:1636      :   for service in ${ENABLED_SERVICES//,/ }
+ functions-common:service_check:1638      :   sudo systemctl is-enabled devstack@octavia.service
Failed to get unit file state for devstack@octavia.service: No such file or directory
+ functions-common:service_check:1636      :   for service in ${ENABLED_SERVICES//,/ }
+ functions-common:service_check:1638      :   sudo systemctl is-enabled devstack@o-cw.service
enabled
+ functions-common:service_check:1642      :   sudo systemctl status devstack@o-cw.service --no-pager
● devstack@o-cw.service - Devstack devstack@o-cw.service
     Loaded: loaded (/etc/systemd/system/devstack@o-cw.service; enabled; vendor preset: enabled)
... skipping 89 lines ...
Sep 05 06:33:58 devstack devstack@o-api.service[93875]: WSGI app 0 (mountpoint='') ready in 3 seconds on interpreter 0x55b7aec34ae0 pid: 93875 (default app)
Sep 05 06:33:58 devstack devstack@o-api.service[93874]: WARNING keystonemiddleware.auth_token [-] AuthToken middleware is set with keystone_authtoken.service_token_roles_required set to False. This is backwards compatible but deprecated behaviour. Please set this to True.
Sep 05 06:33:58 devstack devstack@o-api.service[93874]: WARNING keystonemiddleware.auth_token [-] Configuring www_authenticate_uri to point to the public identity endpoint is required; clients may not be able to authenticate against an admin endpoint
Sep 05 06:33:58 devstack devstack@o-api.service[93874]: WSGI app 0 (mountpoint='') ready in 3 seconds on interpreter 0x55b7aec34ae0 pid: 93874 (default app)
+ functions-common:service_check:1636      :   for service in ${ENABLED_SERVICES//,/ }
+ functions-common:service_check:1638      :   sudo systemctl is-enabled devstack@barbican-vault.service
Failed to get unit file state for devstack@barbican-vault.service: No such file or directory
+ functions-common:service_check:1636      :   for service in ${ENABLED_SERVICES//,/ }
+ functions-common:service_check:1638      :   sudo systemctl is-enabled devstack@barbican.service
Failed to get unit file state for devstack@barbican.service: No such file or directory
+ functions-common:service_check:1636      :   for service in ${ENABLED_SERVICES//,/ }
+ functions-common:service_check:1638      :   sudo systemctl is-enabled devstack@n-super-cond.service
enabled
+ functions-common:service_check:1642      :   sudo systemctl status devstack@n-super-cond.service --no-pager
● devstack@n-super-cond.service - Devstack devstack@n-super-cond.service
     Loaded: loaded (/etc/systemd/system/devstack@n-super-cond.service; enabled; vendor preset: enabled)
... skipping 1604 lines ...
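The service_check trace above (functions-common lines 1636-1642) walks every entry in ENABLED_SERVICES, tolerates units that were never installed ("Failed to get unit file state ..."), and dumps systemctl status for the units that are enabled. A minimal bash sketch of that loop, reconstructed from the xtrace rather than copied from the devstack source, looks like this:

    # Approximate shape of devstack's service_check loop; the variable names and
    # error handling are assumptions based on the xtrace above, not the real source.
    for service in ${ENABLED_SERVICES//,/ }; do
        unit="devstack@${service}.service"
        # Units such as devstack@octavia.service may not exist on this host; skip them.
        sudo systemctl is-enabled "$unit" || continue
        # For enabled units, print the full status so failures land in the job log.
        sudo systemctl status "$unit" --no-pager
    done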
    "port": 22,
    "search_regex": "OpenSSH",
    "state": "started"
}

TASK [install-k3s : Wait until k3s installation is finished] *******************
FAILED - RETRYING: Wait until k3s installation is finished (100 retries left).
FAILED - RETRYING: Wait until k3s installation is finished (99 retries left).
FAILED - RETRYING: Wait until k3s installation is finished (98 retries left).
FAILED - RETRYING: Wait until k3s installation is finished (97 retries left).
FAILED - RETRYING: Wait until k3s installation is finished (96 retries left).
FAILED - RETRYING: Wait until k3s installation is finished (95 retries left).
FAILED - RETRYING: Wait until k3s installation is finished (94 retries left).
FAILED - RETRYING: Wait until k3s installation is finished (93 retries left).
FAILED - RETRYING: Wait until k3s installation is finished (92 retries left).
changed: [34.150.188.9] => {
    "attempts": 10,
    "changed": true,
    "cmd": "ssh -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -i /root/.ssh/id_rsa ubuntu@172.24.5.192 sudo chmod 777 /etc/rancher/k3s/k3s.yaml\n",
    "delta": "0:00:00.950320",
    "end": "2022-09-05 06:37:08.474008",
... skipping 29 lines ...
+ curl -sLO# https://storage.googleapis.com/kubernetes-release/release/v1.25.0/bin/linux/amd64/kubectl
+ chmod +x ./kubectl
+ sudo mv ./kubectl /usr/local/bin/kubectl
+ kubectl config set-cluster default --server=https://172.24.5.192:6443 --kubeconfig /root/.kube/config

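The "Wait until k3s installation is finished" task above succeeds on attempt 10 after nine retries: it keeps re-running the ssh/chmod command from the task result until /etc/rancher/k3s/k3s.yaml exists and is readable. A bash equivalent of that Ansible retries/delay pattern (illustrative only; the command and host are taken from the task output, the 10-second delay is an assumption):

    # Retry the kubeconfig permission fix until the k3s server has written it.
    for attempt in $(seq 1 100); do
        if ssh -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no \
               -i /root/.ssh/id_rsa ubuntu@172.24.5.192 \
               sudo chmod 777 /etc/rancher/k3s/k3s.yaml; then
            break
        fi
        echo "FAILED - RETRYING: Wait until k3s installation is finished ($((100 - attempt)) retries left)."
        sleep 10   # delay between retries; the playbook's actual value is not shown in this log
    done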
TASK [install-k3s : Wait for k8s node ready] ***********************************
FAILED - RETRYING: Wait for k8s node ready (12 retries left).
changed: [34.150.188.9] => {
    "attempts": 2,
    "changed": true,
    "cmd": "kubectl --kubeconfig /root/.kube/config get node | grep \" Ready \" | wc -l\n",
    "delta": "0:00:00.057951",
    "end": "2022-09-05 06:37:22.177624",
... skipping 644 lines ...
STDERR:

+ cat
+ kubectl apply -f -

TASK [install-cpo-occm : Wait for openstack-cloud-controller-manager up and running] ***
FAILED - RETRYING: Wait for openstack-cloud-controller-manager up and running (24 retries left).
changed: [34.150.188.9] => {
    "attempts": 2,
    "changed": true,
    "cmd": "kubectl -n kube-system get pod | grep openstack-cloud-controller-manager | grep Running\n",
    "delta": "0:00:00.056813",
    "end": "2022-09-05 06:39:34.298788",
... skipping 2 lines ...
}

STDOUT:

openstack-cloud-controller-manager-6fc58bf899-nnzgv   1/1     Running   0          7s

TASK [install-cpo-occm : Describe failed openstack-cloud-controller-manager] ***
skipping: [34.150.188.9] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [install-cpo-occm : Log failed openstack-cloud-controller-manager deployment] ***
skipping: [34.150.188.9] => {}

TASK [install-cpo-occm : Show openstack-cloud-controller-manager pod logs] *****
skipping: [34.150.188.9] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
... skipping 410 lines ...
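The three conditional tasks above are skipped because the openstack-cloud-controller-manager pod reached Running on the second attempt; on a failed rollout they would collect diagnostics instead. A hedged sketch of what such diagnostics typically amount to (the playbook's exact commands are not shown in this log):

    # Hypothetical diagnostics for a failed openstack-cloud-controller-manager rollout.
    kubectl -n kube-system get pod | grep openstack-cloud-controller-manager
    kubectl -n kube-system describe deployment openstack-cloud-controller-manager
    kubectl -n kube-system logs deployment/openstack-cloud-controller-manager --tail=200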
+ tests/e2e/cloudprovider/test-lb-service.sh:test_basic:261 :   wait_address_accessible 172.24.5.125
+ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:116 :   local addr=172.24.5.125
++ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:118 :   date +%s
+ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:118 :   end=1662360347
+ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:119 :   true
+ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:120 :   curl -sS http://172.24.5.125
curl: (7) Failed to connect to 172.24.5.125 port 80: Connection refused
+ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:121 :   '[' 7 -eq 0 ']'
+ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:124 :   sleep 5
... repeated retry: curl -sS http://172.24.5.125 fails with "curl: (7) Failed to connect to 172.24.5.125 port 80: Connection refused", sleep 5, deadline re-checked (now=1662360052 through now=1662360142); 147 near-identical lines condensed ...
+ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:119 :   true
... skipping 95 lines ...
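The retry traces above come from wait_address_accessible in tests/e2e/cloudprovider/test-lb-service.sh: the helper curls the load balancer VIP every five seconds until the request succeeds or a deadline passes. An approximate reconstruction from the xtrace (the ~300-second timeout and the failure handling are assumptions, not the verbatim script):

    # Reconstructed from the xtrace above; not a verbatim copy of the test script.
    function wait_address_accessible {
        local addr=$1
        local end=$(( $(date +%s) + 300 ))   # deadline; actual timeout value assumed
        while true; do
            # Succeed as soon as the LB VIP answers on port 80.
            curl -sS "http://${addr}" && return 0
            sleep 5
            local now
            now=$(date +%s)
            if [ "$now" -gt "$end" ]; then
                echo "Timed out waiting for ${addr}"   # assumed failure message
                return 1
            fi
        done
    }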
+ tests/e2e/cloudprovider/test-lb-service.sh:test_forwarded:309 :   wait_address_accessible 172.24.5.64
+ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:116 :   local addr=172.24.5.64
++ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:118 :   date +%s
+ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:118 :   end=1662360498
+ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:119 :   true
+ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:120 :   curl -sS http://172.24.5.64
curl: (7) Failed to connect to 172.24.5.64 port 80: Connection refused
+ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:121 :   '[' 7 -eq 0 ']'
+ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:124 :   sleep 5
++ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:125 :   date +%s
+ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:125 :   now=1662360203
... repeated retry: curl -sS http://172.24.5.64 fails with "curl: (7) Failed to connect to 172.24.5.64 port 80: Connection refused", sleep 5, deadline re-checked (now=1662360203 through now=1662360293); 145 near-identical lines condensed ...
+ tests/e2e/cloudprovider/test-lb-service.sh:wait_address_accessible:119 :   true
... skipping 101 lines ...
        "operating_status": "ONLINE",
        "provisioning_status": "ACTIVE",
        "listeners": [
            {
                "id": "652699f8-cd98-401d-8445-2164336584c4",
                "name": "",
                "operating_status": "ERROR",
                "provisioning_status": "PENDING_CREATE",
                "pools": [
                    {
                        "id": "80b6bc7e-2d9d-434e-939d-51cc9340d69c",
                        "name": "TCP_80_pool",
                        "provisioning_status": "PENDING_CREATE",
... skipping 11 lines ...
                    }
                ]
            },
            {
                "id": "62c9312d-827b-4729-a4d9-2e33c2eea740",
                "name": "",
                "operating_status": "ERROR",
                "provisioning_status": "PENDING_CREATE",
                "pools": [
                    {
                        "id": "5d0413a9-39d2-4b0c-bc79-eb4b8f47d1aa",
                        "name": "TCP_8080_pool",
                        "provisioning_status": "PENDING_CREATE",
... skipping 11 lines ...
                    }
                ]
            }
        ]
    }
}'
++ tests/e2e/cloudprovider/test-lb-service.sh:test_update_port:367 :   echo '{' '"loadbalancer":' '{' '"id":' '"6ad86044-e7e9-41cb-ba0b-0d79cdcf115c",' '"name":' '"kube_service_kubernetes_octavia-lb-test_test-update-port",' '"operating_status":' '"ONLINE",' '"provisioning_status":' '"ACTIVE",' '"listeners":' '[' '{' '"id":' '"652699f8-cd98-401d-8445-2164336584c4",' '"name":' '"",' '"operating_status":' '"ERROR",' '"provisioning_status":' '"PENDING_CREATE",' '"pools":' '[' '{' '"id":' '"80b6bc7e-2d9d-434e-939d-51cc9340d69c",' '"name":' '"TCP_80_pool",' '"provisioning_status":' '"PENDING_CREATE",' '"operating_status":' '"OFFLINE",' '"members":' '[' '{' '"id":' '"cea536cb-f9f9-400d-a5f4-fb27682e61d7",' '"name":' '"k3s-master",' '"operating_status":' '"NO_MONITOR",' '"provisioning_status":' '"PENDING_CREATE",' '"address":' '"10.1.0.9",' '"protocol_port":' 30935 '}' ']' '}' ']' '},' '{' '"id":' '"62c9312d-827b-4729-a4d9-2e33c2eea740",' '"name":' '"",' '"operating_status":' '"ERROR",' '"provisioning_status":' '"PENDING_CREATE",' '"pools":' '[' '{' '"id":' '"5d0413a9-39d2-4b0c-bc79-eb4b8f47d1aa",' '"name":' '"TCP_8080_pool",' '"provisioning_status":' '"PENDING_CREATE",' '"operating_status":' '"OFFLINE",' '"members":' '[' '{' '"id":' '"7816e813-8563-4645-8dbc-4b26a905460c",' '"name":' '"k3s-master",' '"operating_status":' '"NO_MONITOR",' '"provisioning_status":' '"PENDING_CREATE",' '"address":' '"10.1.0.9",' '"protocol_port":' 31617 '}' ']' '}' ']' '}' ']' '}' '}'
++ tests/e2e/cloudprovider/test-lb-service.sh:test_update_port:367 :   jq '.loadbalancer.listeners | length'
+ tests/e2e/cloudprovider/test-lb-service.sh:test_update_port:367 :   listener_count=2
++ tests/e2e/cloudprovider/test-lb-service.sh:test_update_port:368 :   echo '{' '"loadbalancer":' '{' '"id":' '"6ad86044-e7e9-41cb-ba0b-0d79cdcf115c",' '"name":' '"kube_service_kubernetes_octavia-lb-test_test-update-port",' '"operating_status":' '"ONLINE",' '"provisioning_status":' '"ACTIVE",' '"listeners":' '[' '{' '"id":' '"652699f8-cd98-401d-8445-2164336584c4",' '"name":' '"",' '"operating_status":' '"ERROR",' '"provisioning_status":' '"PENDING_CREATE",' '"pools":' '[' '{' '"id":' '"80b6bc7e-2d9d-434e-939d-51cc9340d69c",' '"name":' '"TCP_80_pool",' '"provisioning_status":' '"PENDING_CREATE",' '"operating_status":' '"OFFLINE",' '"members":' '[' '{' '"id":' '"cea536cb-f9f9-400d-a5f4-fb27682e61d7",' '"name":' '"k3s-master",' '"operating_status":' '"NO_MONITOR",' '"provisioning_status":' '"PENDING_CREATE",' '"address":' '"10.1.0.9",' '"protocol_port":' 30935 '}' ']' '}' ']' '},' '{' '"id":' '"62c9312d-827b-4729-a4d9-2e33c2eea740",' '"name":' '"",' '"operating_status":' '"ERROR",' '"provisioning_status":' '"PENDING_CREATE",' '"pools":' '[' '{' '"id":' '"5d0413a9-39d2-4b0c-bc79-eb4b8f47d1aa",' '"name":' '"TCP_8080_pool",' '"provisioning_status":' '"PENDING_CREATE",' '"operating_status":' '"OFFLINE",' '"members":' '[' '{' '"id":' '"7816e813-8563-4645-8dbc-4b26a905460c",' '"name":' '"k3s-master",' '"operating_status":' '"NO_MONITOR",' '"provisioning_status":' '"PENDING_CREATE",' '"address":' '"10.1.0.9",' '"protocol_port":' 31617 '}' ']' '}' ']' '}' ']' '}' '}'
++ tests/e2e/cloudprovider/test-lb-service.sh:test_update_port:368 :   jq '.loadbalancer.listeners | .[].pools | .[].members | .[].protocol_port'
++ tests/e2e/cloudprovider/test-lb-service.sh:test_update_port:368 :   uniq
++ tests/e2e/cloudprovider/test-lb-service.sh:test_update_port:368 :   tr '\n' ' '
+ tests/e2e/cloudprovider/test-lb-service.sh:test_update_port:368 :   member_ports='30935 31617 '
++ tests/e2e/cloudprovider/test-lb-service.sh:test_update_port:369 :   kubectl -n octavia-lb-test get svc test-update-port -o json
++ tests/e2e/cloudprovider/test-lb-service.sh:test_update_port:369 :   jq '.spec.ports | .[].nodePort'
... skipping 1003 lines ...
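test_update_port compares the protocol ports recorded on the load balancer's pool members with the Service's current nodePorts. Condensed, the comparison traced above amounts to the following sketch (lb_status is a hypothetical variable holding the JSON status tree fetched from Octavia; the final comparison is elided in this log and shown here only as an assumption):

    # Count listeners and collect the member protocol ports from the LB status tree.
    listener_count=$(echo "$lb_status" | jq '.loadbalancer.listeners | length')
    member_ports=$(echo "$lb_status" \
        | jq '.loadbalancer.listeners | .[].pools | .[].members | .[].protocol_port' \
        | uniq | tr '\n' ' ')
    # Collect the Service's current nodePorts for comparison.
    node_ports=$(kubectl -n octavia-lb-test get svc test-update-port -o json \
        | jq '.spec.ports | .[].nodePort' | uniq | tr '\n' ' ')
    # Assumed check: the test keeps retrying while the member ports lag behind the nodePorts.
    [ "$member_ports" = "$node_ports" ] || echo "LB members still use old ports: $member_ports vs $node_ports"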
        "WatchdogTimestampMonotonic": "0",
        "WatchdogUSec": "0"
    }
}

PLAY RECAP *********************************************************************
34.150.188.9               : ok=51   changed=41   unreachable=0    failed=0    skipped=5    rescued=0    ignored=1   

+ exit_code=0
+ scp -i /root/.ssh/google_compute_engine -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -r root@34.150.188.9:/var/log/dib-build/amphora-x64-haproxy.qcow2.log /logs/artifacts/logs/octavia-image-build.log
Warning: Permanently added '34.150.188.9' (ECDSA) to the list of known hosts.
scp: /var/log/dib-build/amphora-x64-haproxy.qcow2.log: No such file or directory
+ true
... skipping 15 lines ...