HA OpenStack Ussuri unable to log into Dashboard

Hey everyone, please give me some advice.

For a while now I’ve been attempting to build an HA implementation of the OpenStack Ussuri base bundle using MAAS 2.8 and Juju 2.8.

I get to the point where the whole model is stable, but when I try to log into the Dashboard I get ERR_EMPTY_RESPONSE.

Here’s my juju status and my juju bundle.

I’m really lost and don’t know what to do next.
The only thing of interest I’ve managed to find is in /var/log/apache2/error.log on the Dashboard units. It seems like Neutron is locking everything up:

[Mon Aug 03 23:44:49.780795 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668] Recoverable error: Connection to neutron failed: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
[Mon Aug 03 23:45:19.808348 2020] [wsgi:error] [pid 131366:tid 139651134682880] [remote 192.168.51.32:41508] Recoverable error: Connection to neutron failed: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
[Mon Aug 03 23:46:19.812267 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668] Internal Server Error: /horizon/project/
[Mon Aug 03 23:46:19.812334 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668] Traceback (most recent call last):
[Mon Aug 03 23:46:19.812345 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 665, in urlopen
[Mon Aug 03 23:46:19.812354 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]     httplib_response = self._make_request(
[Mon Aug 03 23:46:19.812361 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 421, in _make_request
[Mon Aug 03 23:46:19.812369 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]     six.raise_from(e, None)
[Mon Aug 03 23:46:19.812376 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "<string>", line 3, in raise_from
[Mon Aug 03 23:46:19.812384 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 416, in _make_request
[Mon Aug 03 23:46:19.812392 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]     httplib_response = conn.getresponse()
[Mon Aug 03 23:46:19.812415 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "/usr/lib/python3.8/http/client.py", line 1332, in getresponse
[Mon Aug 03 23:46:19.812423 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]     response.begin()
[Mon Aug 03 23:46:19.812430 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "/usr/lib/python3.8/http/client.py", line 303, in begin
[Mon Aug 03 23:46:19.812437 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]     version, status, reason = self._read_status()
[Mon Aug 03 23:46:19.812443 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "/usr/lib/python3.8/http/client.py", line 272, in _read_status
[Mon Aug 03 23:46:19.812450 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]     raise RemoteDisconnected("Remote end closed connection without"
[Mon Aug 03 23:46:19.812457 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668] http.client.RemoteDisconnected: Remote end closed connection without response
[Mon Aug 03 23:46:19.812464 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]
[Mon Aug 03 23:46:19.812470 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668] During handling of the above exception, another exception occurred:
[Mon Aug 03 23:46:19.812477 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]
[Mon Aug 03 23:46:19.812483 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668] Traceback (most recent call last):
[Mon Aug 03 23:46:19.812490 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "/usr/lib/python3/dist-packages/requests/adapters.py", line 439, in send
[Mon Aug 03 23:46:19.812497 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]     resp = conn.urlopen(
[Mon Aug 03 23:46:19.812503 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 719, in urlopen
[Mon Aug 03 23:46:19.812510 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]     retries = retries.increment(
[Mon Aug 03 23:46:19.812517 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 400, in increment
[Mon Aug 03 23:46:19.812524 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]     raise six.reraise(type(error), error, _stacktrace)
[Mon Aug 03 23:46:19.812531 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "/usr/lib/python3/dist-packages/six.py", line 702, in reraise
[Mon Aug 03 23:46:19.812538 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]     raise value.with_traceback(tb)
[Mon Aug 03 23:46:19.812544 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 665, in urlopen
[Mon Aug 03 23:46:19.812551 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]     httplib_response = self._make_request(
[Mon Aug 03 23:46:19.812558 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 421, in _make_request
[Mon Aug 03 23:46:19.812565 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]     six.raise_from(e, None)
[Mon Aug 03 23:46:19.812572 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "<string>", line 3, in raise_from
[Mon Aug 03 23:46:19.812579 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 416, in _make_request
[Mon Aug 03 23:46:19.812604 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]     httplib_response = conn.getresponse()
[Mon Aug 03 23:46:19.812612 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "/usr/lib/python3.8/http/client.py", line 1332, in getresponse
[Mon Aug 03 23:46:19.812619 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]     response.begin()
[Mon Aug 03 23:46:19.812626 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "/usr/lib/python3.8/http/client.py", line 303, in begin
[Mon Aug 03 23:46:19.812633 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]     version, status, reason = self._read_status()
[Mon Aug 03 23:46:19.812640 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]   File "/usr/lib/python3.8/http/client.py", line 272, in _read_status
[Mon Aug 03 23:46:19.812646 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668]     raise RemoteDisconnected("Remote end closed connection without"
[Mon Aug 03 23:46:19.812654 2020] [wsgi:error] [pid 131365:tid 139651134682880] [remote 192.168.51.32:39668] urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

Pinging @openstack-charmers

Where are you getting your Dashboard credentials from?

    juju run --unit keystone/0 leader-get admin_passwd
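If you don’t already have an rc file for the CLI, a minimal admin one for a charm deployment looks roughly like this (a sketch, not verbatim: the domain names match the charm defaults at the time, and the keystone VIP placeholder needs your actual address):

```shell
# Hypothetical admin rc for the openstack CLI -- adjust for your cloud.
export OS_AUTH_URL="http://<keystone-vip>:5000/v3"   # placeholder: your keystone VIP/address
export OS_USERNAME=admin
export OS_PASSWORD=$(juju run --unit keystone/0 leader-get admin_passwd)
export OS_USER_DOMAIN_NAME=admin_domain              # charm default admin domain
export OS_PROJECT_DOMAIN_NAME=admin_domain
export OS_PROJECT_NAME=admin
export OS_IDENTITY_API_VERSION=3
```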

I’d suggest checking the deployment with the CLI. Can you list networks, for example?

openstack network list

Do the entries in the catalogue look ok?

openstack endpoint list

If that looks ok, then have a look at the logs in /var/log/neutron on all the neutron-api units.

To me it looks like neutron-api is locking everything up.

openstack endpoint list returns all the correct endpoints.

openstack network list never completes

If I curl the endpoint URLs, either externally or via localhost on every neutron-api unit, I eventually get:

* Empty reply from server
* Connection #0 to host 127.0.0.1 left intact
curl: (52) Empty reply from server
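For what it’s worth, curl’s `(52) Empty reply from server` and the `RemoteDisconnected` traceback from Horizon are the same symptom: the server accepts the TCP connection and then closes it without sending any HTTP response. A small stdlib-only sketch (hypothetical, purely to illustrate the failure mode) reproduces it locally:

```python
import http.client
import socket
import threading

def hung_up_server(sock):
    """Accept one connection, read the request, then close without
    sending any HTTP response -- what a wedged API worker looks like."""
    conn, _ = sock.accept()
    conn.recv(65536)   # consume the request bytes
    conn.close()       # hang up: no status line, no headers, nothing

def probe(host, port, path="/", timeout=5):
    """Classify what a plain GET against host:port gets back."""
    conn = http.client.HTTPConnection(host, port, timeout=timeout)
    try:
        conn.request("GET", path)
        resp = conn.getresponse()
        return f"HTTP {resp.status}"
    except (http.client.RemoteDisconnected, ConnectionResetError):
        return "empty reply (connection closed without a response)"
    except socket.timeout:
        return "timeout (connection open, but no data)"
    finally:
        conn.close()

# Stand up the misbehaving server on an ephemeral port and probe it.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]
threading.Thread(target=hung_up_server, args=(srv,), daemon=True).start()

result = probe("127.0.0.1", port)
print(result)  # -> empty reply (connection closed without a response)
srv.close()
```

The point being: the connection itself is fine, so this points at the neutron-server workers behind the endpoint rather than at haproxy or the network.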

I’ve been following this thread, and based on what I learned there I deployed ovn-chassis-2 (instead of ovn-chassis-1) in the hope of being able to use bridge mappings to the bond instead of MAC addresses. Does anyone think this could be related?
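For reference, the two charm options involved in that switch look roughly like this (hypothetical values; option names as in the ovn-chassis charm, worth confirming with `juju config ovn-chassis` on your own deployment):

```shell
# Map the Neutron physical network name to a local OVS bridge
juju config ovn-chassis ovn-bridge-mappings="physnet1:br-ex"
# Attach that bridge to the bond by interface name rather than MAC address
juju config ovn-chassis bridge-interface-mappings="br-ex:bond0"
```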

Is there anything useful in neutron-server.log on the neutron-api units?

Can’t really say there is. This is about as interesting as it gets:

2020-08-04 12:18:05.670 241 WARNING keystonemiddleware.auth_token [req-cd335139-b03c-4bd3-8e31-990ea810524f - - - - -] AuthToken middleware is set with keystone_authtoken.service_token_roles_required set to False. This is backwards compatible but deprecated behaviour. Please set this to True.
2020-08-04 12:18:05.674 241 WARNING oslo_config.cfg [req-cd335139-b03c-4bd3-8e31-990ea810524f - - - - -] Deprecated: Option "auth_uri" from group "keystone_authtoken" is deprecated. Use option "www_authenticate_uri" from group "keystone_authtoken".
2020-08-04 12:18:05.680 241 INFO oslo_service.service [req-cd335139-b03c-4bd3-8e31-990ea810524f - - - - -] Starting 4 workers
2020-08-04 12:18:05.699 241 INFO neutron.service [req-cd335139-b03c-4bd3-8e31-990ea810524f - - - - -] Neutron service started, listening on 0.0.0.0:9676
2020-08-04 12:18:05.704 241 INFO oslo_service.service [req-cd335139-b03c-4bd3-8e31-990ea810524f - - - - -] Starting 4 workers
2020-08-04 12:18:05.730 241 INFO oslo_service.service [req-cd335139-b03c-4bd3-8e31-990ea810524f - - - - -] Starting 1 workers
2020-08-04 12:18:05.739 241 INFO oslo_service.service [req-cd335139-b03c-4bd3-8e31-990ea810524f - - - - -] Starting 1 workers
2020-08-04 12:18:05.742 1069 INFO neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn [req-37221e84-c663-4987-ab32-f972f40c0ad6 - - - - -] Getting OvsdbNbOvnIdl for WorkerService with retry
2020-08-04 12:18:05.744 1074 INFO neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn [req-076bc1a8-152a-478a-976d-4e017263b23e - - - - -] Getting OvsdbNbOvnIdl for WorkerService with retry
2020-08-04 12:18:05.748 241 INFO oslo_service.service [req-cd335139-b03c-4bd3-8e31-990ea810524f - - - - -] Starting 1 workers
2020-08-04 12:18:05.751 1067 INFO neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn [req-0a94ca82-859e-459f-bba5-d8ee2a44efcb - - - - -] Getting OvsdbNbOvnIdl for WorkerService with retry
2020-08-04 12:18:05.754 1075 INFO neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn [req-b3451d1e-4b98-420a-80bf-b72b2b19f79d - - - - -] Getting OvsdbNbOvnIdl for WorkerService with retry
2020-08-04 12:18:05.761 1104 INFO neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn [-] Getting OvsdbNbOvnIdl for MaintenanceWorker with retry
2020-08-04 12:18:05.774 241 WARNING oslo_config.cfg [-] Deprecated: Option "auth_uri" from group "keystone_authtoken" is deprecated for removal (The auth_uri option is deprecated in favor of www_authenticate_uri and will be removed in the S  release.).  Its value may be silently ignored in the future.
2020-08-04 12:18:05.781 1091 INFO neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn [req-60382435-4b3c-4c08-8e93-68ee331ab87e - - - - -] Getting OvsdbNbOvnIdl for RpcWorker with retry
2020-08-04 12:18:05.781 1082 INFO neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn [req-e45e4f11-f635-4fb7-8dcf-734941ae7eda - - - - -] Getting OvsdbNbOvnIdl for RpcWorker with retry
2020-08-04 12:18:05.785 1084 INFO neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn [req-fcc97d8d-870c-4392-b802-5a2997b73c2d - - - - -] Getting OvsdbNbOvnIdl for RpcWorker with retry
2020-08-04 12:18:05.788 1102 INFO neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn [req-ee59713c-0eb2-455c-830c-e8835a407efd - - - - -] Getting OvsdbNbOvnIdl for RpcReportsWorker with retry
2020-08-04 12:18:05.793 1095 INFO neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn [req-99ad14b1-53ac-49dc-b39f-754124dbc883 - - - - -] Getting OvsdbNbOvnIdl for RpcWorker with retry
2020-08-04 12:18:05.827 1109 INFO neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn [req-da8f66ef-60a2-4d07-8061-1ca12835b928 - - - - -] Getting OvsdbNbOvnIdl for AllServicesNeutronWorker with retry
2020-08-04 12:18:06.750 1069 INFO neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn [req-37221e84-c663-4987-ab32-f972f40c0ad6 - - - - -] Getting OvsdbNbOvnIdl for WorkerService with retry
2020-08-04 12:18:06.750 1074 INFO neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn [req-076bc1a8-152a-478a-976d-4e017263b23e - - - - -] Getting OvsdbNbOvnIdl for WorkerService with retry
2020-08-04 12:18:06.760 1075 INFO neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn [req-b3451d1e-4b98-420a-80bf-b72b2b19f79d - - - - -] Getting OvsdbNbOvnIdl for WorkerService with retry
2020-08-04 12:18:06.760 1067 INFO neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn [req-0a94ca82-859e-459f-bba5-d8ee2a44efcb - - - - -] Getting OvsdbNbOvnIdl for WorkerService with retry