Name | Machine Type | Up | Locked | Locked Since | Locked By | OS Type | OS Version | Arch | Description
---|---|---|---|---|---|---|---|---|---
mira100.front.sepia.ceph.com | mira | False | False | | | ubuntu | 18.04 | x86_64 | /home/teuthworker/archive/teuthology-2020-07-06_05:55:03-ceph-deploy-nautilus-distro-basic-mira/5202322
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes
---|---|---|---|---|---|---|---|---|---|---|---|---|---
pass | 5202322 | 2020-07-06 05:55:55 | 2020-07-06 07:53:50 | 2020-07-06 08:53:50 | 1:00:00 | 0:16:38 | 0:43:22 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/ubuntu_latest python_versions/python_3 tasks/rbd_import_export} | 4 |
pass | 5202315 | 2020-07-06 05:55:49 | 2020-07-06 07:19:39 | 2020-07-06 08:25:40 | 1:06:01 | 0:37:12 | 0:28:49 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/centos_latest python_versions/python_3 tasks/ceph-admin-commands} | 4 |
fail | 5202307 | 2020-07-06 05:55:42 | 2020-07-06 06:35:38 | 2020-07-06 06:55:37 | 0:19:59 | 0:04:17 | 0:15:42 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/ubuntu_latest python_versions/python_2 tasks/ceph-admin-commands} | 4 |
Failure Reason: Command failed on mira066 with status 2: 'sudo tar cz -f - -C /var/lib/ceph/mon -- . > /tmp/tmp.p3EwXQtpF0'
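Most of the failed jobs in this table share the same symptom: `tar` exiting with status 2 while archiving `/var/lib/ceph/mon`. In GNU tar, exit status 2 means a fatal error, and a plausible cause here is the source directory not existing on a node where the mon daemon was never deployed. A minimal sketch of that behavior (the path below is a deliberately nonexistent stand-in, not the real mon directory):

```python
import subprocess

# GNU tar exits 2 ("fatal error") when it cannot operate at all --
# for example, when the -C target directory is missing. The path is a
# hypothetical stand-in for an absent /var/lib/ceph/mon.
proc = subprocess.run(
    ["tar", "cz", "-f", "-", "-C", "/nonexistent/ceph/mon", "--", "."],
    capture_output=True,
)
print(proc.returncode)  # 2 with GNU tar
```

This is only an illustration of the exit code, not a claim about why the directory was missing on these particular nodes.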
dead | 5202301 | 2020-07-06 05:55:37 | 2020-07-06 05:55:38 | 2020-07-06 06:39:38 | 0:44:00 | 0:03:19 | 0:40:41 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/ubuntu_latest python_versions/python_2 tasks/rbd_import_export} | 4 |
Failure Reason: During disk cleanup, the ceph-cm-ansible teardown on mira016.front.sepia.ceph.com looped `pvremove --force --force --yes` over the node's physical volumes. /dev/sdb and /dev/sdc were wiped successfully (rc 0, with warnings that each PV was still used by VG vg_hdd and that several PVs were "not found or rejected by a filter"), while five further devices reported as `[unknown]` all failed with rc 5 ('Device [unknown] not found.'). The failure-logging callback (ceph-cm-ansible failure_log.py, line 44) then crashed while trying to `yaml.safe_dump` the Ansible result (traceback through yaml/representer.py, ending in represent_undefined): yaml.representer.RepresenterError: "('cannot represent an object', '/dev/sdb')"
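The dead job above did not die in the disk cleanup itself but in the failure-logging callback: PyYAML's `safe_dump` only serializes the basic YAML types, so handing it a result structure containing an arbitrary Python object raises exactly the `RepresenterError` seen in the traceback. A minimal reproduction (the `Device` class is a hypothetical stand-in for whatever non-basic object ended up in the failure dict):

```python
import yaml  # PyYAML

class Device:
    """Hypothetical stand-in for a non-basic object inside the result."""
    def __init__(self, path):
        self.path = path

try:
    # safe_dump only knows plain YAML types (dict, list, str, int, ...);
    # an arbitrary object hits represent_undefined -> RepresenterError.
    yaml.safe_dump({"item": Device("/dev/sdb")})
except yaml.representer.RepresenterError as exc:
    print(exc)
```

A common workaround in logging code is to coerce unknown values with `str()` or `repr()` before dumping.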
pass | 5202299 | 2020-07-06 05:55:35 | 2020-07-06 05:55:37 | 2020-07-06 07:23:39 | 1:28:02 | 0:16:39 | 1:11:23 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/ubuntu_latest python_versions/python_3 tasks/ceph-admin-commands} | 4 |
pass | 5202295 | 2020-07-06 05:55:31 | 2020-07-06 05:55:35 | 2020-07-06 06:25:34 | 0:29:59 | 0:17:15 | 0:12:44 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/ubuntu_latest python_versions/python_3 tasks/rbd_import_export} | 4 |
fail | 5189059 | 2020-06-29 05:55:45 | 2020-06-29 07:32:05 | 2020-06-29 08:24:05 | 0:52:00 | 0:07:30 | 0:44:30 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/centos_latest python_versions/python_2 tasks/rbd_import_export} | 4 |
Failure Reason: Command failed on mira066 with status 2: 'sudo tar cz -f - -C /var/lib/ceph/mon -- . > /tmp/tmp.fuss9J2vev'
pass | 5189049 | 2020-06-29 05:55:36 | 2020-06-29 06:43:58 | 2020-06-29 07:59:59 | 1:16:01 | 0:36:01 | 0:40:00 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/centos_latest python_versions/python_3 tasks/rbd_import_export} | 4 |
fail | 5189045 | 2020-06-29 05:55:33 | 2020-06-29 06:26:00 | 2020-06-29 07:00:00 | 0:34:00 | 0:04:07 | 0:29:53 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/ubuntu_latest python_versions/python_2 tasks/ceph-admin-commands} | 4 |
Failure Reason: Command failed on mira066 with status 2: 'sudo tar cz -f - -C /var/lib/ceph/mon -- . > /tmp/tmp.0GRkYEEZcA'
pass | 5189040 | 2020-06-29 05:55:29 | 2020-06-29 05:55:42 | 2020-06-29 06:43:42 | 0:48:00 | 0:22:09 | 0:25:51 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/centos_latest python_versions/python_3 tasks/ceph-admin-commands} | 4 |
fail | 5170783 | 2020-06-22 05:55:51 | 2020-06-22 07:09:41 | 2020-06-22 08:11:42 | 1:02:01 | 0:07:48 | 0:54:13 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/centos_latest python_versions/python_2 tasks/rbd_import_export} | 4 |
Failure Reason: Command failed on mira101 with status 2: 'sudo tar cz -f - -C /var/lib/ceph/mon -- . > /tmp/tmp.swf6BXLjEj'
pass | 5170774 | 2020-06-22 05:55:46 | 2020-06-22 06:39:40 | 2020-06-22 07:35:41 | 0:56:01 | 0:23:47 | 0:32:14 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/centos_latest python_versions/python_3 tasks/rbd_import_export} | 4 |
fail | 5170770 | 2020-06-22 05:55:43 | 2020-06-22 06:25:40 | 2020-06-22 06:47:39 | 0:21:59 | 0:04:12 | 0:17:47 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/ubuntu_latest python_versions/python_2 tasks/ceph-admin-commands} | 4 |
Failure Reason: Command failed on mira059 with status 2: 'sudo tar cz -f - -C /var/lib/ceph/mon -- . > /tmp/tmp.7WYUnSmNnW'
fail | 5170761 | 2020-06-22 05:55:35 | 2020-06-22 05:55:36 | 2020-06-22 06:31:36 | 0:36:00 | 0:07:46 | 0:28:14 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/centos_latest python_versions/python_2 tasks/rbd_import_export} | 4 |
Failure Reason: Command failed on mira100 with status 2: 'sudo tar cz -f - -C /var/lib/ceph/mon -- . > /tmp/tmp.sMCrJYlZm9'
fail | 5158540 | 2020-06-17 19:37:50 | 2020-06-17 20:03:39 | 2020-06-17 20:39:39 | 0:36:00 | 0:07:52 | 0:28:08 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/centos_latest python_versions/python_2 tasks/rbd_import_export} | 4 |
Failure Reason: Command failed on mira059 with status 2: 'sudo tar cz -f - -C /var/lib/ceph/mon -- . > /tmp/tmp.uBSVa7iL0N'
fail | 5158528 | 2020-06-17 19:37:36 | 2020-06-17 19:37:38 | 2020-06-17 20:15:38 | 0:38:00 | 0:07:37 | 0:30:23 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/centos_latest python_versions/python_2 tasks/ceph-admin-commands} | 4 |
Failure Reason: Command failed on mira100 with status 2: 'sudo tar cz -f - -C /var/lib/ceph/mon -- . > /tmp/tmp.0QVtHe6vKv'
fail | 5157755 | 2020-06-17 13:09:18 | 2020-06-17 13:09:25 | 2020-06-17 13:29:24 | 0:19:59 | 0:03:52 | 0:16:07 | mira | master | ubuntu | 20.04 | upgrade:nautilus-x:stress-split/{0-cluster/{openstack.yaml start.yaml} 1-ceph-install/nautilus.yaml 1.1-pg-log-overrides/normal_pg_log.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-workload/{radosbench.yaml rbd-cls.yaml rbd-import-export.yaml rbd_api.yaml readwrite.yaml rgw_ragweed_prepare.yaml snaps-few-objects.yaml} 5-finish-upgrade.yaml 6-octopus.yaml 7-msgr2.yaml 8-final-workload/{rbd-python.yaml snaps-many-objects.yaml} objectstore/bluestore-bitmap.yaml thrashosds-health.yaml ubuntu_latest.yaml} | 5 |
Failure Reason: Failed to fetch package version from https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=default&distros=ubuntu%2F20.04%2Fx86_64&ref=nautilus
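The `Failed to fetch package version` failures above come from querying Shaman (Ceph's build-status service) for nautilus packages on Ubuntu 20.04; an empty result is consistent with nautilus builds never having been produced for focal (bionic, 18.04, was nautilus's Ubuntu target). The failing URL decodes into plain query parameters, taken verbatim from the text above:

```python
from urllib.parse import urlencode, parse_qs, urlparse

# Rebuild the failing Shaman search URL from its parameters.
params = {
    "status": "ready",
    "project": "ceph",
    "flavor": "default",
    "distros": "ubuntu/20.04/x86_64",  # focal; the ready-build search comes back empty
    "ref": "nautilus",
}
url = "https://shaman.ceph.com/api/search/?" + urlencode(params)
print(url)

# Decoding the URL recovers the same parameters.
decoded = parse_qs(urlparse(url).query)
print(decoded["distros"])  # ['ubuntu/20.04/x86_64']
```

Changing `distros` to `ubuntu/18.04/x86_64` would query the release the suite was presumably meant to run against; whether that is the right fix for these jobs is a scheduling question, not shown here.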
fail | 5157638 | 2020-06-17 12:40:29 | 2020-06-17 12:40:31 | 2020-06-17 13:00:30 | 0:19:59 | 0:04:15 | 0:15:44 | mira | master | ubuntu | 20.04 | upgrade:nautilus-x:stress-split/{0-cluster/{openstack.yaml start.yaml} 1-ceph-install/nautilus.yaml 1.1-pg-log-overrides/normal_pg_log.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-workload/{radosbench.yaml rbd-cls.yaml rbd-import-export.yaml rbd_api.yaml readwrite.yaml rgw_ragweed_prepare.yaml snaps-few-objects.yaml} 5-finish-upgrade.yaml 6-octopus.yaml 7-msgr2.yaml 8-final-workload/{rbd-python.yaml snaps-many-objects.yaml} objectstore/bluestore-bitmap.yaml thrashosds-health.yaml ubuntu_latest.yaml} | 5 |
Failure Reason: Failed to fetch package version from https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=default&distros=ubuntu%2F20.04%2Fx86_64&ref=nautilus
pass | 5156861 | 2020-06-17 05:10:25 | 2020-06-17 05:10:49 | 2020-06-17 05:36:49 | 0:26:00 | 0:08:58 | 0:17:02 | mira | py2 | centos | 7.4 | ceph-disk/basic/{distros/centos_latest tasks/ceph-detect-init} | 1 |
fail | 5153640 | 2020-06-16 13:26:35 | 2020-06-16 13:26:42 | 2020-06-16 13:48:42 | 0:22:00 | 0:04:06 | 0:17:54 | mira | master | ubuntu | 20.04 | upgrade:nautilus-x:stress-split/{0-cluster/{openstack.yaml start.yaml} 1-ceph-install/nautilus.yaml 1.1-pg-log-overrides/short_pg_log.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-workload/{radosbench.yaml rbd-cls.yaml rbd-import-export.yaml rbd_api.yaml readwrite.yaml rgw_ragweed_prepare.yaml snaps-few-objects.yaml} 5-finish-upgrade.yaml 6-octopus.yaml 7-msgr2.yaml 8-final-workload/{rbd-python.yaml snaps-many-objects.yaml} objectstore/filestore-xfs.yaml thrashosds-health.yaml ubuntu_latest.yaml} | 5 |
Failure Reason: Failed to fetch package version from https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=default&distros=ubuntu%2F20.04%2Fx86_64&ref=nautilus