| Name | Machine Type | Up | Locked | Locked Since | Locked By | OS Type | OS Version | Arch | Description |
|---|---|---|---|---|---|---|---|---|---|
| mira077.front.sepia.ceph.com | mira | True | True | 2024-03-06 16:33:19.039673 | smanjara@teuthology | | | x86_64 | None |

| Status | Job ID | Links | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| fail | 6179508 | | 2021-06-18 15:30:59 | 2021-06-21 16:44:26 | 2021-06-21 17:19:06 | 0:34:40 | 0:09:45 | 0:24:55 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/centos_latest python_versions/python_2 tasks/ceph-admin-commands} | 4 |
Failure Reason: Command failed on mira047 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'

| fail | 6179505 | | 2021-06-18 15:30:56 | 2021-06-21 15:47:58 | 2021-06-21 16:44:29 | 0:56:31 | 0:28:11 | 0:28:20 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/ubuntu_latest python_versions/python_3 tasks/rbd_import_export} | 4 |
Failure Reason: ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes

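Failures like "ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes" come from a poll-with-deadline loop around `ceph health`: the harness re-checks cluster health at an interval until it reports HEALTH_OK or the timeout expires. A minimal sketch of that pattern, with injectable clock and sleep for testing (`wait_for_health_ok` and its parameters are illustrative names, not the actual teuthology API):

```python
import time

def wait_for_health_ok(get_health, timeout=15 * 60, interval=6,
                       clock=time.monotonic, sleep=time.sleep):
    """Poll get_health() until it reports HEALTH_OK or the deadline passes.

    get_health is a caller-supplied function (e.g. one that shells out to
    `ceph health` and returns its first token). Returns True on success,
    False once `timeout` seconds have elapsed without HEALTH_OK.
    """
    deadline = clock() + timeout
    while clock() < deadline:
        if get_health().startswith('HEALTH_OK'):
            return True
        sleep(interval)
    return False
```

A cluster stuck in HEALTH_WARN (e.g. undersized PGs after a failed ceph-deploy) simply exhausts the deadline, which is exactly the message recorded in these jobs.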
| dead | 6179502 | | 2021-06-18 15:30:53 | 2021-06-19 17:51:06 | 2021-06-21 16:02:15 | 1 day, 22:11:09 | | | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/centos_latest python_versions/python_2 tasks/rbd_import_export} | 4 |
Failure Reason: Error reimaging machines: reached maximum tries (100) after waiting for 600 seconds

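"reached maximum tries (100) after waiting for 600 seconds" is the signature of a bounded retry loop: roughly 100 attempts at 6-second intervals while waiting for a reimaged node to come back. A sketch of that pattern under those assumptions (`retry_until` is an illustrative name, not teuthology's actual helper):

```python
import time

def retry_until(check, max_tries=100, delay=6.0, sleep=time.sleep):
    """Call check() up to max_tries times, sleeping `delay` seconds between
    attempts. Returns the attempt number on success; raises RuntimeError
    with a teuthology-style message if check() never returns True."""
    for attempt in range(1, max_tries + 1):
        if check():
            return attempt
        if attempt < max_tries:
            sleep(delay)
    raise RuntimeError(
        'reached maximum tries (%d) after waiting for %d seconds'
        % (max_tries, int(max_tries * delay)))
```

A node that never answers (as in the "Failed to power on mira053" jobs below) exhausts every attempt and the run is marked dead rather than fail.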
| dead | 6179492 | | 2021-06-18 15:30:44 | 2021-06-19 14:48:12 | 2021-06-19 15:54:08 | 1:05:56 | | | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/ubuntu_latest python_versions/python_2 tasks/ceph-admin-commands} | 4 |
Failure Reason: Error reimaging machines: Failed to power on mira053

| dead | 6179482 | | 2021-06-18 15:30:34 | 2021-06-18 15:59:07 | 2021-06-18 16:14:16 | 0:15:09 | | | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/ubuntu_latest python_versions/python_3 tasks/ceph-admin-commands} | 4 |
Failure Reason: Error reimaging machines: Failed to power on mira053

| fail | 6179477 | | 2021-06-18 15:30:29 | 2021-06-18 15:30:30 | 2021-06-18 16:10:11 | 0:39:41 | 0:09:50 | 0:29:51 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/centos_latest python_versions/python_2 tasks/ceph-admin-commands} | 4 |
Failure Reason: Command failed on mira066 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'

| dead | 6124785 | | 2021-05-20 10:15:17 | 2021-05-20 10:15:18 | 2021-05-20 10:30:22 | 0:15:04 | | | mira | master | rhel | 8.3 | rados:cephadm/smoke/{distro/rhel_8.3_kubic_stable fixed-2 mon_election/classic start} | 2 |
Failure Reason: Error reimaging machines: reached maximum tries (60) after waiting for 900 seconds

| dead | 6124773 | | 2021-05-20 10:07:45 | 2021-05-20 10:08:25 | 2021-05-20 10:08:26 | 0:00:01 | | | mira | master | centos | 8.2 | rados:cephadm/smoke-roleless/{0-distro/centos_8.2_kubic_stable 1-start 2-services/iscsi 3-final} | 2 |
Failure Reason: Error reimaging machines: Could not find an image for centos 8.2

| dead | 6092029 | | 2021-05-03 05:55:38 | 2021-05-03 22:57:58 | 2021-05-03 23:15:49 | 0:17:51 | 0:04:14 | 0:13:37 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/ubuntu_latest python_versions/python_2 tasks/rbd_import_export} | 4 |
Failure Reason: While cleaning disks on mira071.front.sepia.ceph.com, the ceph-cm-ansible play ran `pvremove --force --force --yes [unknown]` for six stale LVM physical volumes; each attempt failed with rc 5 ('Device [unknown] not found'; the first attempts also logged 'WARNING: Device for PV … not found or rejected by a filter' for six PV UUIDs). Only `pvremove --force --force --yes /dev/sdd` succeeded, wiping the PV label from volume group vg_hdd. Teuthology's failure_log.py callback (log_failure, line 44) then crashed while serializing the failure object: yaml.safe_dump() raised yaml.representer.RepresenterError: ('cannot represent an object', '[unknown]').

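The RepresenterError in that job is a logging bug, not a disk bug: `yaml.safe_dump()` only represents exact built-in types, and Ansible returns task output wrapped in str subclasses (AnsibleUnsafeText-style), which PyYAML's SafeRepresenter refuses with exactly "cannot represent an object". A minimal sketch of the usual workaround, coercing nested values to plain built-ins before dumping (`to_plain` and `UnsafeText` are illustrative names, not teuthology or Ansible code):

```python
def to_plain(obj):
    """Recursively coerce str/int/float subclasses to plain built-ins so a
    safe YAML dumper will accept every leaf of the structure."""
    if isinstance(obj, dict):
        return {to_plain(k): to_plain(v) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)):
        return [to_plain(v) for v in obj]
    if isinstance(obj, bool):   # check bool before int: bool subclasses int
        return bool(obj)
    if isinstance(obj, int):
        return int(obj)
    if isinstance(obj, float):
        return float(obj)
    if isinstance(obj, str):
        return str(obj)
    return obj

# Stand-in for Ansible's tagged string type (a plain str subclass).
class UnsafeText(str):
    pass

failure = {'item': UnsafeText('[unknown]'), 'rc': 5,
           'stderr_lines': [UnsafeText(' Device [unknown] not found.')]}
clean = to_plain(failure)  # every leaf is now a plain str/int
```

Passing `clean` rather than `failure` to `yaml.safe_dump()` would let the callback log the real error instead of masking it with a RepresenterError traceback.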
| fail | 6092020 | | 2021-05-03 05:55:31 | 2021-05-03 05:56:12 | 2021-05-03 06:33:02 | 0:36:50 | 0:09:01 | 0:27:49 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/centos_latest python_versions/python_2 tasks/ceph-admin-commands} | 4 |
Failure Reason: Command failed on mira064 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'

| fail | 6074873 | | 2021-04-26 05:55:53 | 2021-04-26 10:12:25 | 2021-04-26 10:57:15 | 0:44:50 | 0:09:00 | 0:35:50 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/centos_latest python_versions/python_2 tasks/ceph-admin-commands} | 4 |
Failure Reason: Command failed on mira064 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'

| fail | 6074869 | | 2021-04-26 05:55:50 | 2021-04-26 09:21:25 | 2021-04-26 10:22:23 | 1:00:58 | 0:09:09 | 0:51:49 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/centos_latest python_versions/python_2 tasks/rbd_import_export} | 4 |
Failure Reason: Command failed on mira064 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'

| fail | 6074863 | | 2021-04-26 05:55:45 | 2021-04-26 08:27:04 | 2021-04-26 09:46:52 | 1:19:48 | 0:36:53 | 0:42:55 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/centos_latest python_versions/python_3 tasks/rbd_import_export} | 4 |
Failure Reason: ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes

| fail | 6074856 | | 2021-04-26 05:55:39 | 2021-04-26 07:19:53 | 2021-04-26 08:38:26 | 1:18:33 | 0:36:40 | 0:41:53 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/centos_latest python_versions/python_3 tasks/rbd_import_export} | 4 |
Failure Reason: ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes

| fail | 6074853 | | 2021-04-26 05:55:36 | 2021-04-26 07:10:06 | 2021-04-26 07:28:49 | 0:18:43 | 0:05:13 | 0:13:30 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/ubuntu_latest python_versions/python_2 tasks/rbd_import_export} | 4 |
Failure Reason: Command failed on mira064 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'

| fail | 6074848 | | 2021-04-26 05:55:32 | 2021-04-26 05:55:37 | 2021-04-26 07:10:21 | 1:14:44 | 0:09:13 | 1:05:31 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/centos_latest python_versions/python_2 tasks/rbd_import_export} | 4 |
Failure Reason: Command failed on mira064 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'

| fail | 6074844 | | 2021-04-26 05:55:29 | 2021-04-26 05:55:35 | 2021-04-26 06:33:25 | 0:37:50 | 0:09:01 | 0:28:49 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/centos_latest python_versions/python_2 tasks/ceph-admin-commands} | 4 |
Failure Reason: Command failed on mira064 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'

| fail | 6058638 | | 2021-04-19 05:56:06 | 2021-04-19 10:33:37 | 2021-04-19 11:24:26 | 0:50:49 | 0:13:06 | 0:37:43 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/centos_latest python_versions/python_2 tasks/ceph-admin-commands} | 4 |
Failure Reason: Command failed on mira064 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'

| fail | 6058632 | | 2021-04-19 05:56:01 | 2021-04-19 09:20:02 | 2021-04-19 10:34:24 | 1:14:22 | 0:13:39 | 1:00:43 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/centos_latest python_versions/python_2 tasks/rbd_import_export} | 4 |
Failure Reason: Command failed on mira064 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'

| fail | 6058626 | | 2021-04-19 05:55:55 | 2021-04-19 08:29:52 | 2021-04-19 09:48:54 | 1:19:02 | 0:37:11 | 0:41:51 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/centos_latest python_versions/python_3 tasks/rbd_import_export} | 4 |
Failure Reason: ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes