Name | Machine Type | Up | Locked | Locked Since | Locked By | OS Type | OS Version | Arch | Description
---|---|---|---|---|---|---|---|---|---
mira065.front.sepia.ceph.com | mira | True | True | 2022-11-09 15:14:24.292740 | akraitma@aklap | | | x86_64 | ***Sepia-cobbler***

Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes | Failure Reason
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
fail | 6179506 | 2021-06-18 15:30:57 | 2021-06-21 16:02:31 | 2021-06-21 16:44:23 | 0:41:52 | 0:09:34 | 0:32:18 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/centos_latest python_versions/python_2 tasks/ceph-admin-commands} | 4 | Command failed on mira065 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'
dead | 6179503 | 2021-06-18 15:30:54 | 2021-06-21 15:47:57 | 2021-06-21 16:05:16 | 0:17:19 | | | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/ubuntu_latest python_versions/python_3 tasks/ceph-admin-commands} | 4 | Error reimaging machines: reached maximum tries (100) after waiting for 600 seconds
fail | 6179483 | 2021-06-18 15:30:35 | 2021-06-18 16:10:19 | 2021-06-18 16:58:26 | 0:48:07 | 0:09:47 | 0:38:20 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/centos_latest python_versions/python_2 tasks/rbd_import_export} | 4 | Command failed on mira047 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'
fail | 6179478 | 2021-06-18 15:30:30 | 2021-06-18 15:30:31 | 2021-06-18 16:15:30 | 0:44:59 | 0:29:48 | 0:15:11 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/ubuntu_latest python_versions/python_3 tasks/rbd_import_export} | 4 | ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes
dead | 6124784 | 2021-05-20 10:15:16 | 2021-05-20 10:15:18 | 2021-05-20 10:30:22 | 0:15:04 | | | mira | master | rhel | 8.3 | rados:cephadm/smoke/{distro/rhel_8.3_kubic_stable fixed-2 mon_election/connectivity start} | 2 | Error reimaging machines: reached maximum tries (60) after waiting for 900 seconds
dead | 6124779 | 2021-05-20 10:07:50 | 2021-05-20 10:08:27 | 2021-05-20 10:08:28 | 0:00:01 | | | mira | master | centos | 8.2 | rados:cephadm/thrash/{0-distro/centos_8.2_kubic_stable 1-start 2-thrash 3-tasks/rados_api_tests fixed-2 msgr/async-v1only root} | 2 | Error reimaging machines: Could not find an image for centos 8.2
dead | 6124775 | 2021-05-20 10:07:47 | 2021-05-20 10:08:25 | 2021-05-20 10:08:27 | 0:00:02 | | | mira | master | centos | 8.2 | rados:cephadm/dashboard/{0-distro/centos_8.2_kubic_stable task/test_e2e} | 2 | Error reimaging machines: Could not find an image for centos 8.2
fail | 6092034 | 2021-05-03 05:55:42 | 2021-05-04 00:49:56 | 2021-05-04 01:12:23 | 0:22:27 | 0:04:57 | 0:17:30 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/ubuntu_latest python_versions/python_2 tasks/ceph-admin-commands} | 4 | Command failed on mira041 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'
dead | 6092031 | 2021-05-03 05:55:40 | 2021-05-03 23:02:50 | 2021-05-03 23:56:03 | 0:53:13 | | | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/ubuntu_latest python_versions/python_2 tasks/ceph-admin-commands} | 4 | Error reimaging machines: Failed to power on mira053
dead | 6092029 | 2021-05-03 05:55:38 | 2021-05-03 22:57:58 | 2021-05-03 23:15:49 | 0:17:51 | 0:04:14 | 0:13:37 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/ubuntu_latest python_versions/python_2 tasks/rbd_import_export} | 4 | {'Failure object was': {'mira071.front.sepia.ceph.com': {'results': [{'changed': True, 'end': '2021-05-03 23:13:07.134453', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.029082', 'stderr': ' WARNING: Device for PV Ekk7EP-d9oK-VFp1-vS2d-pcQd-tFBw-2hMrJj not found or rejected by a filter.\\n WARNING: Device for PV 2uuAlB-SzWj-CCD9-AICo-wlZR-hINy-VQKTiD not found or rejected by a filter.\\n WARNING: Device for PV WWayiY-WNvc-qZft-ImrM-p0fH-Y0AN-DrFhRz not found or rejected by a filter.\\n WARNING: Device for PV P0m7Y8-nUJc-73Zl-B96x-dELA-d1Ns-BtS4rU not found or rejected by a filter.\\n WARNING: Device for PV juhyW8-nT5d-tei8-TByi-5zZJ-OI2N-eLohkf not found or rejected by a filter.\\n WARNING: Device for PV 1fvAh3-BgeD-sjsL-Y9nc-rjUf-GOXt-wNGjWL not found or rejected by a filter.\\n Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2021-05-03 23:13:07.105371', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' WARNING: Device for PV Ekk7EP-d9oK-VFp1-vS2d-pcQd-tFBw-2hMrJj not found or rejected by a filter.', ' WARNING: Device for PV 2uuAlB-SzWj-CCD9-AICo-wlZR-hINy-VQKTiD not found or rejected by a filter.', ' WARNING: Device for PV WWayiY-WNvc-qZft-ImrM-p0fH-Y0AN-DrFhRz not found or rejected by a filter.', ' WARNING: Device for PV P0m7Y8-nUJc-73Zl-B96x-dELA-d1Ns-BtS4rU not found or rejected by a filter.', ' WARNING: Device for PV juhyW8-nT5d-tei8-TByi-5zZJ-OI2N-eLohkf not found or rejected by a filter.', ' WARNING: Device for PV 1fvAh3-BgeD-sjsL-Y9nc-rjUf-GOXt-wNGjWL not found or rejected by a filter.', ' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2021-05-03 23:13:07.315299', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.028260', 'stderr': ' WARNING: Device for PV Ekk7EP-d9oK-VFp1-vS2d-pcQd-tFBw-2hMrJj not found or rejected by a filter.\\n WARNING: Device for PV 2uuAlB-SzWj-CCD9-AICo-wlZR-hINy-VQKTiD not found or rejected by a filter.\\n WARNING: Device for PV WWayiY-WNvc-qZft-ImrM-p0fH-Y0AN-DrFhRz not found or rejected by a filter.\\n WARNING: Device for PV P0m7Y8-nUJc-73Zl-B96x-dELA-d1Ns-BtS4rU not found or rejected by a filter.\\n WARNING: Device for PV juhyW8-nT5d-tei8-TByi-5zZJ-OI2N-eLohkf not found or rejected by a filter.\\n WARNING: Device for PV 1fvAh3-BgeD-sjsL-Y9nc-rjUf-GOXt-wNGjWL not found or rejected by a filter.\\n Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2021-05-03 23:13:07.287039', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' WARNING: Device for PV Ekk7EP-d9oK-VFp1-vS2d-pcQd-tFBw-2hMrJj not found or rejected by a filter.', ' WARNING: Device for PV 2uuAlB-SzWj-CCD9-AICo-wlZR-hINy-VQKTiD not found or rejected by a filter.', ' WARNING: Device for PV WWayiY-WNvc-qZft-ImrM-p0fH-Y0AN-DrFhRz not found or rejected by a filter.', ' WARNING: Device for PV P0m7Y8-nUJc-73Zl-B96x-dELA-d1Ns-BtS4rU not found or rejected by a filter.', ' WARNING: Device for PV juhyW8-nT5d-tei8-TByi-5zZJ-OI2N-eLohkf not found or rejected by a filter.', ' WARNING: Device for PV 1fvAh3-BgeD-sjsL-Y9nc-rjUf-GOXt-wNGjWL not found or rejected by a filter.', ' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2021-05-03 23:13:07.500098', 'stdout': ' Labels on physical volume "/dev/sdd" successfully wiped.', 'cmd': 'pvremove --force --force --yes /dev/sdd', 'rc': 0, 'start': '2021-05-03 23:13:07.467908', 'stderr': ' WARNING: Device for PV Ekk7EP-d9oK-VFp1-vS2d-pcQd-tFBw-2hMrJj not found or rejected by a filter.\\n WARNING: Device for PV 2uuAlB-SzWj-CCD9-AICo-wlZR-hINy-VQKTiD not found or rejected by a filter.\\n WARNING: Device for PV WWayiY-WNvc-qZft-ImrM-p0fH-Y0AN-DrFhRz not found or rejected by a filter.\\n WARNING: Device for PV P0m7Y8-nUJc-73Zl-B96x-dELA-d1Ns-BtS4rU not found or rejected by a filter.\\n WARNING: Device for PV juhyW8-nT5d-tei8-TByi-5zZJ-OI2N-eLohkf not found or rejected by a filter.\\n WARNING: Device for PV 1fvAh3-BgeD-sjsL-Y9nc-rjUf-GOXt-wNGjWL not found or rejected by a filter.\\n WARNING: PV /dev/sdd is used by VG vg_hdd.\\n WARNING: Wiping physical volume label from /dev/sdd of volume group "vg_hdd".', 'delta': '0:00:00.032190', 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes /dev/sdd', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'stdout_lines': [' Labels on physical volume "/dev/sdd" successfully wiped.'], 'stderr_lines': [' WARNING: Device for PV Ekk7EP-d9oK-VFp1-vS2d-pcQd-tFBw-2hMrJj not found or rejected by a filter.', ' WARNING: Device for PV 2uuAlB-SzWj-CCD9-AICo-wlZR-hINy-VQKTiD not found or rejected by a filter.', ' WARNING: Device for PV WWayiY-WNvc-qZft-ImrM-p0fH-Y0AN-DrFhRz not found or rejected by a filter.', ' WARNING: Device for PV P0m7Y8-nUJc-73Zl-B96x-dELA-d1Ns-BtS4rU not found or rejected by a filter.', ' WARNING: Device for PV juhyW8-nT5d-tei8-TByi-5zZJ-OI2N-eLohkf not found or rejected by a filter.', ' WARNING: Device for PV 1fvAh3-BgeD-sjsL-Y9nc-rjUf-GOXt-wNGjWL not found or rejected by a filter.', ' WARNING: PV /dev/sdd is used by VG vg_hdd.', ' WARNING: Wiping physical volume label from /dev/sdd of volume group "vg_hdd".'], '_ansible_no_log': False, 'failed': False, 'item': '/dev/sdd', 'ansible_loop_var': 'item', '_ansible_item_label': '/dev/sdd'}, {'changed': True, 'end': '2021-05-03 23:13:07.679791', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.026735', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2021-05-03 23:13:07.653056', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2021-05-03 23:13:07.852945', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.024927', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2021-05-03 23:13:07.828018', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2021-05-03 23:13:08.030586', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.026947', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2021-05-03 23:13:08.003639', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2021-05-03 23:13:08.213063', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.026562', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2021-05-03 23:13:08.186501', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}], 'changed': True, 'msg': 'All items completed'}}, 'Traceback (most recent call last)': 'File "/home/teuthworker/src/git.ceph.com_git_ceph-cm-ansible_master/callback_plugins/failure_log.py", line 44, in log_failure log.error(yaml.safe_dump(failure)) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/__init__.py", line 306, in safe_dump return dump_all([data], stream, Dumper=SafeDumper, **kwds) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/__init__.py", line 278, in dump_all dumper.represent(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 27, in represent node = self.represent_data(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 207, in represent_dict return self.represent_mapping(\'tag:yaml.org,2002:map\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 118, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 207, in represent_dict return self.represent_mapping(\'tag:yaml.org,2002:map\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 118, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 199, in represent_list return self.represent_sequence(\'tag:yaml.org,2002:seq\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 92, in represent_sequence node_item = self.represent_data(item) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 207, in represent_dict return self.represent_mapping(\'tag:yaml.org,2002:map\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 118, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 58, in represent_data node = self.yaml_representers[None](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 231, in represent_undefined raise RepresenterError("cannot represent an object", data)', 'yaml.representer.RepresenterError': "('cannot represent an object', '[unknown]')"}
fail | 6092024 | 2021-05-03 05:55:34 | 2021-05-03 06:37:58 | 2021-05-03 07:52:04 | 1:14:06 | 0:09:10 | 1:04:56 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/centos_latest python_versions/python_2 tasks/rbd_import_export} | 4 | Command failed on mira026 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'
fail | 6092023 | 2021-05-03 05:55:34 | 2021-05-03 06:33:07 | 2021-05-03 07:17:38 | 0:44:31 | 0:27:30 | 0:17:01 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/ubuntu_latest python_versions/python_3 tasks/ceph-admin-commands} | 4 | ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes
fail | 6092021 | 2021-05-03 05:55:32 | 2021-05-03 05:56:12 | 2021-05-03 06:37:54 | 0:41:42 | 0:28:19 | 0:13:23 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/ubuntu_latest python_versions/python_3 tasks/rbd_import_export} | 4 | ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes
fail | 6074872 | 2021-04-26 05:55:52 | 2021-04-26 10:02:17 | 2021-04-26 10:53:04 | 0:50:47 | 0:27:56 | 0:22:51 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/ubuntu_latest python_versions/python_3 tasks/rbd_import_export} | 4 | ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes
fail | 6074867 | 2021-04-26 05:55:48 | 2021-04-26 09:03:00 | 2021-04-26 10:12:23 | 1:09:23 | 0:37:36 | 0:31:47 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/centos_latest python_versions/python_3 tasks/ceph-admin-commands} | 4 | ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes
fail | 6074861 | 2021-04-26 05:55:43 | 2021-04-26 08:09:13 | 2021-04-26 09:17:52 | 1:08:39 | 0:36:47 | 0:31:52 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/centos_latest python_versions/python_3 tasks/rbd_import_export} | 4 | ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes
fail | 6074859 | 2021-04-26 05:55:42 | 2021-04-26 07:55:53 | 2021-04-26 08:22:47 | 0:26:54 | 0:05:09 | 0:21:45 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/ubuntu_latest python_versions/python_2 tasks/ceph-admin-commands} | 4 | Command failed on mira041 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'
fail | 6074852 | 2021-04-26 05:55:36 | 2021-04-26 06:37:54 | 2021-04-26 08:05:11 | 1:27:17 | 0:36:57 | 0:50:20 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/centos_latest python_versions/python_3 tasks/ceph-admin-commands} | 4 | ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes
fail | 6074850 | 2021-04-26 05:55:34 | 2021-04-26 06:33:30 | 2021-04-26 07:10:01 | 0:36:31 | 0:09:24 | 0:27:07 | mira | master | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/centos_latest python_versions/python_2 tasks/rbd_import_export} | 4 | Command failed on mira041 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'
fail | 6074845 | 2021-04-26 05:55:29 | 2021-04-26 05:55:36 | 2021-04-26 06:35:25 | 0:39:49 | 0:27:46 | 0:12:03 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/ubuntu_latest python_versions/python_3 tasks/rbd_import_export} | 4 | ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes
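
A few of the recurring failure reasons above can be made more concrete with short sketches. The `Command failed on miraNNN with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'` entries mean the ceph-deploy bootstrap step exited with a non-zero status on the remote node. The sketch below is illustrative only, not teuthology's actual remote-execution code; the `run_remote` helper and its use of plain `ssh` are assumptions.

```python
import subprocess


def run_remote(host, command):
    """Hypothetical helper: run `command` on `host` over ssh and raise if it
    exits non-zero, mirroring the "Command failed ... with status N" messages
    in the table above. Not teuthology's actual implementation."""
    result = subprocess.run(["ssh", host, command], capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(
            f"Command failed on {host} with status {result.returncode}: {command!r}"
        )
    return result.stdout


# The step that failed in jobs 6179506, 6179483, 6092034, 6074859 and 6074850:
# run_remote("mira065", "cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2")
```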
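
The `Error reimaging machines: reached maximum tries (N) after waiting for M seconds` entries come from a bounded polling loop: after a node is reimaged, the framework repeatedly checks whether it is reachable again and gives up after a fixed number of attempts. A minimal sketch of the pattern follows; the check interval and the `is_up` callback are assumptions for illustration, not the real teuthology/FOG reimaging code.

```python
import time


def wait_for_reimage(is_up, max_tries=100, interval=6.0):
    """Poll `is_up()` until it returns True or `max_tries` attempts have been
    made. With interval=6.0 and max_tries=100 this gives up after roughly
    600 seconds, matching the message for job 6179503."""
    waited = 0.0
    for _ in range(max_tries):
        if is_up():
            return
        time.sleep(interval)
        waited += interval
    raise RuntimeError(
        f"Error reimaging machines: reached maximum tries ({max_tries}) "
        f"after waiting for {int(waited)} seconds"
    )
```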
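
`ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes` is the same pattern applied to cluster health: after deployment the task polls `ceph health` until it reports `HEALTH_OK` or the 15-minute budget runs out. A rough sketch, assuming the check simply shells out to the `ceph` CLI at a fixed interval (the real teuthology task differs in detail):

```python
import subprocess
import time


def wait_for_health_ok(timeout=15 * 60, interval=10):
    """Poll `ceph health` until it reports HEALTH_OK or `timeout` seconds pass."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        out = subprocess.run(["ceph", "health"], capture_output=True, text=True).stdout
        if "HEALTH_OK" in out:
            return
        time.sleep(interval)
    raise RuntimeError(
        "ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes"
    )
```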
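
The long failure reason for job 6092029 ends in `yaml.representer.RepresenterError: ('cannot represent an object', '[unknown]')`. Its traceback shows the ceph-cm-ansible `failure_log.py` callback calling `yaml.safe_dump()` on the Ansible failure dict; `safe_dump` only serializes types it knows exactly, and Ansible wraps strings in `str` subclasses (such as `AnsibleUnsafeText`) that its SafeRepresenter rejects. A minimal repro, assuming the offending `'[unknown]'` value is such a subclass:

```python
import yaml
from yaml.representer import RepresenterError


class UnsafeText(str):
    """Stand-in for Ansible's AnsibleUnsafeText: a str subclass that
    yaml.safe_dump's SafeRepresenter refuses to serialize."""


failure = {"item": UnsafeText("[unknown]")}

try:
    yaml.safe_dump(failure)
except RepresenterError as exc:
    # Prints: ('cannot represent an object', '[unknown]') -- the same error
    # recorded at the end of the failure reason for job 6092029.
    print(exc)
```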