Status Job ID Links Posted Started Updated Runtime Duration In Waiting Machine Teuthology Branch OS Type OS Version Description Nodes
fail 6092018 2021-05-03 05:55:29 2021-05-03 05:56:11 2021-05-03 06:30:43 0:34:32 0:09:14 0:25:18 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/centos_latest python_versions/python_2 tasks/ceph-admin-commands} 4
Failure Reason:

Command failed on mira026 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'

fail 6092019 2021-05-03 05:55:30 2021-05-03 05:56:11 2021-05-03 06:35:55 0:39:44 0:27:49 0:11:55 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/ubuntu_latest python_versions/python_3 tasks/rbd_import_export} 4
Failure Reason:

ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes
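Several jobs in this run fail with this same symptom: the cluster never reaches HEALTH_OK within the 15-minute wait. The wait behind that message is essentially a poll-with-deadline loop; a minimal sketch (the real teuthology helper differs in detail, and the command invocation here is an assumption):

```python
import subprocess
import time

def wait_for_health_ok(timeout=15 * 60, interval=6):
    """Poll `ceph health` until it reports HEALTH_OK or the deadline passes.

    Hypothetical stand-in for the teuthology wait that produced the
    "unable to get 'HEALTH_OK' after waiting 15 minutes" failures above.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        out = subprocess.run(["ceph", "health"],
                             capture_output=True, text=True).stdout
        if "HEALTH_OK" in out:
            return True
        time.sleep(interval)
    raise RuntimeError(
        "ceph health was unable to get 'HEALTH_OK' after waiting "
        f"{timeout // 60} minutes")
```

Note that a cluster stuck in HEALTH_WARN (e.g. undersized PGs on a 4-node deploy) burns the whole timeout before the job is marked failed, which is consistent with the ~15-minute duration gap visible in these rows.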

fail 6092020 2021-05-03 05:55:31 2021-05-03 05:56:12 2021-05-03 06:33:02 0:36:50 0:09:01 0:27:49 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/centos_latest python_versions/python_2 tasks/ceph-admin-commands} 4
Failure Reason:

Command failed on mira064 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'

fail 6092021 2021-05-03 05:55:32 2021-05-03 05:56:12 2021-05-03 06:37:54 0:41:42 0:28:19 0:13:23 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/ubuntu_latest python_versions/python_3 tasks/rbd_import_export} 4
Failure Reason:

ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes

fail 6092022 2021-05-03 05:55:33 2021-05-03 05:56:12 2021-05-03 07:04:47 1:08:35 0:09:21 0:59:14 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/centos_latest python_versions/python_2 tasks/rbd_import_export} 4
Failure Reason:

Command failed on mira026 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'

fail 6092023 2021-05-03 05:55:34 2021-05-03 06:33:07 2021-05-03 07:17:38 0:44:31 0:27:30 0:17:01 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/ubuntu_latest python_versions/python_3 tasks/ceph-admin-commands} 4
Failure Reason:

ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes

fail 6092024 2021-05-03 05:55:34 2021-05-03 06:37:58 2021-05-03 07:52:04 1:14:06 0:09:10 1:04:56 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/centos_latest python_versions/python_2 tasks/rbd_import_export} 4
Failure Reason:

Command failed on mira026 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'

fail 6092025 2021-05-03 05:55:35 2021-05-03 07:17:42 2021-05-03 17:50:56 10:33:14 0:30:12 10:03:02 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/ubuntu_latest python_versions/python_3 tasks/ceph-admin-commands} 4
Failure Reason:

ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes

dead 6092026 2021-05-03 05:55:36 2021-05-03 17:07:00 2021-05-03 17:07:00 0:00:00 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/centos_latest python_versions/python_3 tasks/ceph-admin-commands}
dead 6092027 2021-05-03 05:55:37 2021-05-03 22:57:57 2021-05-03 23:02:28 0:04:31 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/ubuntu_latest python_versions/python_2 tasks/rbd_import_export} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

fail 6092028 2021-05-03 05:55:38 2021-05-03 22:57:57 2021-05-03 23:51:25 0:53:28 0:36:10 0:17:18 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/centos_latest python_versions/python_3 tasks/ceph-admin-commands} 4
Failure Reason:

ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes

dead 6092029 2021-05-03 05:55:38 2021-05-03 22:57:58 2021-05-03 23:15:49 0:17:51 0:04:14 0:13:37 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/ubuntu_latest python_versions/python_2 tasks/rbd_import_export} 4
Failure Reason:

{'Failure object was': {'mira071.front.sepia.ceph.com': {'results': [{'changed': True, 'end': '2021-05-03 23:13:07.134453', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.029082', 'stderr': ' WARNING: Device for PV Ekk7EP-d9oK-VFp1-vS2d-pcQd-tFBw-2hMrJj not found or rejected by a filter.\\n WARNING: Device for PV 2uuAlB-SzWj-CCD9-AICo-wlZR-hINy-VQKTiD not found or rejected by a filter.\\n WARNING: Device for PV WWayiY-WNvc-qZft-ImrM-p0fH-Y0AN-DrFhRz not found or rejected by a filter.\\n WARNING: Device for PV P0m7Y8-nUJc-73Zl-B96x-dELA-d1Ns-BtS4rU not found or rejected by a filter.\\n WARNING: Device for PV juhyW8-nT5d-tei8-TByi-5zZJ-OI2N-eLohkf not found or rejected by a filter.\\n WARNING: Device for PV 1fvAh3-BgeD-sjsL-Y9nc-rjUf-GOXt-wNGjWL not found or rejected by a filter.\\n Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2021-05-03 23:13:07.105371', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' WARNING: Device for PV Ekk7EP-d9oK-VFp1-vS2d-pcQd-tFBw-2hMrJj not found or rejected by a filter.', ' WARNING: Device for PV 2uuAlB-SzWj-CCD9-AICo-wlZR-hINy-VQKTiD not found or rejected by a filter.', ' WARNING: Device for PV WWayiY-WNvc-qZft-ImrM-p0fH-Y0AN-DrFhRz not found or rejected by a filter.', ' WARNING: Device for PV P0m7Y8-nUJc-73Zl-B96x-dELA-d1Ns-BtS4rU not found or rejected by a filter.', ' WARNING: Device for PV juhyW8-nT5d-tei8-TByi-5zZJ-OI2N-eLohkf not found or rejected by a filter.', ' WARNING: Device for PV 1fvAh3-BgeD-sjsL-Y9nc-rjUf-GOXt-wNGjWL not found or rejected by a filter.', ' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', 
'_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2021-05-03 23:13:07.315299', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.028260', 'stderr': ' WARNING: Device for PV Ekk7EP-d9oK-VFp1-vS2d-pcQd-tFBw-2hMrJj not found or rejected by a filter.\\n WARNING: Device for PV 2uuAlB-SzWj-CCD9-AICo-wlZR-hINy-VQKTiD not found or rejected by a filter.\\n WARNING: Device for PV WWayiY-WNvc-qZft-ImrM-p0fH-Y0AN-DrFhRz not found or rejected by a filter.\\n WARNING: Device for PV P0m7Y8-nUJc-73Zl-B96x-dELA-d1Ns-BtS4rU not found or rejected by a filter.\\n WARNING: Device for PV juhyW8-nT5d-tei8-TByi-5zZJ-OI2N-eLohkf not found or rejected by a filter.\\n WARNING: Device for PV 1fvAh3-BgeD-sjsL-Y9nc-rjUf-GOXt-wNGjWL not found or rejected by a filter.\\n Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2021-05-03 23:13:07.287039', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' WARNING: Device for PV Ekk7EP-d9oK-VFp1-vS2d-pcQd-tFBw-2hMrJj not found or rejected by a filter.', ' WARNING: Device for PV 2uuAlB-SzWj-CCD9-AICo-wlZR-hINy-VQKTiD not found or rejected by a filter.', ' WARNING: Device for PV WWayiY-WNvc-qZft-ImrM-p0fH-Y0AN-DrFhRz not found or rejected by a filter.', ' WARNING: Device for PV P0m7Y8-nUJc-73Zl-B96x-dELA-d1Ns-BtS4rU not found or rejected by a filter.', ' WARNING: Device for PV juhyW8-nT5d-tei8-TByi-5zZJ-OI2N-eLohkf not found or rejected by a filter.', ' WARNING: Device for PV 1fvAh3-BgeD-sjsL-Y9nc-rjUf-GOXt-wNGjWL not found or rejected by a filter.', ' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, 
{'changed': True, 'end': '2021-05-03 23:13:07.500098', 'stdout': ' Labels on physical volume "/dev/sdd" successfully wiped.', 'cmd': 'pvremove --force --force --yes /dev/sdd', 'rc': 0, 'start': '2021-05-03 23:13:07.467908', 'stderr': ' WARNING: Device for PV Ekk7EP-d9oK-VFp1-vS2d-pcQd-tFBw-2hMrJj not found or rejected by a filter.\\n WARNING: Device for PV 2uuAlB-SzWj-CCD9-AICo-wlZR-hINy-VQKTiD not found or rejected by a filter.\\n WARNING: Device for PV WWayiY-WNvc-qZft-ImrM-p0fH-Y0AN-DrFhRz not found or rejected by a filter.\\n WARNING: Device for PV P0m7Y8-nUJc-73Zl-B96x-dELA-d1Ns-BtS4rU not found or rejected by a filter.\\n WARNING: Device for PV juhyW8-nT5d-tei8-TByi-5zZJ-OI2N-eLohkf not found or rejected by a filter.\\n WARNING: Device for PV 1fvAh3-BgeD-sjsL-Y9nc-rjUf-GOXt-wNGjWL not found or rejected by a filter.\\n WARNING: PV /dev/sdd is used by VG vg_hdd.\\n WARNING: Wiping physical volume label from /dev/sdd of volume group "vg_hdd".', 'delta': '0:00:00.032190', 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes /dev/sdd', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'stdout_lines': [' Labels on physical volume "/dev/sdd" successfully wiped.'], 'stderr_lines': [' WARNING: Device for PV Ekk7EP-d9oK-VFp1-vS2d-pcQd-tFBw-2hMrJj not found or rejected by a filter.', ' WARNING: Device for PV 2uuAlB-SzWj-CCD9-AICo-wlZR-hINy-VQKTiD not found or rejected by a filter.', ' WARNING: Device for PV WWayiY-WNvc-qZft-ImrM-p0fH-Y0AN-DrFhRz not found or rejected by a filter.', ' WARNING: Device for PV P0m7Y8-nUJc-73Zl-B96x-dELA-d1Ns-BtS4rU not found or rejected by a filter.', ' WARNING: Device for PV juhyW8-nT5d-tei8-TByi-5zZJ-OI2N-eLohkf not found or rejected by a filter.', ' WARNING: Device for PV 1fvAh3-BgeD-sjsL-Y9nc-rjUf-GOXt-wNGjWL not found or rejected by a filter.', ' WARNING: PV 
/dev/sdd is used by VG vg_hdd.', ' WARNING: Wiping physical volume label from /dev/sdd of volume group "vg_hdd".'], '_ansible_no_log': False, 'failed': False, 'item': '/dev/sdd', 'ansible_loop_var': 'item', '_ansible_item_label': '/dev/sdd'}, {'changed': True, 'end': '2021-05-03 23:13:07.679791', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.026735', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2021-05-03 23:13:07.653056', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2021-05-03 23:13:07.852945', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.024927', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2021-05-03 23:13:07.828018', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2021-05-03 23:13:08.030586', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.026947', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': 
{'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2021-05-03 23:13:08.003639', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2021-05-03 23:13:08.213063', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.026562', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2021-05-03 23:13:08.186501', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}], 'changed': True, 'msg': 'All items completed'}}, 'Traceback (most recent call last)': 'File "/home/teuthworker/src/git.ceph.com_git_ceph-cm-ansible_master/callback_plugins/failure_log.py", line 44, in log_failure log.error(yaml.safe_dump(failure)) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/__init__.py", line 306, in safe_dump return dump_all([data], stream, Dumper=SafeDumper, **kwds) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/__init__.py", line 278, in dump_all dumper.represent(data) File 
"/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 27, in represent node = self.represent_data(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 207, in represent_dict return self.represent_mapping(\'tag:yaml.org,2002:map\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 118, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 207, in represent_dict return self.represent_mapping(\'tag:yaml.org,2002:map\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 118, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File 
"/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 199, in represent_list return self.represent_sequence(\'tag:yaml.org,2002:seq\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 92, in represent_sequence node_item = self.represent_data(item) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 207, in represent_dict return self.represent_mapping(\'tag:yaml.org,2002:map\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 118, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 58, in represent_data node = self.yaml_representers[None](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_19220a3bd6e252c6e8260827019668a766d85490/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 231, in represent_undefined raise RepresenterError("cannot represent an object", data)', 'yaml.representer.RepresenterError': "('cannot represent an object', '[unknown]')"}

fail 6092030 2021-05-03 05:55:39 2021-05-03 22:57:58 2021-05-03 23:56:24 0:58:26 0:36:14 0:22:12 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/centos_latest python_versions/python_3 tasks/rbd_import_export} 4
Failure Reason:

ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes

dead 6092031 2021-05-03 05:55:40 2021-05-03 23:02:50 2021-05-03 23:56:03 0:53:13 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/ubuntu_latest python_versions/python_2 tasks/ceph-admin-commands} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

fail 6092032 2021-05-03 05:55:41 2021-05-03 23:51:35 2021-05-04 00:49:52 0:58:17 0:36:44 0:21:33 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/centos_latest python_versions/python_3 tasks/rbd_import_export} 4
Failure Reason:

ceph health was unable to get 'HEALTH_OK' after waiting 15 minutes

dead 6092033 2021-05-03 05:55:42 2021-05-03 23:56:27 2021-05-04 00:54:23 0:57:56 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/ubuntu_latest python_versions/python_2 tasks/ceph-admin-commands} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

fail 6092034 2021-05-03 05:55:42 2021-05-04 00:49:56 2021-05-04 01:12:23 0:22:27 0:04:57 0:17:30 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/ubuntu_latest python_versions/python_2 tasks/ceph-admin-commands} 4
Failure Reason:

Command failed on mira041 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap 2'

dead 6092035 2021-05-03 05:55:43 2021-05-04 00:54:36 2021-05-04 01:16:57 0:22:21 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/centos_latest python_versions/python_3 tasks/rbd_import_export} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

dead 6092036 2021-05-03 05:55:44 2021-05-04 01:12:30 2021-05-04 01:23:38 0:11:08 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/ubuntu_latest python_versions/python_2 tasks/ceph-admin-commands} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

dead 6092037 2021-05-03 05:55:45 2021-05-04 01:19:11 2021-05-04 01:30:19 0:11:08 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/centos_latest python_versions/python_3 tasks/rbd_import_export} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

dead 6092038 2021-05-03 05:55:45 2021-05-04 01:25:52 2021-05-04 01:37:00 0:11:08 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/ubuntu_latest python_versions/python_2 tasks/rbd_import_export} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

dead 6092039 2021-05-03 05:55:46 2021-05-04 01:32:33 2021-05-04 01:43:41 0:11:08 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/centos_latest python_versions/python_3 tasks/ceph-admin-commands} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

dead 6092040 2021-05-03 05:55:47 2021-05-04 01:39:14 2021-05-04 01:50:22 0:11:08 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/ubuntu_latest python_versions/python_2 tasks/rbd_import_export} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

dead 6092041 2021-05-03 05:55:48 2021-05-04 01:45:55 2021-05-04 01:57:13 0:11:18 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/centos_latest python_versions/python_3 tasks/ceph-admin-commands} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

dead 6092042 2021-05-03 05:55:48 2021-05-04 01:52:46 2021-05-04 02:03:54 0:11:08 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/ubuntu_latest python_versions/python_3 tasks/ceph-admin-commands} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

dead 6092043 2021-05-03 05:55:49 2021-05-04 01:59:27 2021-05-04 02:10:35 0:11:08 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/centos_latest python_versions/python_2 tasks/rbd_import_export} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

dead 6092044 2021-05-03 05:55:50 2021-05-04 02:06:08 2021-05-04 02:17:17 0:11:09 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/ubuntu_latest python_versions/python_3 tasks/ceph-admin-commands} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

dead 6092045 2021-05-03 05:55:51 2021-05-04 02:12:49 2021-05-04 02:23:57 0:11:08 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/centos_latest python_versions/python_2 tasks/rbd_import_export} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

dead 6092046 2021-05-03 05:55:52 2021-05-04 02:19:30 2021-05-04 02:30:39 0:11:09 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/ubuntu_latest python_versions/python_3 tasks/rbd_import_export} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

dead 6092047 2021-05-03 05:55:52 2021-05-04 02:26:11 2021-05-04 02:37:20 0:11:09 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/centos_latest python_versions/python_2 tasks/ceph-admin-commands} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

dead 6092048 2021-05-03 05:55:53 2021-05-04 02:32:52 2021-05-04 11:21:03 8:48:11 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/ubuntu_latest python_versions/python_3 tasks/rbd_import_export} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053

dead 6092049 2021-05-03 05:55:54 2021-05-04 11:16:35 2021-05-04 11:27:43 0:11:08 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_filestore distros/centos_latest python_versions/python_2 tasks/ceph-admin-commands} 4
Failure Reason:

Error reimaging machines: Failed to power on mira053