User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail |
---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-05-06 03:59:02 | 2019-05-06 07:28:16 | 2019-05-06 10:43:51 | 3:15:35 | ceph-deploy | master | mira | ba97031 | 32 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3931544 | 2019-05-06 03:59:24 | 2019-05-06 03:59:31 | 2019-05-06 05:07:31 | 1:08:00 | 0:20:45 | 0:47:15 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason:
Command failed on mira038 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health' |
fail | 3931545 | 2019-05-06 03:59:25 | 2019-05-06 03:59:32 | 2019-05-06 04:41:31 | 0:41:59 | 0:04:35 | 0:37:24 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
Command failed on mira010 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target' |
fail | 3931546 | 2019-05-06 03:59:26 | 2019-05-06 03:59:31 | 2019-05-06 04:43:30 | 0:43:59 | 0:20:48 | 0:23:11 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason:
Command failed on mira034 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health' |
fail | 3931547 | 2019-05-06 03:59:26 | 2019-05-06 03:59:32 | 2019-05-06 06:27:33 | 2:28:01 | 0:05:09 | 2:22:52 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
Command failed on mira065 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target' |
fail | 3931548 | 2019-05-06 03:59:27 | 2019-05-06 03:59:31 | 2019-05-06 04:31:30 | 0:31:59 | 0:08:27 | 0:23:32 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
(Ansible failure output, truncated at the start.) The `wipefs --force --all /dev/<partition> || wipefs --all /dev/<partition>` tasks against the mpatha partitions (mpatha5, mpatha1, mpatha2, and mpatha5 again) each exited with rc 1 and stderr "wipefs: error: /dev/<partition>: probing initialization failed: No such file or directory". While dumping these per-item results, the failure_log callback plugin itself crashed:
Traceback (most recent call last):
  File "/home/teuthworker/src/git.ceph.com_git_ceph-cm-ansible_master/callback_plugins/failure_log.py", line 44, in log_failure
    log.error(yaml.safe_dump(failure))
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 218, in safe_dump
    return dump_all([data], stream, Dumper=SafeDumper, **kwds)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 190, in dump_all
    dumper.represent(data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 28, in represent
    node = self.represent_data(data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 57, in represent_data
    node = self.yaml_representers[data_types[0]](self, data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 225, in represent_dict
    return self.represent_mapping(u'tag:yaml.org,2002:map', data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 123, in represent_mapping
    node_value = self.represent_data(item_value)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 57, in represent_data
    node = self.yaml_representers[data_types[0]](self, data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 225, in represent_dict
    return self.represent_mapping(u'tag:yaml.org,2002:map', data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 123, in represent_mapping
    node_value = self.represent_data(item_value)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 57, in represent_data
    node = self.yaml_representers[data_types[0]](self, data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 217, in represent_list
    return self.represent_sequence(u'tag:yaml.org,2002:seq', data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 101, in represent_sequence
    node_item = self.represent_data(item)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 57, in represent_data
    node = self.yaml_representers[data_types[0]](self, data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 225, in represent_dict
    return self.represent_mapping(u'tag:yaml.org,2002:map', data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 123, in represent_mapping
    node_value = self.represent_data(item_value)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 67, in represent_data
    node = self.yaml_representers[None](self, data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 249, in represent_undefined
    raise RepresenterError("cannot represent an object: %s" % data)
RepresenterError: cannot represent an object: mpatha1 |
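Note: the RepresenterError above is PyYAML's SafeDumper refusing to serialize a value it has no representer for; presumably the failing 'item' value ('mpatha1') is not a plain str/unicode but some Ansible string subclass, which is an assumption here. A minimal sketch of that behavior, using a hypothetical UnsafeText stand-in:

```python
# Minimal sketch of the RepresenterError seen above. Assumption: the failing
# 'item' value is a str subclass (for example Ansible's templated string type)
# that PyYAML's SafeDumper has no exact-type representer for; UnsafeText is a
# hypothetical stand-in for it.
import yaml


class UnsafeText(str):
    """Stand-in for a string subclass SafeDumper does not know how to dump."""


failure = {"results": [{"item": UnsafeText("mpatha1"), "rc": 1}]}

try:
    yaml.safe_dump(failure)
except yaml.representer.RepresenterError as err:
    print(err)  # e.g. "cannot represent an object: mpatha1"

# Coercing values back to plain built-in types before dumping avoids the crash.
plain = {"results": [{"item": str(failure["results"][0]["item"]), "rc": 1}]}
print(yaml.safe_dump(plain))
```

Coercing such values to built-in types (or dumping with the full, non-safe Dumper) before logging would avoid this secondary crash in the callback; the underlying wipefs failures on the mpatha partitions would still need to be addressed separately.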
fail | 3931549 | 2019-05-06 03:59:28 | 2019-05-06 03:59:32 | 2019-05-06 04:51:31 | 0:51:59 | 0:05:31 | 0:46:28 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason:
Command failed on mira016 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target' |
fail | 3931550 | 2019-05-06 03:59:28 | 2019-05-06 03:59:31 | 2019-05-06 06:45:33 | 2:46:02 | 0:20:23 | 2:25:39 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
Command failed on mira030 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health' |
fail | 3931551 | 2019-05-06 03:59:29 | 2019-05-06 03:59:31 | 2019-05-06 04:21:31 | 0:22:00 | 0:05:36 | 0:16:24 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason:
Command failed on mira010 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target' |
fail | 3931552 | 2019-05-06 03:59:30 | 2019-05-06 03:59:32 | 2019-05-06 06:53:34 | 2:54:02 | 0:10:34 | 2:43:28 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason:
Command failed on mira065 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target' |
fail | 3931553 | 2019-05-06 03:59:31 | 2019-05-06 03:59:32 | 2019-05-06 06:03:33 | 2:04:01 | 0:10:32 | 1:53:29 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
Command failed on mira034 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health' |
fail | 3931554 | 2019-05-06 03:59:31 | 2019-05-06 04:21:45 | 2019-05-06 06:01:46 | 1:40:01 | 0:10:24 | 1:29:37 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason:
Command failed on mira030 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target' |
fail | 3931555 | 2019-05-06 03:59:32 | 2019-05-06 04:31:31 | 2019-05-06 05:09:31 | 0:38:00 | 0:10:56 | 0:27:04 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
Command failed on mira034 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health' |
fail | 3931556 | 2019-05-06 03:59:33 | 2019-05-06 04:41:45 | 2019-05-06 05:27:45 | 0:46:00 | 0:10:24 | 0:35:36 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
Command failed on mira030 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target' |
fail | 3931557 | 2019-05-06 03:59:33 | 2019-05-06 04:43:45 | 2019-05-06 05:33:44 | 0:49:59 | 0:10:06 | 0:39:53 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason:
Command failed on mira038 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health' |
fail | 3931558 | 2019-05-06 03:59:34 | 2019-05-06 04:51:46 | 2019-05-06 06:07:46 | 1:16:00 | 0:10:31 | 1:05:29 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
Command failed on mira038 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target' |
fail | 3931559 | 2019-05-06 03:59:35 | 2019-05-06 05:07:45 | 2019-05-06 05:37:45 | 0:30:00 | 0:12:47 | 0:17:13 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason:
Command failed on mira034 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health' |
fail | 3931560 | 2019-05-06 03:59:36 | 2019-05-06 05:09:46 | 2019-05-06 06:25:46 | 1:16:00 | 0:11:41 | 1:04:19 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason:
Command failed on mira046 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health' |
fail | 3931561 | 2019-05-06 03:59:36 | 2019-05-06 05:27:46 | 2019-05-06 07:03:47 | 1:36:01 | 0:10:40 | 1:25:21 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
Command failed on mira010 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target' |
fail | 3931562 | 2019-05-06 03:59:37 | 2019-05-06 05:33:45 | 2019-05-06 07:17:46 | 1:44:01 | 0:10:25 | 1:33:36 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason:
Command failed on mira034 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health' |
fail | 3931563 | 2019-05-06 03:59:38 | 2019-05-06 05:37:59 | 2019-05-06 07:48:00 | 2:10:01 | 0:10:22 | 1:59:39 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
Command failed on mira010 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target' |
fail | 3931564 | 2019-05-06 03:59:38 | 2019-05-06 06:01:47 | 2019-05-06 10:43:51 | 4:42:04 | 0:10:44 | 4:31:20 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
Command failed on mira030 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health' |
fail | 3931565 | 2019-05-06 03:59:39 | 2019-05-06 06:03:47 | 2019-05-06 07:11:47 | 1:08:00 | 0:10:32 | 0:57:28 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason:
Command failed on mira041 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target' |
fail | 3931566 | 2019-05-06 03:59:40 | 2019-05-06 06:08:00 | 2019-05-06 07:28:01 | 1:20:01 | 0:10:50 | 1:09:11 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
Command failed on mira030 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health' |
fail | 3931567 | 2019-05-06 03:59:40 | 2019-05-06 06:26:02 | 2019-05-06 08:02:02 | 1:36:00 | 0:10:31 | 1:25:29 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason:
Command failed on mira030 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target' |
fail | 3931568 | 2019-05-06 03:59:41 | 2019-05-06 06:27:47 | 2019-05-06 09:09:49 | 2:42:02 | 0:06:09 | 2:35:53 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason:
Command failed on mira030 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target' |
fail | 3931569 | 2019-05-06 03:59:42 | 2019-05-06 06:45:34 | 2019-05-06 08:49:35 | 2:04:01 | 0:21:02 | 1:42:59 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
Command failed on mira030 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health' |
fail | 3931570 | 2019-05-06 03:59:42 | 2019-05-06 06:53:35 | 2019-05-06 08:07:35 | 1:14:00 | 0:04:56 | 1:09:04 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason:
Command failed on mira058 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target' |
fail | 3931571 | 2019-05-06 03:59:43 | 2019-05-06 07:04:03 | 2019-05-06 08:52:03 | 1:48:00 | 0:21:32 | 1:26:28 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
Command failed on mira058 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health' |
fail | 3931572 | 2019-05-06 03:59:44 | 2019-05-06 07:12:01 | 2019-05-06 07:38:01 | 0:26:00 | 0:04:48 | 0:21:12 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
Command failed on mira034 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target' |
fail | 3931573 | 2019-05-06 03:59:45 | 2019-05-06 07:17:48 | 2019-05-06 08:13:48 | 0:56:00 | 0:20:54 | 0:35:06 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason:
Command failed on mira034 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health' |
fail | 3931574 | 2019-05-06 03:59:45 | 2019-05-06 07:28:16 | 2019-05-06 10:06:17 | 2:38:01 | 0:05:31 | 2:32:30 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
Command failed on mira010 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target' |
fail | 3931575 | 2019-05-06 03:59:46 | 2019-05-06 07:38:02 | 2019-05-06 10:38:04 | 3:00:02 | 0:21:29 | 2:38:33 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason:
Command failed on mira041 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health' |