User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail | Dead |
---|---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-05-04 03:59:02 | 2019-05-05 06:28:30 | 2019-05-05 19:59:02 | 13:30:32 | ceph-deploy | master | mira | bb4bceb | 31 | 1 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes | Links |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3925866 | 2019-05-04 03:59:23 | 2019-05-05 06:16:14 | 2019-05-05 13:10:20 | 6:54:06 | 0:24:27 | 6:29:39 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira030 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 3925867 | 2019-05-04 03:59:24 | 2019-05-05 06:26:30 | 2019-05-05 06:52:29 | 0:25:59 | 0:05:18 | 0:20:41 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira038 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 3925868 | 2019-05-04 03:59:24 | 2019-05-05 06:28:30 | 2019-05-05 07:36:30 | 1:08:00 | 0:20:57 | 0:47:03 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira038 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 3925869 | 2019-05-04 03:59:25 | 2019-05-05 06:34:16 | 2019-05-05 07:34:16 | 1:00:00 | 0:04:54 | 0:55:06 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira030 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 3925870 | 2019-05-04 03:59:26 | 2019-05-05 06:42:23 | 2019-05-05 07:52:23 | 1:10:00 | 0:21:29 | 0:48:31 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira065 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 3925871 | 2019-05-04 03:59:26 | 2019-05-05 06:48:31 | 2019-05-05 07:56:32 | 1:08:01 | 0:05:25 | 1:02:36 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira038 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 3925872 | 2019-05-04 03:59:27 | 2019-05-05 06:52:32 | 2019-05-05 18:50:43 | 11:58:11 | 0:08:19 | 11:49:52 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
ory', u'rc': 1, u'msg': u'non-zero return code', 'stdout_lines': [], u'start': u'2019-05-05 18:48:55.011719', u'failed': True}, {'_ansible_parsed': True, 'stderr_lines': [u'wipefs: error: /dev/mpatha5: probing initialization failed: No such file or directory', u'wipefs: error: /dev/mpatha5: probing initialization failed: No such file or directory'], u'cmd': u'wipefs --force --all /dev/mpatha5 || wipefs --all /dev/mpatha5', u'end': u'2019-05-05 18:48:55.445705', '_ansible_no_log': False, u'stdout': u'', '_ansible_item_result': True, u'changed': True, u'invocation': {u'module_args': {u'creates': None, u'executable': None, u'_uses_shell': True, u'_raw_params': u'wipefs --force --all /dev/mpatha5 || wipefs --all /dev/mpatha5', u'removes': None, u'warn': True, u'chdir': None, u'stdin': None}}, 'item': u'mpatha5', u'delta': u'0:00:00.011697', u'stderr': u'wipefs: error: /dev/mpatha5: probing initialization failed: No such file or directory\nwipefs: error: /dev/mpatha5: probing initialization failed: No such file or directory', u'rc': 1, u'msg': u'non-zero return code', 'stdout_lines': [], u'start': u'2019-05-05 18:48:55.434008', u'failed': True}, {'_ansible_parsed': True, 'stderr_lines': [u'wipefs: error: /dev/mpatha1: probing initialization failed: No such file or directory', u'wipefs: error: /dev/mpatha1: probing initialization failed: No such file or directory'], u'cmd': u'wipefs --force --all /dev/mpatha1 || wipefs --all /dev/mpatha1', u'end': u'2019-05-05 18:48:55.945463', '_ansible_no_log': False, u'stdout': u'', '_ansible_item_result': True, u'changed': True, u'invocation': {u'module_args': {u'creates': None, u'executable': None, u'_uses_shell': True, u'_raw_params': u'wipefs --force --all /dev/mpatha1 || wipefs --all /dev/mpatha1', u'removes': None, u'warn': True, u'chdir': None, u'stdin': None}}, 'item': u'mpatha1', u'delta': u'0:00:00.011513', u'stderr': u'wipefs: error: /dev/mpatha1: probing initialization failed: No such file or directory\nwipefs: error: 
/dev/mpatha1: probing initialization failed: No such file or directory', u'rc': 1, u'msg': u'non-zero return code', 'stdout_lines': [], u'start': u'2019-05-05 18:48:55.933950', u'failed': True}, {'_ansible_parsed': True, 'stderr_lines': [u'wipefs: error: /dev/mpatha2: probing initialization failed: No such file or directory', u'wipefs: error: /dev/mpatha2: probing initialization failed: No such file or directory'], u'cmd': u'wipefs --force --all /dev/mpatha2 || wipefs --all /dev/mpatha2', u'end': u'2019-05-05 18:48:56.347475', '_ansible_no_log': False, u'stdout': u'', '_ansible_item_result': True, u'changed': True, u'invocation': {u'module_args': {u'creates': None, u'executable': None, u'_uses_shell': True, u'_raw_params': u'wipefs --force --all /dev/mpatha2 || wipefs --all /dev/mpatha2', u'removes': None, u'warn': True, u'chdir': None, u'stdin': None}}, 'item': u'mpatha2', u'delta': u'0:00:00.011249', u'stderr': u'wipefs: error: /dev/mpatha2: probing initialization failed: No such file or directory\nwipefs: error: /dev/mpatha2: probing initialization failed: No such file or directory', u'rc': 1, u'msg': u'non-zero return code', 'stdout_lines': [], u'start': u'2019-05-05 18:48:56.336226', u'failed': True}, {'_ansible_parsed': True, 'stderr_lines': [u'wipefs: error: /dev/mpatha5: probing initialization failed: No such file or directory', u'wipefs: error: /dev/mpatha5: probing initialization failed: No such file or directory'], u'cmd': u'wipefs --force --all /dev/mpatha5 || wipefs --all /dev/mpatha5', u'end': u'2019-05-05 18:48:56.794985', '_ansible_no_log': False, u'stdout': u'', '_ansible_item_result': True, u'changed': True, u'invocation': {u'module_args': {u'creates': None, u'executable': None, u'_uses_shell': True, u'_raw_params': u'wipefs --force --all /dev/mpatha5 || wipefs --all /dev/mpatha5', u'removes': None, u'warn': True, u'chdir': None, u'stdin': None}}, 'item': u'mpatha5', u'delta': u'0:00:00.010057', u'stderr': u'wipefs: error: /dev/mpatha5: probing 
initialization failed: No such file or directory\nwipefs: error: /dev/mpatha5: probing initialization failed: No such file or directory', u'rc': 1, u'msg': u'non-zero return code', 'stdout_lines': [], u'start': u'2019-05-05 18:48:56.784928', u'failed': True}]}}Traceback (most recent call last): File "/home/teuthworker/src/git.ceph.com_git_ceph-cm-ansible_master/callback_plugins/failure_log.py", line 44, in log_failure log.error(yaml.safe_dump(failure)) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 218, in safe_dump return dump_all([data], stream, Dumper=SafeDumper, **kwds) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 190, in dump_all dumper.represent(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 28, in represent node = self.represent_data(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 57, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 225, in represent_dict return self.represent_mapping(u'tag:yaml.org,2002:map', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 123, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 57, in represent_data node = self.yaml_representers[data_types[0]](self, data) File 
"/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 225, in represent_dict return self.represent_mapping(u'tag:yaml.org,2002:map', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 123, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 57, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 217, in represent_list return self.represent_sequence(u'tag:yaml.org,2002:seq', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 101, in represent_sequence node_item = self.represent_data(item) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 57, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 225, in represent_dict return self.represent_mapping(u'tag:yaml.org,2002:map', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 123, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 67, in represent_data node = self.yaml_representers[None](self, data) File 
"/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 249, in represent_undefined raise RepresenterError("cannot represent an object: %s" % data)RepresenterError: cannot represent an object: mpatha1 |
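The `RepresenterError` in the traceback above comes from PyYAML's safe dumper, which only serializes objects whose exact type has a registered representer; Ansible wraps values such as device names in a `str` subclass, which falls through to `represent_undefined` and raises. A minimal sketch of the failure mode (the `Unsafe` class here is a hypothetical stand-in for Ansible's `AnsibleUnsafeText`):

```python
import yaml

class Unsafe(str):
    """Hypothetical stand-in for Ansible's AnsibleUnsafeText, a str subclass."""
    pass

try:
    # safe_dump looks up representers by exact type, so a str *subclass*
    # is not matched by the plain-str representer and cannot be dumped.
    yaml.safe_dump({"item": Unsafe("mpatha1")})
except yaml.representer.RepresenterError as exc:
    print("RepresenterError:", exc)
```

This is why the callback plugin's `yaml.safe_dump(failure)` call blows up even though the underlying failure dict looks like plain strings and lists; converting such values with `str(...)` before dumping would avoid it.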
fail | 3925873 | 2019-05-04 03:59:28 | 2019-05-05 06:52:32 | 2019-05-05 09:36:34 | 2:44:02 | 0:05:22 | 2:38:40 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira030 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 3925874 | 2019-05-04 03:59:28 | 2019-05-05 06:56:24 | 2019-05-05 08:26:24 | 1:30:00 | 0:10:55 | 1:19:05 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira010 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 3925875 | 2019-05-04 03:59:29 | 2019-05-05 07:00:21 | 2019-05-05 08:36:21 | 1:36:00 | 0:10:59 | 1:25:01 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira030 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 3925876 | 2019-05-04 03:59:30 | 2019-05-05 07:16:05 | 2019-05-05 08:12:09 | 0:56:04 | 0:10:37 | 0:45:27 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira030 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 3925877 | 2019-05-04 03:59:30 | 2019-05-05 07:34:33 | 2019-05-05 08:50:34 | 1:16:01 | 0:11:14 | 1:04:47 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira046 with status 1: 'sudo tar cz -f /tmp/tmpP3TCUP -C /var/lib/ceph/mon -- .'
fail | 3925878 | 2019-05-04 03:59:31 | 2019-05-05 07:36:35 | 2019-05-05 09:16:35 | 1:40:00 | 0:10:47 | 1:29:13 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira030 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 3925879 | 2019-05-04 03:59:32 | 2019-05-05 07:52:28 | 2019-05-05 08:14:27 | 0:21:59 | | | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Could not reconnect to ubuntu@mira034.front.sepia.ceph.com
dead | 3925880 | 2019-05-04 03:59:32 | 2019-05-05 07:56:36 | 2019-05-05 19:59:02 | 12:02:26 | | | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
fail | 3925881 | 2019-05-04 03:59:33 | 2019-05-05 08:12:16 | 2019-05-05 09:30:16 | 1:18:00 | 0:12:05 | 1:05:55 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira071 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 3925882 | 2019-05-04 03:59:34 | 2019-05-05 08:14:50 | 2019-05-05 09:02:49 | 0:47:59 | 0:11:56 | 0:36:03 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira041 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 3925883 | 2019-05-04 03:59:34 | 2019-05-05 08:26:42 | 2019-05-05 10:50:43 | 2:24:01 | 0:10:44 | 2:13:17 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira041 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 3925884 | 2019-05-04 03:59:35 | 2019-05-05 08:36:38 | 2019-05-05 10:00:38 | 1:24:00 | 0:10:18 | 1:13:42 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira030 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 3925885 | 2019-05-04 03:59:36 | 2019-05-05 08:50:48 | 2019-05-05 15:22:53 | 6:32:05 | 0:08:32 | 6:23:33 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
ory', u'rc': 1, u'msg': u'non-zero return code', 'stdout_lines': [], u'start': u'2019-05-05 15:20:53.772130', u'failed': True}, {'_ansible_parsed': True, 'stderr_lines': [u'wipefs: error: /dev/mpatha1: probing initialization failed: No such file or directory', u'wipefs: error: /dev/mpatha1: probing initialization failed: No such file or directory'], u'cmd': u'wipefs --force --all /dev/mpatha1 || wipefs --all /dev/mpatha1', u'end': u'2019-05-05 15:20:54.265785', '_ansible_no_log': False, u'stdout': u'', '_ansible_item_result': True, u'changed': True, u'invocation': {u'module_args': {u'creates': None, u'executable': None, u'_uses_shell': True, u'_raw_params': u'wipefs --force --all /dev/mpatha1 || wipefs --all /dev/mpatha1', u'removes': None, u'warn': True, u'chdir': None, u'stdin': None}}, 'item': u'mpatha1', u'delta': u'0:00:00.012226', u'stderr': u'wipefs: error: /dev/mpatha1: probing initialization failed: No such file or directory\nwipefs: error: /dev/mpatha1: probing initialization failed: No such file or directory', u'rc': 1, u'msg': u'non-zero return code', 'stdout_lines': [], u'start': u'2019-05-05 15:20:54.253559', u'failed': True}, {'_ansible_parsed': True, 'stderr_lines': [u'wipefs: error: /dev/mpatha2: probing initialization failed: No such file or directory', u'wipefs: error: /dev/mpatha2: probing initialization failed: No such file or directory'], u'cmd': u'wipefs --force --all /dev/mpatha2 || wipefs --all /dev/mpatha2', u'end': u'2019-05-05 15:20:54.715976', '_ansible_no_log': False, u'stdout': u'', '_ansible_item_result': True, u'changed': True, u'invocation': {u'module_args': {u'creates': None, u'executable': None, u'_uses_shell': True, u'_raw_params': u'wipefs --force --all /dev/mpatha2 || wipefs --all /dev/mpatha2', u'removes': None, u'warn': True, u'chdir': None, u'stdin': None}}, 'item': u'mpatha2', u'delta': u'0:00:00.011918', u'stderr': u'wipefs: error: /dev/mpatha2: probing initialization failed: No such file or directory\nwipefs: error: 
/dev/mpatha2: probing initialization failed: No such file or directory', u'rc': 1, u'msg': u'non-zero return code', 'stdout_lines': [], u'start': u'2019-05-05 15:20:54.704058', u'failed': True}, {'_ansible_parsed': True, 'stderr_lines': [u'wipefs: error: /dev/mpatha1: probing initialization failed: No such file or directory', u'wipefs: error: /dev/mpatha1: probing initialization failed: No such file or directory'], u'cmd': u'wipefs --force --all /dev/mpatha1 || wipefs --all /dev/mpatha1', u'end': u'2019-05-05 15:20:55.263776', '_ansible_no_log': False, u'stdout': u'', '_ansible_item_result': True, u'changed': True, u'invocation': {u'module_args': {u'creates': None, u'executable': None, u'_uses_shell': True, u'_raw_params': u'wipefs --force --all /dev/mpatha1 || wipefs --all /dev/mpatha1', u'removes': None, u'warn': True, u'chdir': None, u'stdin': None}}, 'item': u'mpatha1', u'delta': u'0:00:00.011447', u'stderr': u'wipefs: error: /dev/mpatha1: probing initialization failed: No such file or directory\nwipefs: error: /dev/mpatha1: probing initialization failed: No such file or directory', u'rc': 1, u'msg': u'non-zero return code', 'stdout_lines': [], u'start': u'2019-05-05 15:20:55.252329', u'failed': True}, {'_ansible_parsed': True, 'stderr_lines': [u'wipefs: error: /dev/mpatha2: probing initialization failed: No such file or directory', u'wipefs: error: /dev/mpatha2: probing initialization failed: No such file or directory'], u'cmd': u'wipefs --force --all /dev/mpatha2 || wipefs --all /dev/mpatha2', u'end': u'2019-05-05 15:20:55.711170', '_ansible_no_log': False, u'stdout': u'', '_ansible_item_result': True, u'changed': True, u'invocation': {u'module_args': {u'creates': None, u'executable': None, u'_uses_shell': True, u'_raw_params': u'wipefs --force --all /dev/mpatha2 || wipefs --all /dev/mpatha2', u'removes': None, u'warn': True, u'chdir': None, u'stdin': None}}, 'item': u'mpatha2', u'delta': u'0:00:00.010915', u'stderr': u'wipefs: error: /dev/mpatha2: probing 
initialization failed: No such file or directory\nwipefs: error: /dev/mpatha2: probing initialization failed: No such file or directory', u'rc': 1, u'msg': u'non-zero return code', 'stdout_lines': [], u'start': u'2019-05-05 15:20:55.700255', u'failed': True}]}}Traceback (most recent call last): File "/home/teuthworker/src/git.ceph.com_git_ceph-cm-ansible_master/callback_plugins/failure_log.py", line 44, in log_failure log.error(yaml.safe_dump(failure)) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 218, in safe_dump return dump_all([data], stream, Dumper=SafeDumper, **kwds) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 190, in dump_all dumper.represent(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 28, in represent node = self.represent_data(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 57, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 225, in represent_dict return self.represent_mapping(u'tag:yaml.org,2002:map', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 123, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 57, in represent_data node = self.yaml_representers[data_types[0]](self, data) File 
"/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 225, in represent_dict return self.represent_mapping(u'tag:yaml.org,2002:map', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 123, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 57, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 217, in represent_list return self.represent_sequence(u'tag:yaml.org,2002:seq', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 101, in represent_sequence node_item = self.represent_data(item) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 57, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 225, in represent_dict return self.represent_mapping(u'tag:yaml.org,2002:map', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 123, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 67, in represent_data node = self.yaml_representers[None](self, data) File 
"/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 249, in represent_undefined raise RepresenterError("cannot represent an object: %s" % data)RepresenterError: cannot represent an object: mpatha1 |
fail | 3925886 | 2019-05-04 03:59:36 | 2019-05-05 09:02:57 | 2019-05-05 11:18:58 | 2:16:01 | 0:10:42 | 2:05:19 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira030 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 3925887 | 2019-05-04 03:59:37 | 2019-05-05 09:16:51 | 2019-05-05 10:04:51 | 0:48:00 | 0:10:31 | 0:37:29 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira046 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 3925888 | 2019-05-04 03:59:38 | 2019-05-05 09:30:31 | 2019-05-05 15:48:37 | 6:18:06 | 0:10:43 | 6:07:23 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira071 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 3925889 | 2019-05-04 03:59:38 | 2019-05-05 09:36:49 | 2019-05-05 10:34:49 | 0:58:00 | 0:10:36 | 0:47:24 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira030 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 3925890 | 2019-05-04 03:59:39 | 2019-05-05 10:00:57 | 2019-05-05 10:24:56 | 0:23:59 | 0:04:47 | 0:19:12 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira010 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 3925891 | 2019-05-04 03:59:40 | 2019-05-05 10:04:54 | 2019-05-05 12:24:56 | 2:20:02 | 0:20:48 | 1:59:14 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira030 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 3925892 | 2019-05-04 03:59:40 | 2019-05-05 10:25:00 | 2019-05-05 10:53:00 | 0:28:00 | 0:04:59 | 0:23:01 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira030 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 3925893 | 2019-05-04 03:59:41 | 2019-05-05 10:35:04 | 2019-05-05 11:27:04 | 0:52:00 | 0:21:12 | 0:30:48 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira041 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 3925894 | 2019-05-04 03:59:42 | 2019-05-05 10:50:59 | 2019-05-05 11:36:58 | 0:45:59 | 0:04:45 | 0:41:14 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira030 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 3925895 | 2019-05-04 03:59:43 | 2019-05-05 10:53:14 | 2019-05-05 12:59:15 | 2:06:01 | 0:20:56 | 1:45:05 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira046 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 3925896 | 2019-05-04 03:59:43 | 2019-05-05 11:19:11 | 2019-05-05 11:45:11 | 0:26:00 | 0:04:39 | 0:21:21 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira046 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 3925897 | 2019-05-04 03:59:44 | 2019-05-05 11:27:18 | 2019-05-05 12:23:18 | 0:56:00 | 0:21:51 | 0:34:09 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira046 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'