User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail |
---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-08-02 05:55:03 | 2019-08-02 05:55:43 | 2019-08-02 09:58:26 | 4:02:43 | ceph-deploy | nautilus | mira | 59177f7 | 32 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 4174211 | 2019-08-02 05:55:25 | 2019-08-02 05:55:43 | 2019-08-02 06:55:42 | 0:59:59 | 0:27:07 | 0:32:52 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira038 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174213 | 2019-08-02 05:55:26 | 2019-08-02 05:55:43 | 2019-08-02 06:23:42 | 0:27:59 | 0:09:27 | 0:18:32 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira100 with status 1: 'sudo tar cz -f /tmp/tmpGzt4ui -C /var/lib/ceph/mon -- .'
fail | 4174215 | 2019-08-02 05:55:26 | 2019-08-02 05:55:43 | 2019-08-02 06:51:43 | 0:56:00 | 0:23:24 | 0:32:36 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira082 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174217 | 2019-08-02 05:55:27 | 2019-08-02 05:55:43 | 2019-08-02 06:19:42 | 0:23:59 | 0:10:56 | 0:13:03 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira032 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174219 | 2019-08-02 05:55:28 | 2019-08-02 05:55:43 | 2019-08-02 06:47:43 | 0:52:00 | 0:23:03 | 0:28:57 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira034 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174221 | 2019-08-02 05:55:28 | 2019-08-02 05:55:43 | 2019-08-02 06:17:42 | 0:21:59 | 0:09:28 | 0:12:31 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira066 with status 1: 'sudo tar cz -f /tmp/tmpHZjAWn -C /var/lib/ceph/mon -- .'
fail | 4174222 | 2019-08-02 05:55:29 | 2019-08-02 06:18:00 | 2019-08-02 07:20:00 | 1:02:00 | 0:33:29 | 0:28:31 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira063 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174224 | 2019-08-02 05:55:30 | 2019-08-02 06:23:59 | 2019-08-02 07:02:03 | 0:38:04 | 0:11:41 | 0:26:23 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira032 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174227 | 2019-08-02 05:55:31 | 2019-08-02 06:38:02 | 2019-08-02 07:36:01 | 0:57:59 | 0:25:32 | 0:32:27 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira038 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174229 | 2019-08-02 05:55:31 | 2019-08-02 06:47:50 | 2019-08-02 07:13:49 | 0:25:59 | 0:10:55 | 0:15:04 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira077 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174231 | 2019-08-02 05:55:32 | 2019-08-02 06:55:59 | 2019-08-02 07:51:59 | 0:56:00 | 0:25:39 | 0:30:21 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira059 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174233 | 2019-08-02 05:55:33 | 2019-08-02 07:01:54 | 2019-08-02 07:25:53 | 0:23:59 | 0:11:08 | 0:12:51 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira032 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174235 | 2019-08-02 05:55:34 | 2019-08-02 07:08:17 | 2019-08-02 08:10:17 | 1:02:00 | 0:21:50 | 0:40:10 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira063 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174237 | 2019-08-02 05:55:34 | 2019-08-02 07:14:00 | 2019-08-02 07:48:00 | 0:34:00 | 0:10:54 | 0:23:06 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira032 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174239 | 2019-08-02 05:55:35 | 2019-08-02 07:26:12 | 2019-08-02 08:16:11 | 0:49:59 | 0:21:40 | 0:28:19 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira038 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174240 | 2019-08-02 05:55:36 | 2019-08-02 07:31:57 | 2019-08-02 07:59:56 | 0:27:59 | 0:12:06 | 0:15:53 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: ceph-deploy: Failed to create osds
fail | 4174242 | 2019-08-02 05:55:36 | 2019-08-02 07:38:09 | 2019-08-02 08:10:08 | 0:31:59 | 0:11:07 | 0:20:52 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira032 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174244 | 2019-08-02 05:55:37 | 2019-08-02 07:40:12 | 2019-08-02 08:32:11 | 0:51:59 | 0:22:22 | 0:29:37 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira030 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174246 | 2019-08-02 05:55:38 | 2019-08-02 07:52:02 | 2019-08-02 08:16:01 | 0:23:59 | 0:11:25 | 0:12:34 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira059 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174248 | 2019-08-02 05:55:39 | 2019-08-02 07:58:21 | 2019-08-02 09:02:20 | 1:03:59 | 0:25:56 | 0:38:03 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira063 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174250 | 2019-08-02 05:55:39 | 2019-08-02 08:06:26 | 2019-08-02 08:32:25 | 0:25:59 | 0:10:59 | 0:15:00 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira002 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174252 | 2019-08-02 05:55:40 | 2019-08-02 08:10:25 | 2019-08-02 08:54:24 | 0:43:59 | 0:22:16 | 0:21:43 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira072 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174254 | 2019-08-02 05:55:41 | 2019-08-02 08:16:05 | 2019-08-02 08:54:05 | 0:38:00 | 0:10:35 | 0:27:25 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira032 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174256 | 2019-08-02 05:55:42 | 2019-08-02 08:22:23 | 2019-08-02 08:52:23 | 0:30:00 | 0:08:25 | 0:21:35 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason (truncated): An ansible wipefs task failed on /dev/mpatha2 and /dev/mpatha5 with rc 1: 'wipefs: error: /dev/mpathaN: probing initialization failed: No such file or directory'. The failure_log.py callback plugin then crashed while logging the failure via yaml.safe_dump: RepresenterError: ('cannot represent an object', u'mpatha1')
fail | 4174258 | 2019-08-02 05:55:43 | 2019-08-02 08:28:28 | 2019-08-02 08:54:27 | 0:25:59 | 0:11:05 | 0:14:54 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira030 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174260 | 2019-08-02 05:55:44 | 2019-08-02 08:32:28 | 2019-08-02 09:40:28 | 1:08:00 | 0:22:03 | 0:45:57 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira041 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174262 | 2019-08-02 05:55:45 | 2019-08-02 08:52:22 | 2019-08-02 09:16:21 | 0:23:59 | 0:10:51 | 0:13:08 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira072 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174264 | 2019-08-02 05:55:46 | 2019-08-02 08:54:22 | 2019-08-02 09:32:21 | 0:37:59 | 0:21:37 | 0:16:22 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira002 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174266 | 2019-08-02 05:55:46 | 2019-08-02 08:54:28 | 2019-08-02 09:24:28 | 0:30:00 | 0:11:15 | 0:18:45 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira059 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174268 | 2019-08-02 05:55:47 | 2019-08-02 09:02:27 | 2019-08-02 09:58:26 | 0:55:59 | 0:20:57 | 0:35:02 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason: Command failed on mira072 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174270 | 2019-08-02 05:55:48 | 2019-08-02 09:12:26 | 2019-08-02 09:42:25 | 0:29:59 | 0:10:10 | 0:19:49 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason: Command failed on mira064 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4174272 | 2019-08-02 05:55:48 | 2019-08-02 09:16:24 | 2019-08-02 09:56:22 | 0:39:58 | 0:08:29 | 0:31:29 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 | |
Failure Reason (truncated): An ansible wipefs task wiped /dev/sdc1 successfully but failed on /dev/mpatha1 and /dev/mpatha5 with rc 1: 'wipefs: error: /dev/mpathaN: probing initialization failed: No such file or directory'. The failure_log.py callback plugin then crashed while logging the failure via yaml.safe_dump: RepresenterError: ('cannot represent an object', u'mpatha1')