User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail | Dead |
---|---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-07-08 03:59:03 | 2019-07-08 09:49:54 | 2019-07-09 02:25:21 | 16:35:27 | ceph-deploy | master | mira | b701073 | 25 | 7 |
Status | Job ID | Links | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 4102074 | | 2019-07-08 03:59:35 | 2019-07-08 09:48:36 | 2019-07-08 10:48:35 | 0:59:59 | 0:20:14 | 0:39:45 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 |
Failure Reason: Command failed on mira038 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
dead | 4102075 | | 2019-07-08 03:59:35 | 2019-07-08 09:49:54 | 2019-07-08 21:52:20 | 12:02:26 | | | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 |
fail | 4102076 | | 2019-07-08 03:59:36 | 2019-07-08 10:01:34 | 2019-07-08 10:31:33 | 0:29:59 | 0:08:09 | 0:21:50 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 |
Failure Reason:
Ansible failed while wiping multipath devices: `wipefs --force --all /dev/mpatha2 || wipefs --all /dev/mpatha2` (and the same command for /dev/mpatha5) returned rc=1 with "wipefs: error: probing initialization failed: No such file or directory". The failure_log.py callback then crashed while dumping the failure details:
Traceback (most recent call last):
  File "/home/teuthworker/src/git.ceph.com_git_ceph-cm-ansible_master/callback_plugins/failure_log.py", line 44, in log_failure
    log.error(yaml.safe_dump(failure))
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 309, in safe_dump
    return dump_all([data], stream, Dumper=SafeDumper, **kwds)
  [intermediate yaml representer frames elided]
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 251, in represent_undefined
    raise RepresenterError("cannot represent an object", data)
RepresenterError: ('cannot represent an object', u'mpatha1')
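The RepresenterError above is the usual symptom of passing `yaml.safe_dump` a type it has no registered representer for; in Ansible results the device names are most likely `AnsibleUnsafeText` objects (a `str`/`unicode` subclass), which PyYAML's SafeDumper refuses. A minimal sketch, assuming any `str` subclass triggers the same failure (`UnsafeText` is a hypothetical stand-in, not Ansible's actual class):

```python
import yaml

# Hypothetical stand-in for Ansible's AnsibleUnsafeText: a plain str
# subclass that PyYAML's SafeDumper has no registered representer for.
class UnsafeText(str):
    pass

failure = {"item": UnsafeText("mpatha1")}

try:
    # safe_dump only serializes exact built-in types; the str subclass
    # falls through to represent_undefined, which raises.
    yaml.safe_dump(failure)
except yaml.representer.RepresenterError as e:
    print(e)  # ('cannot represent an object', 'mpatha1')
```

Casting such values back to plain `str` before dumping (or using `yaml.dump` with a default representer) would avoid the crash; the underlying wipefs error would then be logged instead of masked.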
dead | 4102077 | | 2019-07-08 03:59:37 | 2019-07-08 10:09:44 | 2019-07-08 22:12:07 | 12:02:23 | | | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 |
fail | 4102078 | | 2019-07-08 03:59:37 | 2019-07-08 10:11:50 | 2019-07-08 10:59:50 | 0:48:00 | 0:20:56 | 0:27:04 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 |
Failure Reason: Command failed on mira027 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4102079 | | 2019-07-08 03:59:38 | 2019-07-08 10:31:55 | 2019-07-08 22:34:22 | 12:02:27 | 11:46:26 | 0:16:01 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 |
Failure Reason: psutil.NoSuchProcess process no longer exists (pid=4723)
fail | 4102080 | | 2019-07-08 03:59:39 | 2019-07-08 10:37:43 | 2019-07-08 11:33:43 | 0:56:00 | 0:20:13 | 0:35:47 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 |
Failure Reason: Command failed on mira038 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
dead | 4102081 | | 2019-07-08 03:59:39 | 2019-07-08 10:48:55 | 2019-07-08 22:51:23 | 12:02:28 | | | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 |
fail | 4102082 | | 2019-07-08 03:59:40 | 2019-07-08 10:54:03 | 2019-07-08 11:38:02 | 0:43:59 | 0:10:25 | 0:33:34 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 |
Failure Reason: Command failed on mira018 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 4102083 | | 2019-07-08 03:59:41 | 2019-07-08 10:59:58 | 2019-07-08 11:49:58 | 0:50:00 | 0:11:26 | 0:38:34 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 |
Failure Reason: Command failed on mira061 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4102084 | | 2019-07-08 03:59:41 | 2019-07-08 11:10:24 | 2019-07-08 12:14:24 | 1:04:00 | 0:10:20 | 0:53:40 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 |
Failure Reason: Command failed on mira038 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 4102085 | | 2019-07-08 03:59:42 | 2019-07-08 11:24:06 | 2019-07-08 12:40:06 | 1:16:00 | 0:11:39 | 1:04:21 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 |
Failure Reason: Command failed on mira061 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4102086 | | 2019-07-08 03:59:43 | 2019-07-08 11:34:06 | 2019-07-08 12:14:06 | 0:40:00 | 0:10:24 | 0:29:36 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 |
Failure Reason: Command failed on mira018 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 4102087 | | 2019-07-08 03:59:44 | 2019-07-08 11:38:24 | 2019-07-08 12:14:24 | 0:36:00 | 0:11:02 | 0:24:58 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 |
Failure Reason: Command failed on mira061 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4102088 | | 2019-07-08 03:59:44 | 2019-07-08 11:50:07 | 2019-07-08 12:48:07 | 0:58:00 | 0:10:20 | 0:47:40 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 |
Failure Reason: Command failed on mira018 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 4102089 | | 2019-07-08 03:59:45 | 2019-07-08 12:14:23 | 2019-07-08 12:48:22 | 0:33:59 | 0:13:12 | 0:20:47 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 |
Failure Reason: Command failed on mira038 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4102090 | | 2019-07-08 03:59:46 | 2019-07-08 12:14:25 | 2019-07-08 13:06:25 | 0:52:00 | 0:11:58 | 0:40:02 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 |
Failure Reason: Command failed on mira061 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4102091 | | 2019-07-08 03:59:46 | 2019-07-08 12:14:26 | 2019-07-08 13:12:25 | 0:57:59 | 0:10:20 | 0:47:39 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 |
Failure Reason: Command failed on mira027 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 4102092 | | 2019-07-08 03:59:47 | 2019-07-08 12:40:22 | 2019-07-08 13:18:22 | 0:38:00 | 0:10:56 | 0:27:04 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 |
Failure Reason: Command failed on mira038 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4102093 | | 2019-07-08 03:59:48 | 2019-07-08 12:48:25 | 2019-07-08 13:48:25 | 1:00:00 | 0:10:31 | 0:49:29 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 |
Failure Reason: Command failed on mira027 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 4102094 | | 2019-07-08 03:59:48 | 2019-07-08 12:48:25 | 2019-07-08 13:32:25 | 0:44:00 | 0:11:53 | 0:32:07 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 |
Failure Reason: Command failed on mira061 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4102095 | | 2019-07-08 03:59:49 | 2019-07-08 13:06:33 | 2019-07-08 13:58:33 | 0:52:00 | 0:10:18 | 0:41:42 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 |
Failure Reason: Command failed on mira018 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 4102096 | | 2019-07-08 03:59:50 | 2019-07-08 13:12:38 | 2019-07-08 13:50:38 | 0:38:00 | 0:11:11 | 0:26:49 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 |
Failure Reason: Command failed on mira038 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4102097 | | 2019-07-08 03:59:51 | 2019-07-08 13:18:30 | 2019-07-08 14:22:30 | 1:04:00 | 0:10:35 | 0:53:25 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 |
Failure Reason: Command failed on mira027 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
dead | 4102098 | | 2019-07-08 03:59:51 | 2019-07-08 13:32:42 | 2019-07-09 01:35:06 | 12:02:24 | | | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 |
fail | 4102099 | | 2019-07-08 03:59:52 | 2019-07-08 13:48:33 | 2019-07-08 14:34:33 | 0:46:00 | 0:21:01 | 0:24:59 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 |
Failure Reason: Command failed on mira038 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
dead | 4102100 | | 2019-07-08 03:59:53 | 2019-07-08 13:50:44 | 2019-07-09 01:53:07 | 12:02:23 | | | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 |
fail | 4102101 | | 2019-07-08 03:59:53 | 2019-07-08 13:58:42 | 2019-07-08 14:58:41 | 0:59:59 | 0:21:40 | 0:38:19 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4 |
Failure Reason: Command failed on mira064 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
fail | 4102102 | | 2019-07-08 03:59:54 | 2019-07-08 14:22:53 | 2019-07-09 02:25:21 | 12:02:28 | 11:10:17 | 0:52:11 | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 |
Failure Reason: psutil.NoSuchProcess process no longer exists (pid=23178)
fail | 4102103 | | 2019-07-08 03:59:55 | 2019-07-08 14:34:41 | 2019-07-08 20:28:47 | 5:54:06 | 0:21:07 | 5:32:59 | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4 |
Failure Reason: Command failed on mira046 with status 1: 'cd /home/ubuntu/cephtest && sudo ceph health'
dead | 4102104 | | 2019-07-08 03:59:56 | 2019-07-08 14:58:50 | 2019-07-08 16:02:50 | 1:04:00 | | | mira | master | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | — |
Failure Reason: reached maximum tries (60) after waiting for 900 seconds
dead | 4102105 | | 2019-07-08 03:59:56 | 2019-07-08 16:02:59 | 2019-07-08 22:07:04 | 6:04:05 | | | mira | master | centos | 7.6 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | — |
Failure Reason: reached maximum tries (60) after waiting for 900 seconds