Name | Machine Type | Up | Locked | Locked Since | Locked By | OS Type | OS Version | Arch | Description |
---|---|---|---|---|---|---|---|---|---|
mira063.front.sepia.ceph.com | mira | True | True | 2020-07-01 00:29:49.910515 | kyr@teuthology | centos | 7 | x86_64 | None |
Status | Job ID | Links | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 5132459 | 2020-06-10 05:10:27 | 2020-06-10 05:10:46 | 2020-06-10 05:38:45 | 0:27:59 | 0:13:54 | 0:14:05 | mira | py2 | centos | 7.4 | ceph-disk/basic/{distros/centos_latest tasks/ceph-disk} | 2 | |
Failure Reason: Command failed (workunit test ceph-disk/ceph-disk.sh) on mira063 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8e9675a830b7e7a7164659a2a8f8ad08b8f61358 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-disk/ceph-disk.sh'
fail | 5122576 | 2020-06-06 20:43:26 | 2020-06-06 21:34:00 | 2020-06-06 21:45:58 | 0:11:58 | 0:05:51 | 0:06:07 | mira | master | ubuntu | rgw:verify/{clusters/fixed-2 frontend/beast msgr-failures/few objectstore/bluestore-bitmap overrides proto/https rgw_pool_type/ec-profile tasks/{0-install cls ragweed s3tests-java s3tests} validater/valgrind} | 2 | ||
Failure Reason: No module named 'cStringIO'
fail | 5122566 | 2020-06-06 20:43:18 | 2020-06-06 21:21:40 | 2020-06-06 21:33:39 | 0:11:59 | 0:05:38 | 0:06:21 | mira | master | ubuntu | rgw:verify/{clusters/fixed-2 frontend/beast msgr-failures/few objectstore/filestore-xfs overrides proto/https rgw_pool_type/replicated tasks/{0-install cls ragweed s3tests-java s3tests} validater/valgrind} | 2 | ||
Failure Reason: No module named 'cStringIO'
fail | 5122557 | 2020-06-06 20:43:10 | 2020-06-06 21:09:25 | 2020-06-06 21:21:23 | 0:11:58 | 0:05:43 | 0:06:15 | mira | master | ubuntu | rgw:verify/{clusters/fixed-2 frontend/beast msgr-failures/few objectstore/filestore-xfs overrides proto/https rgw_pool_type/replicated tasks/{0-install cls ragweed s3tests-java s3tests} validater/lockdep} | 2 | ||
Failure Reason: No module named 'cStringIO'
fail | 5122548 | 2020-06-06 20:43:01 | 2020-06-06 20:57:23 | 2020-06-06 21:09:22 | 0:11:59 | 0:05:47 | 0:06:12 | mira | master | ubuntu | rgw:verify/{clusters/fixed-2 frontend/civetweb msgr-failures/few objectstore/filestore-xfs overrides proto/http rgw_pool_type/replicated tasks/{0-install cls ragweed s3tests-java s3tests} validater/valgrind} | 2 | ||
Failure Reason: No module named 'cStringIO'
fail | 5122543 | 2020-06-06 20:42:57 | 2020-06-06 20:43:08 | 2020-06-06 20:57:07 | 0:13:59 | 0:06:33 | 0:07:26 | mira | master | ubuntu | rgw:verify/{clusters/fixed-2 frontend/beast msgr-failures/few objectstore/bluestore-bitmap overrides proto/https rgw_pool_type/ec-profile tasks/{0-install cls ragweed s3tests-java s3tests} validater/lockdep} | 2 | ||
Failure Reason: No module named 'cStringIO'
pass | 5115574 | 2020-06-03 05:10:29 | 2020-06-03 05:10:48 | 2020-06-03 05:22:47 | 0:11:59 | 0:04:30 | 0:07:29 | mira | py2 | ubuntu | 16.04 | ceph-disk/basic/{distros/ubuntu_latest tasks/ceph-detect-init} | 1 | |
pass | 5112599 | 2020-06-02 14:42:48 | 2020-06-02 17:03:21 | 2020-06-02 17:19:20 | 0:15:59 | 0:09:58 | 0:06:01 | mira | master | rgw/multifs/{clusters/fixed-2.yaml frontend/civetweb.yaml objectstore/bluestore-bitmap.yaml overrides.yaml rgw_pool_type/ec.yaml tasks/rgw_ragweed.yaml} | 2 | |||
pass | 5112584 | 2020-06-02 14:42:33 | 2020-06-02 16:36:48 | 2020-06-02 16:52:47 | 0:15:59 | 0:09:04 | 0:06:55 | mira | master | rgw/multifs/{clusters/fixed-2.yaml frontend/civetweb.yaml objectstore/filestore-xfs.yaml overrides.yaml rgw_pool_type/replicated.yaml tasks/rgw_ragweed.yaml} | 2 | |||
fail | 5112566 | 2020-06-02 14:42:15 | 2020-06-02 15:45:44 | 2020-06-02 16:37:45 | 0:52:01 | 0:16:03 | 0:35:58 | mira | master | rgw/multifs/{clusters/fixed-2.yaml frontend/civetweb.yaml objectstore/filestore-xfs.yaml overrides.yaml rgw_pool_type/ec.yaml tasks/rgw_s3tests.yaml} | 2 | |||
Failure Reason: "2020-06-02T16:26:51.765898+0000 mon.b (mon.0) 182 : cluster [WRN] Health check failed: Reduced data availability: 2 pgs inactive, 2 pgs peering (PG_AVAILABILITY)" in cluster log
fail | 5112548 | 2020-06-02 14:41:58 | 2020-06-02 15:08:23 | 2020-06-02 15:46:23 | 0:38:00 | 0:26:06 | 0:11:54 | mira | master | ubuntu | rgw/multisite/{clusters.yaml frontend/civetweb.yaml omap_limits.yaml overrides.yaml realms/two-zonegroup.yaml tasks/test_multi.yaml valgrind.yaml} | 2 | ||
Failure Reason: "2020-06-02T15:24:10.821510+0000 mon.a (mon.0) 147 : cluster [WRN] Health check failed: Reduced data availability: 4 pgs inactive, 4 pgs peering (PG_AVAILABILITY)" in cluster log
fail | 5112533 | 2020-06-02 14:41:42 | 2020-06-02 14:50:56 | 2020-06-02 15:12:55 | 0:21:59 | 0:16:02 | 0:05:57 | mira | master | rgw/multifs/{clusters/fixed-2.yaml frontend/civetweb.yaml objectstore/filestore-xfs.yaml overrides.yaml rgw_pool_type/ec-profile.yaml tasks/rgw_s3tests.yaml} | 2 | |||
Failure Reason: "2020-06-02T15:03:11.430090+0000 mon.b (mon.0) 180 : cluster [WRN] Health check failed: Reduced data availability: 1 pg inactive, 1 pg peering (PG_AVAILABILITY)" in cluster log
dead | 5112523 | 2020-06-02 14:41:33 | 2020-06-02 14:41:53 | 2020-06-02 14:51:52 | 0:09:59 | 0:02:29 | 0:07:30 | mira | master | ubuntu | rgw/thrash/{civetweb.yaml clusters/fixed-2.yaml install.yaml objectstore/bluestore-bitmap.yaml thrasher/default.yaml thrashosds-health.yaml workload/rgw_bucket_quota.yaml} | 2 | ||
Failure Reason:
{'Failure object was': {'mira066.front.sepia.ceph.com': {'results': [{'changed': True, 'end': '2020-06-02 14:49:31.395968', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.025843', 'stderr': ' WARNING: Device for PV MK3qBX-AkmP-Gj0J-Cp47-UlXU-9LL5-FDeBX6 not found or rejected by a filter.\\n WARNING: Device for PV gZT2eh-57Bx-nzel-yDY7-CmkG-043l-hXVZ4M not found or rejected by a filter.\\n WARNING: Device for PV fvV7Ft-SdbD-iHkM-HblU-2rnd-1z5k-3FQ0sQ not found or rejected by a filter.\\n WARNING: Device for PV mh6ya9-6brd-XLu9-24iK-SC04-4Ec9-Kj6qA2 not found or rejected by a filter.\\n WARNING: Device for PV VyUHeG-6MGc-2Y9o-glcH-Nkuv-11YN-hwRUBg not found or rejected by a filter.\\n Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2020-06-02 14:49:31.370125', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' WARNING: Device for PV MK3qBX-AkmP-Gj0J-Cp47-UlXU-9LL5-FDeBX6 not found or rejected by a filter.', ' WARNING: Device for PV gZT2eh-57Bx-nzel-yDY7-CmkG-043l-hXVZ4M not found or rejected by a filter.', ' WARNING: Device for PV fvV7Ft-SdbD-iHkM-HblU-2rnd-1z5k-3FQ0sQ not found or rejected by a filter.', ' WARNING: Device for PV mh6ya9-6brd-XLu9-24iK-SC04-4Ec9-Kj6qA2 not found or rejected by a filter.', ' WARNING: Device for PV VyUHeG-6MGc-2Y9o-glcH-Nkuv-11YN-hwRUBg not found or rejected by a filter.', ' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2020-06-02 14:49:31.599937', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.027320', 'stderr': ' WARNING: Device for PV MK3qBX-AkmP-Gj0J-Cp47-UlXU-9LL5-FDeBX6 not found or rejected by a filter.\\n WARNING: Device for PV gZT2eh-57Bx-nzel-yDY7-CmkG-043l-hXVZ4M not found or rejected by a filter.\\n WARNING: Device for PV fvV7Ft-SdbD-iHkM-HblU-2rnd-1z5k-3FQ0sQ not found or rejected by a filter.\\n WARNING: Device for PV mh6ya9-6brd-XLu9-24iK-SC04-4Ec9-Kj6qA2 not found or rejected by a filter.\\n WARNING: Device for PV VyUHeG-6MGc-2Y9o-glcH-Nkuv-11YN-hwRUBg not found or rejected by a filter.\\n Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2020-06-02 14:49:31.572617', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' WARNING: Device for PV MK3qBX-AkmP-Gj0J-Cp47-UlXU-9LL5-FDeBX6 not found or rejected by a filter.', ' WARNING: Device for PV gZT2eh-57Bx-nzel-yDY7-CmkG-043l-hXVZ4M not found or rejected by a filter.', ' WARNING: Device for PV fvV7Ft-SdbD-iHkM-HblU-2rnd-1z5k-3FQ0sQ not found or rejected by a filter.', ' WARNING: Device for PV mh6ya9-6brd-XLu9-24iK-SC04-4Ec9-Kj6qA2 not found or rejected by a filter.', ' WARNING: Device for PV VyUHeG-6MGc-2Y9o-glcH-Nkuv-11YN-hwRUBg not found or rejected by a filter.', ' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', 
'_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2020-06-02 14:49:31.797345', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.026941', 'stderr': ' WARNING: Device for PV MK3qBX-AkmP-Gj0J-Cp47-UlXU-9LL5-FDeBX6 not found or rejected by a filter.\\n WARNING: Device for PV gZT2eh-57Bx-nzel-yDY7-CmkG-043l-hXVZ4M not found or rejected by a filter.\\n WARNING: Device for PV fvV7Ft-SdbD-iHkM-HblU-2rnd-1z5k-3FQ0sQ not found or rejected by a filter.\\n WARNING: Device for PV mh6ya9-6brd-XLu9-24iK-SC04-4Ec9-Kj6qA2 not found or rejected by a filter.\\n WARNING: Device for PV VyUHeG-6MGc-2Y9o-glcH-Nkuv-11YN-hwRUBg not found or rejected by a filter.\\n Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2020-06-02 14:49:31.770404', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' WARNING: Device for PV MK3qBX-AkmP-Gj0J-Cp47-UlXU-9LL5-FDeBX6 not found or rejected by a filter.', ' WARNING: Device for PV gZT2eh-57Bx-nzel-yDY7-CmkG-043l-hXVZ4M not found or rejected by a filter.', ' WARNING: Device for PV fvV7Ft-SdbD-iHkM-HblU-2rnd-1z5k-3FQ0sQ not found or rejected by a filter.', ' WARNING: Device for PV mh6ya9-6brd-XLu9-24iK-SC04-4Ec9-Kj6qA2 not found or rejected by a filter.', ' WARNING: Device for PV VyUHeG-6MGc-2Y9o-glcH-Nkuv-11YN-hwRUBg not found or rejected by a filter.', ' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2020-06-02 14:49:31.973446', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.028390', 'stderr': ' WARNING: Device for PV MK3qBX-AkmP-Gj0J-Cp47-UlXU-9LL5-FDeBX6 not found or rejected by a filter.\\n WARNING: Device for PV gZT2eh-57Bx-nzel-yDY7-CmkG-043l-hXVZ4M not found or rejected by a filter.\\n WARNING: Device for PV fvV7Ft-SdbD-iHkM-HblU-2rnd-1z5k-3FQ0sQ not found or rejected by a filter.\\n WARNING: Device for PV mh6ya9-6brd-XLu9-24iK-SC04-4Ec9-Kj6qA2 not found or rejected by a filter.\\n WARNING: Device for PV VyUHeG-6MGc-2Y9o-glcH-Nkuv-11YN-hwRUBg not found or rejected by a filter.\\n Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2020-06-02 14:49:31.945056', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' WARNING: Device for PV MK3qBX-AkmP-Gj0J-Cp47-UlXU-9LL5-FDeBX6 not found or rejected by a filter.', ' WARNING: Device for PV gZT2eh-57Bx-nzel-yDY7-CmkG-043l-hXVZ4M not found or rejected by a filter.', ' WARNING: Device for PV fvV7Ft-SdbD-iHkM-HblU-2rnd-1z5k-3FQ0sQ not found or rejected by a filter.', ' WARNING: Device for PV mh6ya9-6brd-XLu9-24iK-SC04-4Ec9-Kj6qA2 not found or rejected by a filter.', ' WARNING: Device for PV VyUHeG-6MGc-2Y9o-glcH-Nkuv-11YN-hwRUBg not found or rejected by a filter.', ' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': 
'[unknown]'}, {'changed': True, 'end': '2020-06-02 14:49:32.164234', 'stdout': ' Labels on physical volume "/dev/sdf" successfully wiped.', 'cmd': 'pvremove --force --force --yes /dev/sdf', 'rc': 0, 'start': '2020-06-02 14:49:32.134073', 'stderr': ' WARNING: Device for PV MK3qBX-AkmP-Gj0J-Cp47-UlXU-9LL5-FDeBX6 not found or rejected by a filter.\\n WARNING: Device for PV gZT2eh-57Bx-nzel-yDY7-CmkG-043l-hXVZ4M not found or rejected by a filter.\\n WARNING: Device for PV fvV7Ft-SdbD-iHkM-HblU-2rnd-1z5k-3FQ0sQ not found or rejected by a filter.\\n WARNING: Device for PV mh6ya9-6brd-XLu9-24iK-SC04-4Ec9-Kj6qA2 not found or rejected by a filter.\\n WARNING: Device for PV VyUHeG-6MGc-2Y9o-glcH-Nkuv-11YN-hwRUBg not found or rejected by a filter.\\n WARNING: PV /dev/sdf is used by VG vg_hdd.\\n WARNING: Wiping physical volume label from /dev/sdf of volume group "vg_hdd".', 'delta': '0:00:00.030161', 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes /dev/sdf', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'stdout_lines': [' Labels on physical volume "/dev/sdf" successfully wiped.'], 'stderr_lines': [' WARNING: Device for PV MK3qBX-AkmP-Gj0J-Cp47-UlXU-9LL5-FDeBX6 not found or rejected by a filter.', ' WARNING: Device for PV gZT2eh-57Bx-nzel-yDY7-CmkG-043l-hXVZ4M not found or rejected by a filter.', ' WARNING: Device for PV fvV7Ft-SdbD-iHkM-HblU-2rnd-1z5k-3FQ0sQ not found or rejected by a filter.', ' WARNING: Device for PV mh6ya9-6brd-XLu9-24iK-SC04-4Ec9-Kj6qA2 not found or rejected by a filter.', ' WARNING: Device for PV VyUHeG-6MGc-2Y9o-glcH-Nkuv-11YN-hwRUBg not found or rejected by a filter.', ' WARNING: PV /dev/sdf is used by VG vg_hdd.', ' WARNING: Wiping physical volume label from /dev/sdf of volume group "vg_hdd".'], '_ansible_no_log': False, 'failed': False, 'item': '/dev/sdf', 'ansible_loop_var': 'item', '_ansible_item_label': '/dev/sdf'}, {'changed': True, 'end': '2020-06-02 14:49:32.349960', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.025783', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2020-06-02 14:49:32.324177', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}], 'changed': True, 'msg': 'All items completed'}}, 'Traceback (most recent call last)': 'File "/home/teuthworker/src/git.ceph.com_git_ceph-cm-ansible_master/callback_plugins/failure_log.py", line 44, in log_failure log.error(yaml.safe_dump(failure)) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/__init__.py", line 306, in safe_dump return dump_all([data], stream, Dumper=SafeDumper, **kwds) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/__init__.py", line 278, in dump_all dumper.represent(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 
27, in represent node = self.represent_data(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 207, in represent_dict return self.represent_mapping(\'tag:yaml.org,2002:map\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 118, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 207, in represent_dict return self.represent_mapping(\'tag:yaml.org,2002:map\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 118, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 199, in represent_list return self.represent_sequence(\'tag:yaml.org,2002:seq\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 92, in represent_sequence node_item = self.represent_data(item) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 207, in represent_dict return self.represent_mapping(\'tag:yaml.org,2002:map\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 118, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 58, in represent_data node = self.yaml_representers[None](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 231, in represent_undefined raise RepresenterError("cannot represent an object", data)', 'yaml.representer.RepresenterError': "('cannot represent an object', '[unknown]')"} |
pass | 5109663 | 2020-06-01 05:55:41 | 2020-06-01 15:12:20 | 2020-06-01 19:04:25 | 3:52:05 | 2:27:26 | 1:24:39 | mira | py2 | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/centos_latest python_versions/python_2 tasks/ceph-admin-commands} | 4 | |
fail | 5101999 | 2020-05-29 05:55:55 | 2020-05-29 06:49:38 | 2020-05-29 07:03:37 | 0:13:59 | 0:04:01 | 0:09:58 | mira | py2 | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk config_options/cephdeploy_conf distros/ubuntu_16.04 objectstore/filestore-xfs python_versions/python_2 tasks/ceph-admin-commands} | 2 | |
Failure Reason: Command failed on mira063 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
pass | 5101970 | 2020-05-29 05:55:28 | 2020-05-29 05:55:31 | 2020-05-29 06:47:32 | 0:52:01 | 0:22:25 | 0:29:36 | mira | py2 | centos | 7.4 | ceph-deploy/ceph-volume/{cluster/4node config/ceph_volume_dmcrypt_off distros/centos_latest tasks/rbd_import_export} | 4 | |
fail | 5094202 | 2020-05-27 05:10:28 | 2020-05-27 05:10:44 | 2020-05-27 05:48:44 | 0:38:00 | 0:14:11 | 0:23:49 | mira | py2 | centos | 7.4 | ceph-disk/basic/{distros/centos_latest tasks/ceph-disk} | 2 | |
Failure Reason: Command failed (workunit test ceph-disk/ceph-disk.sh) on mira063 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=5b9a0fcc4e3d178d3f3597e5a5dff2b061c58de5 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-disk/ceph-disk.sh'
pass | 5092969 | 2020-05-26 16:10:03 | 2020-05-26 17:32:17 | 2020-05-26 18:00:16 | 0:27:59 | 0:15:13 | 0:12:46 | mira | py2 | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 | |
pass | 5092960 | 2020-05-26 16:09:55 | 2020-05-26 17:03:50 | 2020-05-26 17:33:50 | 0:30:00 | 0:20:37 | 0:09:23 | mira | py2 | centos | 7.8 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/rbd_import_export.yaml} | 4 | |
pass | 5092953 | 2020-05-26 16:09:48 | 2020-05-26 16:36:03 | 2020-05-26 17:04:03 | 0:28:00 | 0:18:52 | 0:09:08 | mira | py2 | centos | 7.8 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4 |