Name:          mira018.front.sepia.ceph.com
Machine Type:  mira
Up:            True
Locked:        True
Locked Since:  2020-06-29 21:10:16.119555
Locked By:     kyrylo.shatskyy@suse.com
OS Type:       centos
OS Version:    7
Arch:          x86_64
Description:   None
Status  Job ID  Posted  Started  Updated  Runtime  Duration  In Waiting  Machine  Teuthology Branch  OS Type  OS Version  Description  Nodes
fail 5158531 2020-06-17 19:37:40 2020-06-17 19:37:41 2020-06-17 20:03:40 0:25:59 0:07:41 0:18:18 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_dmcrypt_off distros/centos_latest python_versions/python_2 tasks/rbd_import_export} 4
Failure Reason:

Command failed on mira117 with status 2: 'sudo tar cz -f - -C /var/lib/ceph/mon -- . > /tmp/tmp.rrxkRREkMp'
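The failing step archives the monitor store for post-mortem debugging: `tar cz -f - -C /var/lib/ceph/mon -- . > /tmp/...` streams a gzipped tarball of the mon directory to a temp file, and any file disappearing mid-read makes tar exit 2. A minimal sketch of the same operation using Python's stdlib `tarfile` (the function name and paths are ours, not teuthology's implementation):

```python
import tarfile

def archive_mon_store(src_dir: str, dest_path: str) -> None:
    """Create a gzipped tarball of src_dir's contents, equivalent to
    `tar cz -f - -C src_dir -- . > dest_path`."""
    with tarfile.open(dest_path, "w:gz") as tar:
        # arcname="." mirrors tar's -C behaviour: members are stored
        # relative to the store directory, not as absolute paths.
        tar.add(src_dir, arcname=".")
```
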

fail 5153642 2020-06-16 13:26:38 2020-06-16 13:26:42 2020-06-16 13:48:42 0:22:00 0:04:03 0:17:57 mira master ubuntu 20.04 upgrade:nautilus-x:stress-split/{0-cluster/{openstack.yaml start.yaml} 1-ceph-install/nautilus.yaml 1.1-pg-log-overrides/normal_pg_log.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-workload/{radosbench.yaml rbd-cls.yaml rbd-import-export.yaml rbd_api.yaml readwrite.yaml rgw_ragweed_prepare.yaml snaps-few-objects.yaml} 5-finish-upgrade.yaml 6-octopus.yaml 7-msgr2.yaml 8-final-workload/{rbd-python.yaml snaps-many-objects.yaml} objectstore/filestore-xfs.yaml thrashosds-health.yaml ubuntu_latest.yaml} 5
Failure Reason:

Failed to fetch package version from https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=default&distros=ubuntu%2F20.04%2Fx86_64&ref=nautilus
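This upgrade job died before installing anything: shaman had no "ready" nautilus build matching Ubuntu 20.04, likely because nautilus packages were not produced for that distro. A hedged sketch of how such a search URL is assembled; the helper name is ours, and only the endpoint and query parameters come from the failure message above:

```python
from urllib.parse import urlencode

SHAMAN_SEARCH = "https://shaman.ceph.com/api/search/"

def shaman_search_url(ref: str, distro: str, version: str,
                      arch: str = "x86_64", flavor: str = "default") -> str:
    # Matches the query string seen in the failure reason; note that
    # "distros" packs distro/version/arch into one slash-separated value,
    # which urlencode percent-encodes as %2F.
    params = {
        "status": "ready",
        "project": "ceph",
        "flavor": flavor,
        "distros": f"{distro}/{version}/{arch}",
        "ref": ref,
    }
    return SHAMAN_SEARCH + "?" + urlencode(params)
```
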

fail 5129669 2020-06-08 17:02:08 2020-06-08 17:54:09 2020-06-08 18:08:08 0:13:59 0:04:05 0:09:54 mira py2 ubuntu 16.04 ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk config_options/cephdeploy_conf distros/ubuntu_16.04 objectstore/filestore-xfs python_versions/python_2 tasks/ceph-admin-commands} 2
Failure Reason:

Command failed on mira018 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
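The repeated status-5 failures on mira018 come from this fallback chain: `stop ceph-all` (upstart), then `service ceph stop` (sysvinit), then `systemctl stop ceph.target` (systemd) are tried in order, and the overall command fails with the last exit code only when all three fail; exit 5 from `systemctl` commonly means the unit is not loaded. A sketch of the same `a || b || c` pattern, with an injectable runner for illustration (the real teuthology task shells out directly):

```python
import subprocess

# Tried in order: upstart, sysvinit, systemd; mirrors the shell chain
# `sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target`.
STOP_COMMANDS = [
    ["sudo", "stop", "ceph-all"],
    ["sudo", "service", "ceph", "stop"],
    ["sudo", "systemctl", "stop", "ceph.target"],
]

def stop_ceph(run=subprocess.call) -> int:
    rc = 0
    for cmd in STOP_COMMANDS:
        rc = run(cmd)
        if rc == 0:
            return 0  # any variant succeeding short-circuits the chain
    return rc  # e.g. 5 when the final systemctl call fails too
```
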

fail 5129662 2020-06-08 17:02:02 2020-06-08 17:40:08 2020-06-08 17:54:07 0:13:59 0:04:10 0:09:49 mira py2 ubuntu 16.04 ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt config_options/cephdeploy_conf distros/ubuntu_16.04 objectstore/bluestore-bitmap python_versions/python_3 tasks/ceph-admin-commands} 2
Failure Reason:

Command failed on mira018 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'

fail 5129649 2020-06-08 17:01:50 2020-06-08 17:15:50 2020-06-08 17:31:49 0:15:59 0:04:52 0:11:07 mira py2 ubuntu 16.04 ceph-deploy/ceph-volume/{cluster/4node config/ceph_volume_filestore distros/ubuntu_latest tasks/rbd_import_export} 4
Failure Reason:

Command failed on mira041 with status 1: 'cd /home/ubuntu/cephtest/ceph-deploy && ./bootstrap'

fail 5129647 2020-06-08 17:01:48 2020-06-08 17:01:50 2020-06-08 17:15:49 0:13:59 0:04:14 0:09:45 mira py2 ubuntu 16.04 ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk config_options/cephdeploy_conf distros/ubuntu_16.04 objectstore/filestore-xfs python_versions/python_2 tasks/ceph-admin-commands} 2
Failure Reason:

Command failed on mira018 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'

dead 5128372 2020-06-08 05:55:49 2020-06-08 07:10:22 2020-06-08 07:54:23 0:44:01 0:04:08 0:39:53 mira master ubuntu 18.04 ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/ubuntu_latest python_versions/python_3 tasks/rbd_import_export} 4
Failure Reason:

{'Failure object was': {'mira053.front.sepia.ceph.com': {'results': [{'changed': True, 'end': '2020-06-08 07:52:15.565763', 'stdout': ' Labels on physical volume "/dev/sdb" successfully wiped.', 'cmd': 'pvremove --force --force --yes /dev/sdb', 'rc': 0, 'start': '2020-06-08 07:52:15.537726', 'stderr': ' WARNING: Device for PV ndFbuc-mF1t-ifkh-zRWz-CJOw-3eZY-hVZ5ej not found or rejected by a filter.\\n WARNING: Device for PV ejCzT6-vp0S-dewn-em2n-xdwY-DnFi-ZathMY not found or rejected by a filter.\\n WARNING: Device for PV 81XTZP-eGvS-Dk8H-JGy3-5Lqf-B0Fm-lwW1ou not found or rejected by a filter.\\n WARNING: Device for PV UdTrZs-GK7M-Lwr8-J0mM-xfZp-pWuD-llIdkK not found or rejected by a filter.\\n WARNING: Device for PV xkML0o-uZze-f739-envD-J4vB-fZv3-Lbv6vh not found or rejected by a filter.\\n WARNING: Device for PV HaebLh-by8C-9EUs-aqYW-1SbN-kyN1-GUTiTQ not found or rejected by a filter.\\n WARNING: PV /dev/sdb is used by VG vg_hdd.\\n WARNING: Wiping physical volume label from /dev/sdb of volume group "vg_hdd".', 'delta': '0:00:00.028037', 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes /dev/sdb', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'stdout_lines': [' Labels on physical volume "/dev/sdb" successfully wiped.'], 'stderr_lines': [' WARNING: Device for PV ndFbuc-mF1t-ifkh-zRWz-CJOw-3eZY-hVZ5ej not found or rejected by a filter.', ' WARNING: Device for PV ejCzT6-vp0S-dewn-em2n-xdwY-DnFi-ZathMY not found or rejected by a filter.', ' WARNING: Device for PV 81XTZP-eGvS-Dk8H-JGy3-5Lqf-B0Fm-lwW1ou not found or rejected by a filter.', ' WARNING: Device for PV UdTrZs-GK7M-Lwr8-J0mM-xfZp-pWuD-llIdkK not found or rejected by a filter.', ' WARNING: Device for PV xkML0o-uZze-f739-envD-J4vB-fZv3-Lbv6vh not found or rejected by a filter.', ' WARNING: Device for PV 
HaebLh-by8C-9EUs-aqYW-1SbN-kyN1-GUTiTQ not found or rejected by a filter.', ' WARNING: PV /dev/sdb is used by VG vg_hdd.', ' WARNING: Wiping physical volume label from /dev/sdb of volume group "vg_hdd".'], '_ansible_no_log': False, 'failed': False, 'item': '/dev/sdb', 'ansible_loop_var': 'item', '_ansible_item_label': '/dev/sdb'}, {'changed': True, 'end': '2020-06-08 07:52:15.752831', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.025124', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2020-06-08 07:52:15.727707', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2020-06-08 07:52:15.929934', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.024921', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2020-06-08 07:52:15.905013', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2020-06-08 07:52:16.109072', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': 
'0:00:00.027470', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2020-06-08 07:52:16.081602', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2020-06-08 07:52:16.295265', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.026319', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2020-06-08 07:52:16.268946', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2020-06-08 07:52:16.484218', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.023687', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2020-06-08 07:52:16.460531', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] 
not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2020-06-08 07:52:16.681372', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.024396', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2020-06-08 07:52:16.656976', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}], 'changed': True, 'msg': 'All items completed'}}, 'Traceback (most recent call last)': 'File "/home/teuthworker/src/git.ceph.com_git_ceph-cm-ansible_master/callback_plugins/failure_log.py", line 44, in log_failure log.error(yaml.safe_dump(failure)) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/__init__.py", line 306, in safe_dump return dump_all([data], stream, Dumper=SafeDumper, **kwds) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/__init__.py", line 278, in dump_all dumper.represent(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 27, in represent node = self.represent_data(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File 
"/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 207, in represent_dict return self.represent_mapping(\'tag:yaml.org,2002:map\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 118, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 207, in represent_dict return self.represent_mapping(\'tag:yaml.org,2002:map\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 118, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 199, in represent_list return self.represent_sequence(\'tag:yaml.org,2002:seq\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 92, in represent_sequence node_item = self.represent_data(item) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 207, in 
represent_dict return self.represent_mapping(\'tag:yaml.org,2002:map\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 118, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 58, in represent_data node = self.yaml_representers[None](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 231, in represent_undefined raise RepresenterError("cannot represent an object", data)', 'yaml.representer.RepresenterError': "('cannot represent an object', '/dev/sdb')"}

pass 5128361 2020-06-08 05:55:40 2020-06-08 06:29:49 2020-06-08 07:43:50 1:14:01 0:38:38 0:35:23 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/centos_latest python_versions/python_3 tasks/rbd_import_export} 4
fail 5128348 2020-06-08 05:55:28 2020-06-08 05:55:29 2020-06-08 06:39:29 0:44:00 0:12:00 0:32:00 mira master centos 7.8 ceph-deploy/{cluster/4node config/ceph_volume_bluestore distros/centos_latest python_versions/python_2 tasks/rbd_import_export} 4
Failure Reason:

Command failed on mira072 with status 2: 'sudo tar cz -f - -C /var/lib/ceph/mon -- . > /tmp/tmp.Fqcdi5nxTV'

fail 5122582 2020-06-06 20:43:32 2020-06-06 21:40:13 2020-06-06 21:52:12 0:11:59 0:05:44 0:06:15 mira master ubuntu rgw:verify/{clusters/fixed-2 frontend/beast msgr-failures/few objectstore/bluestore-bitmap overrides proto/http rgw_pool_type/ec-profile tasks/{0-install cls ragweed s3tests-java s3tests} validater/valgrind} 2
Failure Reason:

No module named 'cStringIO'
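`cStringIO` exists only in Python 2; it was removed in Python 3, where `io.StringIO` (text) and `io.BytesIO` (bytes) replace it, so this batch of rgw:verify jobs is tripping over unported Python 2 code. The standard compatibility shim looks like this (a generic pattern, not the actual teuthology fix):

```python
try:
    # Python 2: C-accelerated StringIO.
    from cStringIO import StringIO
except ImportError:
    # Python 3: cStringIO is gone; use io.StringIO for text
    # (io.BytesIO is the bytes-mode counterpart).
    from io import StringIO

buf = StringIO()
buf.write("radosgw-admin output\n")
```
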

fail 5122573 2020-06-06 20:43:24 2020-06-06 21:25:56 2020-06-06 21:39:55 0:13:59 0:06:47 0:07:12 mira master ubuntu rgw:verify/{clusters/fixed-2 frontend/civetweb msgr-failures/few objectstore/filestore-xfs overrides proto/http rgw_pool_type/ec-profile tasks/{0-install cls ragweed s3tests-java s3tests} validater/lockdep} 2
Failure Reason:

No module named 'cStringIO'

fail 5122565 2020-06-06 20:43:17 2020-06-06 21:11:39 2020-06-06 21:25:39 0:14:00 0:06:58 0:07:02 mira master ubuntu rgw:verify/{clusters/fixed-2 frontend/civetweb msgr-failures/few objectstore/bluestore-bitmap overrides proto/http rgw_pool_type/ec tasks/{0-install cls ragweed s3tests-java s3tests} validater/lockdep} 2
Failure Reason:

No module named 'cStringIO'

fail 5122554 2020-06-06 20:43:07 2020-06-06 20:57:23 2020-06-06 21:11:23 0:14:00 0:05:59 0:08:01 mira master ubuntu rgw:verify/{clusters/fixed-2 frontend/civetweb msgr-failures/few objectstore/bluestore-bitmap overrides proto/https rgw_pool_type/replicated tasks/{0-install cls ragweed s3tests-java s3tests} validater/valgrind} 2
Failure Reason:

No module named 'cStringIO'

fail 5122540 2020-06-06 20:42:54 2020-06-06 20:43:09 2020-06-06 20:57:08 0:13:59 0:06:21 0:07:38 mira master ubuntu rgw:verify/{clusters/fixed-2 frontend/civetweb msgr-failures/few objectstore/filestore-xfs overrides proto/https rgw_pool_type/ec-profile tasks/{0-install cls ragweed s3tests-java s3tests} validater/valgrind} 2
Failure Reason:

No module named 'cStringIO'

fail 5119083 2020-06-05 05:55:52 2020-06-05 06:56:00 2020-06-05 07:26:00 0:30:00 0:08:11 0:21:49 mira py2 centos 7.4 ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk config_options/cephdeploy_conf distros/centos_7.4 objectstore/bluestore-bitmap python_versions/python_2 tasks/ceph-admin-commands} 2
Failure Reason:

Command failed on mira018 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'

fail 5119070 2020-06-05 05:55:40 2020-06-05 06:34:00 2020-06-05 06:55:59 0:21:59 0:08:06 0:13:53 mira py2 centos 7.4 ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk config_options/cephdeploy_conf distros/centos_7.4 objectstore/filestore-xfs python_versions/python_3 tasks/ceph-admin-commands} 2
Failure Reason:

Command failed on mira018 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'

fail 5119061 2020-06-05 05:55:32 2020-06-05 06:13:58 2020-06-05 06:33:58 0:20:00 0:04:03 0:15:57 mira py2 ubuntu 16.04 ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk config_options/cephdeploy_conf distros/ubuntu_16.04 objectstore/bluestore-bitmap python_versions/python_3 tasks/ceph-admin-commands} 2
Failure Reason:

Command failed on mira018 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'

fail 5119052 2020-06-05 05:55:25 2020-06-05 05:55:40 2020-06-05 06:13:39 0:17:59 0:04:07 0:13:52 mira py2 ubuntu 16.04 ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk config_options/cephdeploy_conf distros/ubuntu_16.04 objectstore/filestore-xfs python_versions/python_3 tasks/ceph-admin-commands} 2
Failure Reason:

Command failed on mira018 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'

pass 5112604 2020-06-02 14:42:53 2020-06-02 17:07:32 2020-06-02 17:31:31 0:23:59 0:16:25 0:07:34 mira master rgw/multifs/{clusters/fixed-2.yaml frontend/civetweb.yaml objectstore/filestore-xfs.yaml overrides.yaml rgw_pool_type/replicated.yaml tasks/rgw_s3tests.yaml} 2
fail 5112520 2020-06-02 14:41:30 2020-06-02 14:41:54 2020-06-02 15:33:54 0:52:00 0:44:18 0:07:42 mira master ubuntu rgw/multisite/{clusters.yaml frontend/beast.yaml omap_limits.yaml overrides.yaml realms/three-zone-plus-pubsub.yaml tasks/test_multi.yaml valgrind.yaml} 2
Failure Reason:

"2020-06-02T14:54:07.178335+0000 mon.a (mon.0) 152 : cluster [WRN] Health check failed: Reduced data availability: 5 pgs inactive, 5 pgs peering (PG_AVAILABILITY)" in cluster log