Name | Machine Type | Up | Locked | Locked Since | Locked By | OS Type | OS Version | Arch | Description
---|---|---|---|---|---|---|---|---|---
mira119.front.sepia.ceph.com | mira | True | True | 2023-06-12 16:44:49.985858 | amathuri@teuthology | | | x86_64 | None
Status | Job ID | Links | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
fail | 5115573 | | 2020-06-03 05:10:28 | 2020-06-03 05:10:33 | 2020-06-03 05:54:33 | 0:44:00 | 0:30:02 | 0:13:58 | mira | py2 | rhel | 7.5 | ceph-disk/basic/{distros/rhel_latest tasks/ceph-disk} | 2
Failure Reason: Command failed (workunit test ceph-disk/ceph-disk.sh) on mira119 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=5b9a0fcc4e3d178d3f3597e5a5dff2b061c58de5 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-disk/ceph-disk.sh'
pass | 5112599 | | 2020-06-02 14:42:48 | 2020-06-02 17:03:21 | 2020-06-02 17:19:20 | 0:15:59 | 0:09:58 | 0:06:01 | mira | master | | | rgw/multifs/{clusters/fixed-2.yaml frontend/civetweb.yaml objectstore/bluestore-bitmap.yaml overrides.yaml rgw_pool_type/ec.yaml tasks/rgw_ragweed.yaml} | 2
pass | 5112564 | | 2020-06-02 14:42:13 | 2020-06-02 15:44:11 | 2020-06-02 16:14:11 | 0:30:00 | 0:08:49 | 0:21:11 | mira | master | | | rgw/multifs/{clusters/fixed-2.yaml frontend/civetweb.yaml objectstore/bluestore-bitmap.yaml overrides.yaml rgw_pool_type/ec-profile.yaml tasks/rgw_ragweed.yaml} | 2
pass | 5112560 | | 2020-06-02 14:42:10 | 2020-06-02 15:34:11 | 2020-06-02 16:32:11 | 0:58:00 | 0:10:13 | 0:47:47 | mira | master | ubuntu | | rgw/multifs/{clusters/fixed-2.yaml frontend/civetweb.yaml objectstore/filestore-xfs.yaml overrides.yaml rgw_pool_type/replicated.yaml tasks/rgw_multipart_upload.yaml} | 2
fail | 5112554 | | 2020-06-02 14:42:04 | 2020-06-02 15:24:11 | 2020-06-02 16:00:11 | 0:36:00 | 0:09:18 | 0:26:42 | mira | master | ubuntu | | rgw/multifs/{clusters/fixed-2.yaml frontend/civetweb.yaml objectstore/filestore-xfs.yaml overrides.yaml rgw_pool_type/ec-profile.yaml tasks/rgw_user_quota.yaml} | 2
Failure Reason: "2020-06-02T15:56:45.160576+0000 mon.a (mon.0) 232 : cluster [WRN] Health check failed: Degraded data redundancy: 19/1358 objects degraded (1.399%), 7 pgs degraded (PG_DEGRADED)" in cluster log
pass | 5112551 | | 2020-06-02 14:42:01 | 2020-06-02 15:16:25 | 2020-06-02 15:44:25 | 0:28:00 | 0:16:35 | 0:11:25 | mira | master | | | rgw/multifs/{clusters/fixed-2.yaml frontend/civetweb.yaml objectstore/bluestore-bitmap.yaml overrides.yaml rgw_pool_type/replicated.yaml tasks/rgw_s3tests.yaml} | 2
pass | 5112543 | | 2020-06-02 14:41:53 | 2020-06-02 15:04:55 | 2020-06-02 15:22:55 | 0:18:00 | 0:10:16 | 0:07:44 | mira | master | ubuntu | | rgw/multifs/{clusters/fixed-2.yaml frontend/civetweb.yaml objectstore/bluestore-bitmap.yaml overrides.yaml rgw_pool_type/ec-profile.yaml tasks/rgw_multipart_upload.yaml} | 2
dead | 5112537 | | 2020-06-02 14:41:46 | 2020-06-02 14:56:07 | 2020-06-02 15:06:06 | 0:09:59 | 0:02:00 | 0:07:59 | mira | master | ubuntu | | rgw/multifs/{clusters/fixed-2.yaml frontend/civetweb.yaml objectstore/bluestore-bitmap.yaml overrides.yaml rgw_pool_type/ec.yaml tasks/rgw_user_quota.yaml} | 2
Failure Reason:
{'Failure object was': {'mira114.front.sepia.ceph.com': {'results': [{'changed': True, 'end': '2020-06-02 15:04:06.917343', 'stdout': ' Labels on physical volume "/dev/sdb" successfully wiped.', 'cmd': 'pvremove --force --force --yes /dev/sdb', 'rc': 0, 'start': '2020-06-02 15:04:06.884654', 'stderr': ' WARNING: Device for PV KK9TiO-nbAe-IHnv-JPFJ-e3zS-oYDs-8j29SU not found or rejected by a filter.\\n WARNING: Device for PV y9MZw1-iBav-9jWK-bo1q-ZBNm-ZyRq-GuGFbw not found or rejected by a filter.\\n WARNING: Device for PV UYOQbp-jMOL-002t-Yx3v-N4P5-XcOf-FxAkTe not found or rejected by a filter.\\n WARNING: Device for PV cNRJHB-Z78L-IqTf-EOzk-8SsX-lg3u-8NhKNK not found or rejected by a filter.\\n WARNING: Device for PV 1doS33-AEnt-fmfR-edml-OAOe-yF5q-zHbc0R not found or rejected by a filter.\\n WARNING: PV /dev/sdb is used by VG vg_hdd.\\n WARNING: Wiping physical volume label from /dev/sdb of volume group "vg_hdd".', 'delta': '0:00:00.032689', 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes /dev/sdb', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'stdout_lines': [' Labels on physical volume "/dev/sdb" successfully wiped.'], 'stderr_lines': [' WARNING: Device for PV KK9TiO-nbAe-IHnv-JPFJ-e3zS-oYDs-8j29SU not found or rejected by a filter.', ' WARNING: Device for PV y9MZw1-iBav-9jWK-bo1q-ZBNm-ZyRq-GuGFbw not found or rejected by a filter.', ' WARNING: Device for PV UYOQbp-jMOL-002t-Yx3v-N4P5-XcOf-FxAkTe not found or rejected by a filter.', ' WARNING: Device for PV cNRJHB-Z78L-IqTf-EOzk-8SsX-lg3u-8NhKNK not found or rejected by a filter.', ' WARNING: Device for PV 1doS33-AEnt-fmfR-edml-OAOe-yF5q-zHbc0R not found or rejected by a filter.', ' WARNING: PV /dev/sdb is used by VG vg_hdd.', ' WARNING: Wiping physical volume label from /dev/sdb of volume group "vg_hdd".'], '_ansible_no_log': 
False, 'failed': False, 'item': '/dev/sdb', 'ansible_loop_var': 'item', '_ansible_item_label': '/dev/sdb'}, {'changed': True, 'end': '2020-06-02 15:04:07.104503', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.026684', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2020-06-02 15:04:07.077819', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2020-06-02 15:04:07.285177', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.027763', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2020-06-02 15:04:07.257414', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2020-06-02 15:04:07.457979', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.026142', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes 
[unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2020-06-02 15:04:07.431837', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2020-06-02 15:04:07.636242', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.026770', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2020-06-02 15:04:07.609472', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}, {'changed': True, 'end': '2020-06-02 15:04:07.813978', 'stdout': '', 'cmd': 'pvremove --force --force --yes [unknown]', 'failed': True, 'delta': '0:00:00.026778', 'stderr': ' Device [unknown] not found.', 'rc': 5, 'invocation': {'module_args': {'creates': 'None', 'executable': 'None', '_uses_shell': True, 'strip_empty_ends': True, '_raw_params': 'pvremove --force --force --yes [unknown]', 'removes': 'None', 'argv': 'None', 'warn': True, 'chdir': 'None', 'stdin_add_newline': True, 'stdin': 'None'}}, 'start': '2020-06-02 15:04:07.787200', 'msg': 'non-zero return code', 'stdout_lines': [], 'stderr_lines': [' Device [unknown] not found.'], '_ansible_no_log': False, 'item': '[unknown]', 'ansible_loop_var': 'item', '_ansible_item_label': '[unknown]'}], 'changed': True, 'msg': 'All items completed'}}, 'Traceback (most recent call last)': 'File 
"/home/teuthworker/src/git.ceph.com_git_ceph-cm-ansible_master/callback_plugins/failure_log.py", line 44, in log_failure log.error(yaml.safe_dump(failure)) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/__init__.py", line 306, in safe_dump return dump_all([data], stream, Dumper=SafeDumper, **kwds) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/__init__.py", line 278, in dump_all dumper.represent(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 27, in represent node = self.represent_data(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 207, in represent_dict return self.represent_mapping(\'tag:yaml.org,2002:map\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 118, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 207, in represent_dict return self.represent_mapping(\'tag:yaml.org,2002:map\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 118, in represent_mapping node_value = self.represent_data(item_value) File 
"/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 199, in represent_list return self.represent_sequence(\'tag:yaml.org,2002:seq\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 92, in represent_sequence node_item = self.represent_data(item) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 48, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 207, in represent_dict return self.represent_mapping(\'tag:yaml.org,2002:map\', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 118, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 58, in represent_data node = self.yaml_representers[None](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/lib/python3.6/site-packages/yaml/representer.py", line 231, in represent_undefined raise RepresenterError("cannot represent an object", data)', 'yaml.representer.RepresenterError': "('cannot represent an object', '/dev/sdb')"} |
fail | 5112517 | | 2020-06-02 14:41:27 | 2020-06-02 14:41:52 | 2020-06-02 14:55:51 | 0:13:59 | 0:08:04 | 0:05:55 | mira | master | | | rgw/crypt/{0-cluster/fixed-1.yaml 1-ceph-install/install.yaml 2-kms/barbican.yaml 3-rgw/rgw.yaml 4-tests/{s3tests.yaml}} | 1
Failure Reason: Command failed on mira119 with status 1: 'cd /home/ubuntu/cephtest/keystone && source /home/ubuntu/cephtest/tox-venv/bin/activate && tox -e venv --notest'
pass | 5109663 | | 2020-06-01 05:55:41 | 2020-06-01 15:12:20 | 2020-06-01 19:04:25 | 3:52:05 | 2:27:26 | 1:24:39 | mira | py2 | centos | 7.8 | ceph-deploy/{cluster/4node config/ceph_volume_bluestore_dmcrypt distros/centos_latest python_versions/python_2 tasks/ceph-admin-commands} | 4
fail | 5102004 | | 2020-05-29 05:55:59 | 2020-05-29 06:59:42 | 2020-05-29 07:21:41 | 0:21:59 | 0:07:44 | 0:14:15 | mira | py2 | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk config_options/cephdeploy_conf distros/centos_7.4 objectstore/bluestore-bitmap python_versions/python_2 tasks/ceph-admin-commands} | 2
Failure Reason: Command failed on mira045 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 5101993 | | 2020-05-29 05:55:50 | 2020-05-29 06:37:40 | 2020-05-29 06:59:39 | 0:21:59 | 0:07:56 | 0:14:03 | mira | py2 | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/disable_diff_journal_disk config_options/cephdeploy_conf distros/centos_7.4 objectstore/filestore-xfs python_versions/python_2 tasks/ceph-admin-commands} | 2
Failure Reason: Command failed on mira045 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 5101986 | | 2020-05-29 05:55:43 | 2020-05-29 06:23:33 | 2020-05-29 06:37:33 | 0:14:00 | 0:04:10 | 0:09:50 | mira | py2 | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk config_options/cephdeploy_conf distros/ubuntu_16.04 objectstore/bluestore-bitmap python_versions/python_2 tasks/ceph-admin-commands} | 2
Failure Reason: Command failed on mira045 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
fail | 5101969 | | 2020-05-29 05:55:27 | 2020-05-29 05:55:31 | 2020-05-29 06:23:31 | 0:28:00 | 0:07:55 | 0:20:05 | mira | py2 | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt config_options/cephdeploy_conf distros/centos_7.4 objectstore/bluestore-bitmap python_versions/python_2 tasks/ceph-admin-commands} | 2
Failure Reason: Command failed on mira045 with status 5: 'sudo stop ceph-all || sudo service ceph stop || sudo systemctl stop ceph.target'
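The four ceph-deploy failures above all report status 5 from the same stop command. In a shell `a || b || c` chain each fallback runs only if the previous command failed, and the chain's exit status is that of the last command executed; status 5 therefore means all three stop attempts failed, with the final `systemctl stop ceph.target` itself exiting 5. A minimal sketch of that semantics, with stand-in commands replacing the real stop attempts:

```python
import subprocess

# Stand-ins for the three stop attempts: the first two fail, so the shell
# runs each fallback in turn, and the chain's exit status is whatever the
# last command returns (5 here, mirroring the teuthology failures above).
chain = "false || false || sh -c 'exit 5'"
rc = subprocess.run(chain, shell=True).returncode
print(rc)  # 5
```

Had any earlier command in the chain succeeded, the later fallbacks would not have run and the chain would have exited 0.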
pass | 5094203 | | 2020-05-27 05:10:28 | 2020-05-27 05:10:44 | 2020-05-27 05:40:44 | 0:30:00 | 0:18:29 | 0:11:31 | mira | py2 | rhel | 7.5 | ceph-disk/basic/{distros/rhel_latest tasks/ceph-detect-init} | 1
pass | 5092964 | | 2020-05-26 16:09:58 | 2020-05-26 17:05:59 | 2020-05-26 18:06:00 | 1:00:01 | 0:28:47 | 0:31:14 | mira | py2 | centos | 7.8 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/centos_latest.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 4
pass | 5092943 | | 2020-05-26 16:09:38 | 2020-05-26 16:09:40 | 2020-05-26 17:15:40 | 1:06:00 | 0:31:49 | 0:34:11 | mira | py2 | centos | 7.8 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/centos_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4
pass | 5091007 | | 2020-05-25 05:55:43 | 2020-05-25 06:40:08 | 2020-05-25 16:12:23 | 9:32:15 | 0:14:51 | 9:17:24 | mira | py2 | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4
fail | 5091002 | | 2020-05-25 05:55:39 | 2020-05-25 06:24:06 | 2020-05-25 07:00:05 | 0:35:59 | 0:11:13 | 0:24:46 | mira | py2 | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 4
Failure Reason: ceph-deploy: Failed to create osds
pass | 5090996 | | 2020-05-25 05:55:34 | 2020-05-25 05:55:46 | 2020-05-25 06:31:47 | 0:36:01 | 0:15:32 | 0:20:29 | mira | py2 | ubuntu | 18.04 | ceph-deploy/{cluster/4node.yaml config/ceph_volume_bluestore_dmcrypt.yaml distros/ubuntu_latest.yaml python_versions/python_2.yaml tasks/rbd_import_export.yaml} | 4