Name | Machine Type | Up | Locked | Locked Since | Locked By | OS Type | OS Version | Arch | Description |
---|---|---|---|---|---|---|---|---|---|
vpm003.front.sepia.ceph.com | vps | True | False | | | ubuntu | 16.04 | x86_64 | None |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes | Failure Reason |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
pass | 2173955 | 2018-02-08 23:07:08 | 2018-02-08 23:07:36 | 2018-02-08 23:35:36 | 0:28:00 | 0:21:44 | 0:06:16 | vps | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/bluestore.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2117830 | 2018-01-27 05:55:40 | 2018-01-27 05:55:41 | 2018-01-27 06:23:41 | 0:28:00 | 0:20:58 | 0:07:02 | vps | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | ceph-deploy: Failed to zap osds |
pass | 2116747 | 2018-01-27 03:59:53 | 2018-01-27 03:59:54 | 2018-01-27 04:39:54 | 0:40:00 | 0:18:22 | 0:21:38 | vps | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/disable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2116735 | 2018-01-27 03:59:45 | 2018-01-27 03:59:52 | 2018-01-27 04:15:51 | 0:15:59 | 0:10:31 | 0:05:28 | vps | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/bluestore.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | Command failed on vpm003 with status 5: 'sudo stop ceph-all \|\| sudo service ceph stop \|\| sudo systemctl stop ceph.target' |
fail | 2109007 | 2018-01-25 05:55:37 | 2018-01-25 05:55:45 | 2018-01-25 06:19:44 | 0:23:59 | 0:19:22 | 0:04:37 | vps | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | ceph-deploy: Failed to zap osds |
fail | 2103403 | 2018-01-23 05:55:45 | 2018-01-23 05:55:53 | 2018-01-23 06:23:53 | 0:28:00 | 0:20:34 | 0:07:26 | vps | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | ceph-deploy: Failed to zap osds |
pass | 2098358 | 2018-01-22 03:59:36 | 2018-01-22 03:59:37 | 2018-01-22 04:25:37 | 0:26:00 | 0:20:45 | 0:05:15 | vps | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2094714 | 2018-01-20 05:55:49 | 2018-01-20 06:06:18 | 2018-01-20 06:32:15 | 0:25:57 | 0:16:54 | 0:09:03 | vps | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | ceph-deploy: Failed to zap osds |
fail | 2094712 | 2018-01-20 05:55:47 | 2018-01-20 05:55:48 | 2018-01-20 06:09:48 | 0:14:00 | 0:08:51 | 0:05:09 | vps | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | Command failed on vpm003 with status 1: 'sudo stop ceph-all \|\| sudo service ceph stop \|\| sudo systemctl stop ceph.target' |
pass | 2093603 | 2018-01-20 03:59:50 | 2018-01-20 04:00:01 | 2018-01-20 04:22:00 | 0:21:59 | 0:16:16 | 0:05:43 | vps | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2086284 | 2018-01-18 05:55:57 | 2018-01-18 06:33:55 | 2018-01-18 07:17:55 | 0:44:00 | 0:08:32 | 0:35:28 | vps | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/filestore-xfs.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | Command failed on vpm003 with status 1: 'sudo stop ceph-all \|\| sudo service ceph stop \|\| sudo systemctl stop ceph.target' |
fail | 2086272 | 2018-01-18 05:55:50 | 2018-01-18 06:19:48 | 2018-01-18 06:49:47 | 0:29:59 | 0:09:02 | 0:20:57 | vps | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | Command failed on vpm003 with status 1: 'sudo stop ceph-all \|\| sudo service ceph stop \|\| sudo systemctl stop ceph.target' |
fail | 2086256 | 2018-01-18 05:55:39 | 2018-01-18 05:55:44 | 2018-01-18 06:19:43 | 0:23:59 | 0:19:02 | 0:04:57 | vps | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | ceph-deploy: Failed to zap osds |
fail | 2086249 | 2018-01-18 05:55:35 | 2018-01-18 05:55:45 | 2018-01-18 06:29:44 | 0:33:59 | | | vps | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | Could not reconnect to ubuntu@vpm135.front.sepia.ceph.com |
fail | 2086239 | 2018-01-18 05:55:29 | 2018-01-18 05:55:46 | 2018-01-18 07:03:47 | 1:08:01 | 0:08:49 | 0:59:12 | vps | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/disable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/filestore-xfs.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | Command failed on vpm003 with status 1: 'sudo stop ceph-all \|\| sudo service ceph stop \|\| sudo systemctl stop ceph.target' |
dead | 2084234 | 2018-01-17 23:21:52 | 2018-01-17 23:21:57 | 2018-01-17 23:41:56 | 0:19:59 | | | vps | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
dead | 2043132 | 2018-01-08 18:03:49 | 2018-01-08 18:04:05 | 2018-01-08 18:36:03 | 0:31:58 | 0:25:43 | 0:06:15 | vps | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | RepresenterError in yaml/representer.py (represent_undefined): cannot represent an object: libboost-thread1.58.0; vpm003.front.sepia.ceph.com unreachable: "Failed to connect to the host via ssh: ssh: connect to host vpm003.front.sepia.ceph.com port 22: No route to host" |
dead | 2040657 | 2018-01-08 02:25:36 | 2018-01-08 02:25:39 | 2018-01-08 14:28:09 | 12:02:30 | | | vps | master | centos | 7.4 | upgrade:luminous-x/parallel/{0-cluster/{openstack.yaml start.yaml} 1-ceph-install/luminous.yaml 2-workload/{blogbench.yaml ec-rados-default.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 5-final-workload/{blogbench.yaml rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/centos_latest.yaml objectstore/bluestore.yaml} | 3 | |
pass | 2040288 | 2018-01-08 01:15:33 | 2018-01-08 01:15:44 | 2018-01-08 01:39:44 | 0:24:00 | 0:19:07 | 0:04:53 | vps | master | ubuntu | 14.04 | upgrade:hammer-x/v0-94-4-stop/{distros/centos_latest.yaml distros/ubuntu_14.04.yaml ignore.yaml v0-94-4-stop.yaml} | 2 | |
dead | 2038691 | 2018-01-07 04:23:54 | 2018-01-07 10:37:10 | 2018-01-07 18:27:20 | 7:50:10 | 3:51:32 | 3:58:38 | vps | master | ubuntu | 16.04 | upgrade:jewel-x/parallel/{0-cluster/{openstack.yaml start.yaml} 1-jewel-install/jewel.yaml 1.5-final-scrub.yaml 2-workload/test_rbd_python.yaml 3-upgrade-sequence/upgrade-all.yaml 4-luminous.yaml 5-workload.yaml 6-luminous-with-mgr.yaml 6.5-crush-compat.yaml 7-final-workload/{blogbench.yaml rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} 8-jewel-workload.yaml distros/ubuntu_latest.yaml} | 4 | Command failed (workunit test rados/test-upgrade-v11.0.0.sh) on vpm003 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && cd -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=jewel TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="1" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.1 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.1 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.1/qa/workunits/rados/test-upgrade-v11.0.0.sh' |