Name:          ovh061.front.sepia.ceph.com
Machine Type:  ovh
Up:            True
Locked:        False
Locked Since:  -
Locked By:     -
OS Type:       -
OS Version:    -
Arch:          x86_64
Description:   None
Status  Job ID   Posted               Started              Updated              Runtime  Duration  In Waiting  Machine  Teuthology Branch  OS Type  OS Version  Description  Nodes
fail 3927762 2019-05-04 21:35:56 2019-05-05 01:01:11 2019-05-05 01:41:11 0:40:00 0:16:02 0:23:58 ovh master rhel 7.5 rgw/hadoop-s3a/{hadoop/v28.yaml s3a-hadoop.yaml supported-random-distro$/{rhel_latest.yaml}} 4
Failure Reason:

Command failed on ovh080 with status 127: 'cd ~/ceph-ansible ; virtualenv --system-site-packages venv ; source venv/bin/activate ; pip install --upgrade pip ; pip install setuptools>=11.3 notario>=0.0.13 netaddr ansible==2.5 ; ANSIBLE_STDOUT_CALLBACK=debug ansible-playbook -vv -i inven.yml site.yml'
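Exit status 127 usually means the shell could not find one of the commands in that chain (most likely virtualenv itself on a freshly provisioned node). A minimal, hedged reconstruction of the same bootstrap sequence is sketched below; note that in the logged command the version constraints are unquoted, so a POSIX shell would treat the ">" in "setuptools>=11.3" as output redirection rather than as part of the pip requirement.

    # Sketch of the ceph-ansible bootstrap the job attempts (paths and playbook
    # name taken from the log above); the missing-command guard is an assumption.
    cd ~/ceph-ansible
    command -v virtualenv >/dev/null || { echo "virtualenv not found" >&2; exit 127; }
    virtualenv --system-site-packages venv
    source venv/bin/activate
    pip install --upgrade pip
    pip install 'setuptools>=11.3' 'notario>=0.0.13' netaddr 'ansible==2.5'   # quoted so >= reaches pip
    ANSIBLE_STDOUT_CALLBACK=debug ansible-playbook -vv -i inven.yml site.yml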

pass 3920799 2019-05-03 05:01:14 2019-05-04 04:46:26 2019-05-04 05:34:26 0:48:00 0:28:45 0:19:15 ovh master ubuntu 16.04 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_ec_snaps.yaml} 3
pass 3920793 2019-05-03 05:01:10 2019-05-04 03:40:09 2019-05-04 04:28:09 0:48:00 0:37:36 0:10:24 ovh master ubuntu 16.04 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} 3
pass 3920788 2019-05-03 05:01:06 2019-05-04 02:59:55 2019-05-04 03:39:55 0:40:00 0:24:19 0:15:41 ovh master ubuntu 16.04 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} 3
pass 3918377 2019-05-02 07:02:27 2019-05-02 15:27:24 2019-05-02 16:19:24 0:52:00 0:24:38 0:27:22 ovh master centos 7.4 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_pjd.yaml} 3
fail 3918369 2019-05-02 07:00:55 2019-05-02 14:13:20 2019-05-02 15:27:20 1:14:00 0:30:13 0:43:47 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_swift.yaml} 3
Failure Reason:

Command failed on ovh061 with status 1: 'cd /home/ubuntu/cephtest/swift && ./bootstrap'
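The one-line failure hides whatever the swift bootstrap script actually printed. A hedged way to capture it is to re-run the bootstrap by hand on the node named in the error, with shell tracing on; the login user and path are taken from the log, everything else is an assumption:

    # Re-run the swift functional-test bootstrap with tracing to see why it
    # exited 1; output is kept in bootstrap.log for inspection.
    ssh ubuntu@ovh061.front.sepia.ceph.com \
        'cd /home/ubuntu/cephtest/swift && bash -x ./bootstrap' 2>&1 | tee bootstrap.log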

pass 3918363 2019-05-02 07:00:50 2019-05-02 12:57:03 2019-05-02 14:13:03 1:16:00 0:32:57 0:43:03 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} 3
pass 3918353 2019-05-02 07:00:43 2019-05-02 11:26:47 2019-05-02 12:56:48 1:30:01 0:48:17 0:41:44 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} 3
pass 3918345 2019-05-02 07:00:36 2019-05-02 09:54:40 2019-05-02 11:26:40 1:32:00 0:48:03 0:43:57 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_fsstress.yaml} 3
pass 3917918 2019-05-02 05:02:36 2019-05-02 07:35:10 2019-05-02 08:35:10 1:00:00 0:30:02 0:29:58 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cls_all.yaml} 3
pass 3917908 2019-05-02 05:02:28 2019-05-02 05:48:17 2019-05-02 06:42:17 0:54:00 0:25:24 0:28:36 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} 3
pass 3917811 2019-05-02 05:00:44 2019-05-04 01:24:22 2019-05-04 02:02:22 0:38:00 0:22:15 0:15:45 ovh master ubuntu 16.04 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} 3
pass 3915146 2019-05-01 07:02:43 2019-05-01 19:36:10 2019-05-01 20:12:10 0:36:00 0:22:28 0:13:32 ovh master centos 7.4 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} 3
fail 3915135 2019-05-01 07:02:37 2019-05-01 18:46:20 2019-05-01 19:28:20 0:42:00 0:23:38 0:18:22 ovh master ubuntu 18.04 smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} 4
Failure Reason:

ceph-deploy: Failed to zap osds
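"ceph-deploy: Failed to zap osds" names neither a host nor a device. A hedged sketch of reproducing the zap step in isolation is shown below; the target host and device are placeholders, not values from this job's log:

    # Reproduce the zap outside the suite; OSD_HOST and DEV are illustrative only.
    OSD_HOST=ovh003.front.sepia.ceph.com
    DEV=/dev/sdb
    ceph-deploy disk zap "$OSD_HOST" "$DEV"
    # Or zap directly on the node, bypassing ceph-deploy:
    ssh "ubuntu@$OSD_HOST" "sudo ceph-volume lvm zap --destroy $DEV"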

fail 3915016 2019-05-01 07:00:52 2019-05-01 16:11:47 2019-05-01 18:37:48 2:26:01 1:39:33 0:46:28 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_s3tests.yaml} 3
Failure Reason:

Command failed (s3 tests against rgw) on ovh095 with status 1: "S3TEST_CONF=/home/ubuntu/cephtest/archive/s3-tests.client.0.conf BOTO_CONFIG=/home/ubuntu/cephtest/boto.cfg REQUESTS_CA_BUNDLE=/etc/pki/tls/certs/ca-bundle.crt /home/ubuntu/cephtest/s3-tests/virtualenv/bin/nosetests -w /home/ubuntu/cephtest/s3-tests -v -a '!fails_on_rgw,!lifecycle_expiration,!fails_strict_rfc2616'"
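The nosetests invocation above runs the whole s3-tests suite, so the summary does not say which case failed. A hedged way to narrow it down is to re-run a single test module with the same environment and attribute filter; the trailing test selector is illustrative, not taken from this log:

    # Re-run one s3-tests module with the environment the job used.
    export S3TEST_CONF=/home/ubuntu/cephtest/archive/s3-tests.client.0.conf
    export BOTO_CONFIG=/home/ubuntu/cephtest/boto.cfg
    export REQUESTS_CA_BUNDLE=/etc/pki/tls/certs/ca-bundle.crt
    /home/ubuntu/cephtest/s3-tests/virtualenv/bin/nosetests \
        -w /home/ubuntu/cephtest/s3-tests -v \
        -a '!fails_on_rgw,!lifecycle_expiration,!fails_strict_rfc2616' \
        s3tests.functional.test_s3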

fail 3915015 2019-05-01 07:00:52 2019-05-01 16:05:30 2019-05-01 16:11:29 0:05:59 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_ec_s3tests.yaml} 3
Failure Reason:

Command failed on ovh083 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.1.0_rc4_ceph_gbc1c00c70b18-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://3.chacra.ceph.com/r/kernel/testing/bc1c00c70b1807a97aae4cbd167a28bd4adb7727/centos/7/flavors/default/x86_64/ --input-file=-'
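Status 127 again points at a missing command rather than a bad URL: the pipeline depends on wget being present on the node. The command builds the download URL by joining --base with the rpm name echoed into --input-file, so an equivalent single request (via curl, which is an assumption about what is installed) would be:

    # Equivalent one-shot download of the test kernel; curl is assumed present.
    rm -f /tmp/kernel.x86_64.rpm
    curl -fSsL -o /tmp/kernel.x86_64.rpm \
        https://3.chacra.ceph.com/r/kernel/testing/bc1c00c70b1807a97aae4cbd167a28bd4adb7727/centos/7/flavors/default/x86_64/kernel-5.1.0_rc4_ceph_gbc1c00c70b18-1.x86_64.rpm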

pass 3915008 2019-05-01 07:00:46 2019-05-01 14:43:12 2019-05-01 16:05:13 1:22:01 0:36:12 0:45:49 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_python.yaml} 3
fail 3915005 2019-05-01 07:00:44 2019-05-01 14:25:01 2019-05-01 14:43:00 0:17:59 0:04:20 0:13:39 ovh master ubuntu 18.04 smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} 4
Failure Reason:

{'ovh003.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh003', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh073.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh073', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh034.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh034', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh061.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh061', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}}
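All four hosts fail the same way: Ansible's user module is asked to append the ubuntu user to the fuse, kvm, and disk groups, and kvm does not exist on these Ubuntu 18.04 nodes. A hedged workaround on an affected node is to create any missing groups before the playbook's user task runs (group list taken from the module args above; the commands are standard shadow-utils):

    # Create any of the expected groups that are missing, then add the user.
    for grp in fuse kvm disk; do
        getent group "$grp" >/dev/null || sudo groupadd "$grp"
    done
    sudo usermod -aG fuse,kvm,disk ubuntu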

pass 3914998 2019-05-01 07:00:39 2019-05-01 12:56:44 2019-05-01 14:24:45 1:28:01 0:43:51 0:44:10 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_fsstress.yaml} 3
fail 3914997 2019-05-01 07:00:38 2019-05-01 12:50:29 2019-05-01 12:56:28 0:05:59 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_dbench.yaml} 3
Failure Reason:

Command failed on ovh083 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.1.0_rc4_ceph_gbc1c00c70b18-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://3.chacra.ceph.com/r/kernel/testing/bc1c00c70b1807a97aae4cbd167a28bd4adb7727/centos/7/flavors/default/x86_64/ --input-file=-'