Status  Job ID  Links  Posted  Started  Updated  Runtime  Duration  In Waiting  Machine  Teuthology Branch  OS Type  OS Version  Description  Nodes
fail 2893742 2018-08-11 07:00:21 2018-08-11 07:33:05 2018-08-11 07:47:04 0:13:59 0:02:53 0:11:06 ovh master ubuntu 16.04 smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} 1
Failure Reason:

{'ovh006.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh006', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}}

fail 2893743 2018-08-11 07:00:22 2018-08-11 07:33:18 2018-08-11 08:35:18 1:02:00 0:06:12 0:55:48 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/cfuse_workunit_suites_blogbench.yaml} 3
Failure Reason:

Command failed on ovh017 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893744 2018-08-11 07:00:22 2018-08-11 07:37:19 2018-08-11 08:33:19 0:56:00 0:18:36 0:37:24 ovh master centos 7.4 smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} 4
Failure Reason:

ceph-deploy: Failed to zap osds

fail 2893745 2018-08-11 07:00:23 2018-08-11 07:41:32 2018-08-11 08:53:33 1:12:01 0:06:40 1:05:21 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/cfuse_workunit_suites_fsstress.yaml} 3
Failure Reason:

Command failed on ovh087 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893746 2018-08-11 07:00:24 2018-08-11 07:41:42 2018-08-11 09:29:43 1:48:01 0:06:39 1:41:22 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/cfuse_workunit_suites_iozone.yaml} 3
Failure Reason:

Command failed on ovh089 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893747 2018-08-11 07:00:24 2018-08-11 07:47:15 2018-08-11 08:41:15 0:54:00 0:06:06 0:47:54 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/cfuse_workunit_suites_pjd.yaml} 3
Failure Reason:

Command failed on ovh080 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893748 2018-08-11 07:00:25 2018-08-11 07:47:18 2018-08-11 08:55:19 1:08:01 0:06:12 1:01:49 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/kclient_workunit_direct_io.yaml} 3
Failure Reason:

Command failed on ovh039 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893749 2018-08-11 07:00:25 2018-08-11 07:51:38 2018-08-11 09:21:38 1:30:00 0:06:16 1:23:44 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/kclient_workunit_suites_dbench.yaml} 3
Failure Reason:

Command failed on ovh017 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893750 2018-08-11 07:00:26 2018-08-11 07:52:09 2018-08-11 08:54:10 1:02:01 0:06:17 0:55:44 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/kclient_workunit_suites_fsstress.yaml} 3
Failure Reason:

Command failed on ovh011 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893751 2018-08-11 07:00:27 2018-08-11 07:55:42 2018-08-11 09:19:43 1:24:01 0:18:00 1:06:01 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/kclient_workunit_suites_pjd.yaml} 3
Failure Reason:

{'ovh090.front.sepia.ceph.com': {'msg': 'Timeout (122s) waiting for privilege escalation prompt: '}}

fail 2893752 2018-08-11 07:00:27 2018-08-11 07:57:31 2018-08-11 09:39:32 1:42:01 0:06:16 1:35:45 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/libcephfs_interface_tests.yaml} 3
Failure Reason:

Command failed on ovh095 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893753 2018-08-11 07:00:28 2018-08-11 07:57:31 2018-08-11 09:25:32 1:28:01 0:06:42 1:21:19 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/mon_thrash.yaml} 3
Failure Reason:

Command failed on ovh008 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893754 2018-08-11 07:00:29 2018-08-11 08:00:43 2018-08-11 09:10:44 1:10:01 0:06:30 1:03:31 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_api_tests.yaml} 3
Failure Reason:

Command failed on ovh049 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893755 2018-08-11 07:00:29 2018-08-11 08:07:40 2018-08-11 08:59:41 0:52:01 0:06:19 0:45:42 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_bench.yaml} 3
Failure Reason:

Command failed on ovh036 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893756 2018-08-11 07:00:30 2018-08-11 08:09:19 2018-08-11 10:07:20 1:58:01 0:06:16 1:51:45 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_cache_snaps.yaml} 3
Failure Reason:

Command failed on ovh054 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893757 2018-08-11 07:00:30 2018-08-11 08:13:24 2018-08-11 09:35:25 1:22:01 0:02:59 1:19:02 ovh master ubuntu 16.04 smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} 4
Failure Reason:

{'ovh017.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh017', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh036.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh036', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh073.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh073', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh080.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh080', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}}

fail 2893758 2018-08-11 07:00:31 2018-08-11 08:20:20 2018-08-11 09:28:20 1:08:00 0:06:53 1:01:07 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_cls_all.yaml} 3
Failure Reason:

Command failed on ovh072 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893759 2018-08-11 07:00:32 2018-08-11 08:21:17 2018-08-11 09:47:17 1:26:00 0:06:21 1:19:39 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_ec_snaps.yaml} 3
Failure Reason:

Command failed on ovh083 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893760 2018-08-11 07:00:32 2018-08-11 08:25:39 2018-08-11 09:21:39 0:56:00 0:06:13 0:49:47 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_python.yaml} 3
Failure Reason:

Command failed on ovh016 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893761 2018-08-11 07:00:33 2018-08-11 08:26:05 2018-08-11 09:40:06 1:14:01 0:06:26 1:07:35 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_workunit_loadgen_mix.yaml} 3
Failure Reason:

Command failed on ovh082 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893762 2018-08-11 07:00:34 2018-08-11 08:27:20 2018-08-11 09:57:21 1:30:01 0:06:23 1:23:38 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_api_tests.yaml} 3
Failure Reason:

Command failed on ovh013 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893763 2018-08-11 07:00:34 2018-08-11 08:29:33 2018-08-11 09:33:33 1:04:00 0:06:09 0:57:51 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_cli_import_export.yaml} 3
Failure Reason:

Command failed on ovh058 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893764 2018-08-11 07:00:35 2018-08-11 08:30:13 2018-08-11 09:24:13 0:54:00 0:06:25 0:47:35 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_fsx.yaml} 3
Failure Reason:

Command failed on ovh039 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893765 2018-08-11 07:00:36 2018-08-11 08:33:13 2018-08-11 09:19:13 0:46:00 0:06:12 0:39:48 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_python_api_tests.yaml} 3
Failure Reason:

Command failed on ovh061 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893766 2018-08-11 07:00:36 2018-08-11 08:33:20 2018-08-11 10:27:21 1:54:01 0:06:14 1:47:47 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_workunit_suites_iozone.yaml} 3
Failure Reason:

Command failed on ovh003 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893767 2018-08-11 07:00:37 2018-08-11 08:35:29 2018-08-11 10:05:30 1:30:01 0:06:19 1:23:42 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rgw_ec_s3tests.yaml} 3
Failure Reason:

Command failed on ovh016 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893768 2018-08-11 07:00:37 2018-08-11 08:37:58 2018-08-11 09:41:58 1:04:00 0:06:28 0:57:32 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rgw_s3tests.yaml} 3
Failure Reason:

Command failed on ovh069 with status 1: "sudo yum -y install '' ceph-radosgw"

fail 2893769 2018-08-11 07:00:38 2018-08-11 08:39:38 2018-08-11 09:55:38 1:16:00 0:06:16 1:09:44 ovh master rhel 7.5 smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rgw_swift.yaml} 3
Failure Reason:

Command failed on ovh040 with status 1: "sudo yum -y install '' ceph-radosgw"