User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail |
---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-03-17 00:00:03 | 2019-03-17 03:14:10 | 2019-03-17 06:43:31 | 3:29:21 | smoke | master | ovh | 04dad6b | 28 |
Status | Job ID | Links | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3737192 | 2019-03-17 00:00:21 | 2019-03-17 03:14:10 | 2019-03-17 03:28:09 | 0:13:59 | 0:03:48 | 0:10:11 | ovh | master | ubuntu | 16.04 | smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} | 1 | |
Failure Reason:
{'ovh029.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh029', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
fail | 3737193 | 2019-03-17 00:00:22 | 2019-03-17 03:14:59 | 2019-03-17 04:04:59 | 0:50:00 | 0:05:50 | 0:44:10 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh070 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737194 | 2019-03-17 00:00:22 | 2019-03-17 03:28:22 | 2019-03-17 04:48:23 | 1:20:01 | 0:19:37 | 1:00:24 | ovh | master | centos | 7.5 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
ceph-deploy: Failed during gather keys |
fail | 3737195 | 2019-03-17 00:00:23 | 2019-03-17 03:29:19 | 2019-03-17 04:17:19 | 0:48:00 | 0:05:43 | 0:42:17 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh055 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737196 | 2019-03-17 00:00:24 | 2019-03-17 03:35:38 | 2019-03-17 04:27:38 | 0:52:00 | 0:05:43 | 0:46:17 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh023 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737197 | 2019-03-17 00:00:24 | 2019-03-17 03:37:20 | 2019-03-17 04:33:20 | 0:56:00 | 0:05:30 | 0:50:30 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh057 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737198 | 2019-03-17 00:00:25 | 2019-03-17 03:46:20 | 2019-03-17 04:34:20 | 0:48:00 | 0:05:30 | 0:42:30 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} | 3 | |
Failure Reason:
Command failed on ovh076 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737199 | 2019-03-17 00:00:26 | 2019-03-17 04:00:42 | 2019-03-17 04:54:42 | 0:54:00 | 0:06:16 | 0:47:44 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh042 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737200 | 2019-03-17 00:00:26 | 2019-03-17 04:05:04 | 2019-03-17 05:03:04 | 0:58:00 | 0:06:23 | 0:51:37 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh080 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737201 | 2019-03-17 00:00:27 | 2019-03-17 04:05:04 | 2019-03-17 04:57:04 | 0:52:00 | 0:06:24 | 0:45:36 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh052 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737202 | 2019-03-17 00:00:28 | 2019-03-17 04:05:06 | 2019-03-17 04:59:06 | 0:54:00 | 0:06:19 | 0:47:41 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/libcephfs_interface_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh044 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737203 | 2019-03-17 00:00:28 | 2019-03-17 04:17:23 | 2019-03-17 05:03:23 | 0:46:00 | 0:05:28 | 0:40:32 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} | 3 | |
Failure Reason:
Command failed on ovh083 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737204 | 2019-03-17 00:00:29 | 2019-03-17 04:27:52 | 2019-03-17 05:17:52 | 0:50:00 | 0:05:32 | 0:44:28 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh059 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737205 | 2019-03-17 00:00:30 | 2019-03-17 04:33:34 | 2019-03-17 05:27:33 | 0:53:59 | 0:05:37 | 0:48:22 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_bench.yaml} | 3 | |
Failure Reason:
Command failed on ovh057 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737206 | 2019-03-17 00:00:31 | 2019-03-17 04:34:21 | 2019-03-17 05:20:21 | 0:46:00 | 0:05:43 | 0:40:17 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cache_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh016 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737207 | 2019-03-17 00:00:31 | 2019-03-17 04:48:36 | 2019-03-17 05:08:35 | 0:19:59 | 0:03:23 | 0:16:36 | ovh | master | ubuntu | 16.04 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh029.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh029', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh001.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh001', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh042.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh042', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh090.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh090', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
fail | 3737208 | 2019-03-17 00:00:32 | 2019-03-17 04:54:56 | 2019-03-17 05:40:55 | 0:45:59 | 0:05:44 | 0:40:15 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cls_all.yaml} | 3 | |
Failure Reason:
Command failed on ovh048 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737209 | 2019-03-17 00:00:33 | 2019-03-17 04:57:18 | 2019-03-17 05:47:18 | 0:50:00 | 0:05:44 | 0:44:16 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_ec_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh024 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737210 | 2019-03-17 00:00:33 | 2019-03-17 04:59:20 | 2019-03-17 05:53:20 | 0:54:00 | 0:05:40 | 0:48:20 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_python.yaml} | 3 | |
Failure Reason:
Command failed on ovh083 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737211 | 2019-03-17 00:00:34 | 2019-03-17 05:03:17 | 2019-03-17 05:51:17 | 0:48:00 | 0:05:51 | 0:42:09 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_workunit_loadgen_mix.yaml} | 3 | |
Failure Reason:
Command failed on ovh001 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737212 | 2019-03-17 00:00:35 | 2019-03-17 05:03:25 | 2019-03-17 05:55:25 | 0:52:00 | 0:05:51 | 0:46:09 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh039 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737213 | 2019-03-17 00:00:35 | 2019-03-17 05:08:51 | 2019-03-17 05:58:51 | 0:50:00 | 0:05:33 | 0:44:27 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} | 3 | |
Failure Reason:
Command failed on ovh044 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737214 | 2019-03-17 00:00:36 | 2019-03-17 05:18:03 | 2019-03-17 06:02:03 | 0:44:00 | 0:05:27 | 0:38:33 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_fsx.yaml} | 3 | |
Failure Reason:
Command failed on ovh061 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737215 | 2019-03-17 00:00:37 | 2019-03-17 05:20:31 | 2019-03-17 06:12:31 | 0:52:00 | 0:06:08 | 0:45:52 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_python_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh059 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737216 | 2019-03-17 00:00:37 | 2019-03-17 05:27:48 | 2019-03-17 06:25:48 | 0:58:00 | 0:06:51 | 0:51:09 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh057 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737217 | 2019-03-17 00:00:38 | 2019-03-17 05:41:10 | 2019-03-17 06:31:10 | 0:50:00 | 0:06:19 | 0:43:41 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_ec_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh048 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737218 | 2019-03-17 00:00:39 | 2019-03-17 05:47:31 | 2019-03-17 06:37:31 | 0:50:00 | 0:05:57 | 0:44:03 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh001 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3737219 | 2019-03-17 00:00:40 | 2019-03-17 05:51:31 | 2019-03-17 06:43:31 | 0:52:00 | 0:06:02 | 0:45:58 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_swift.yaml} | 3 | |
Failure Reason:
Command failed on ovh080 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |