User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail |
---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-01-11 07:00:03 | 2019-01-11 07:03:17 | 2019-01-11 09:37:22 | 2:34:05 | smoke | master | ovh | 5dfcb55 | 28 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes | Links |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3446857 | 2019-01-11 07:00:24 | 2019-01-11 07:03:17 | 2019-01-11 07:17:16 | 0:13:59 | 0:02:44 | 0:11:15 | ovh | master | ubuntu | 16.04 | smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} | 1 | |
Failure Reason:
{'ovh040.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh040', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
fail | 3446858 | 2019-01-11 07:00:24 | 2019-01-11 07:08:39 | 2019-01-11 08:08:38 | 0:59:59 | 0:06:58 | 0:53:01 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh004 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446859 | 2019-01-11 07:00:25 | 2019-01-11 07:15:21 | 2019-01-11 09:37:22 | 2:22:01 | 0:19:32 | 2:02:29 | ovh | master | centos | 7.5 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
ceph-deploy: Failed during gather keys |
fail | 3446860 | 2019-01-11 07:00:26 | 2019-01-11 07:15:21 | 2019-01-11 08:11:21 | 0:56:00 | 0:06:57 | 0:49:03 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh093 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446861 | 2019-01-11 07:00:27 | 2019-01-11 07:15:56 | 2019-01-11 08:23:56 | 1:08:00 | 0:06:52 | 1:01:08 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh009 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446862 | 2019-01-11 07:00:27 | 2019-01-11 07:17:22 | 2019-01-11 08:13:22 | 0:56:00 | 0:06:51 | 0:49:09 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh091 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446863 | 2019-01-11 07:00:28 | 2019-01-11 07:19:17 | 2019-01-11 08:17:17 | 0:58:00 | 0:06:50 | 0:51:10 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} | 3 | |
Failure Reason:
Command failed on ovh058 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446864 | 2019-01-11 07:00:29 | 2019-01-11 07:27:12 | 2019-01-11 08:57:13 | 1:30:01 | 0:06:47 | 1:23:14 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh004 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446865 | 2019-01-11 07:00:29 | 2019-01-11 07:27:16 | 2019-01-11 08:33:16 | 1:06:00 | 0:06:36 | 0:59:24 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh064 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446866 | 2019-01-11 07:00:30 | 2019-01-11 07:28:50 | 2019-01-11 08:40:50 | 1:12:00 | 0:06:58 | 1:05:02 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh089 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446867 | 2019-01-11 07:00:31 | 2019-01-11 07:31:14 | 2019-01-11 08:27:14 | 0:56:00 | 0:06:49 | 0:49:11 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/libcephfs_interface_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh045 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446868 | 2019-01-11 07:00:32 | 2019-01-11 07:41:22 | 2019-01-11 08:51:22 | 1:10:00 | 0:06:42 | 1:03:18 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} | 3 | |
Failure Reason:
Command failed on ovh051 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446869 | 2019-01-11 07:00:32 | 2019-01-11 07:49:11 | 2019-01-11 08:55:12 | 1:06:01 | 0:07:11 | 0:58:50 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh024 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446870 | 2019-01-11 07:00:33 | 2019-01-11 07:53:26 | 2019-01-11 09:01:26 | 1:08:00 | 0:06:29 | 1:01:31 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_bench.yaml} | 3 | |
Failure Reason:
Command failed on ovh033 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446871 | 2019-01-11 07:00:34 | 2019-01-11 08:03:09 | 2019-01-11 08:55:09 | 0:52:00 | 0:06:53 | 0:45:07 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cache_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh036 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446872 | 2019-01-11 07:00:34 | 2019-01-11 08:08:50 | 2019-01-11 08:26:50 | 0:18:00 | 0:02:54 | 0:15:06 | ovh | master | ubuntu | 16.04 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh034.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh034', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh058.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh058', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh006.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh006', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh042.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh042', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
fail | 3446873 | 2019-01-11 07:00:35 | 2019-01-11 08:08:57 | 2019-01-11 09:20:57 | 1:12:00 | 0:07:23 | 1:04:37 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cls_all.yaml} | 3 | |
Failure Reason:
Command failed on ovh087 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446874 | 2019-01-11 07:00:36 | 2019-01-11 08:11:11 | 2019-01-11 08:59:11 | 0:48:00 | 0:06:32 | 0:41:28 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_ec_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh069 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446875 | 2019-01-11 07:00:37 | 2019-01-11 08:11:32 | 2019-01-11 09:21:32 | 1:10:00 | 0:07:04 | 1:02:56 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_python.yaml} | 3 | |
Failure Reason:
Command failed on ovh009 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446876 | 2019-01-11 07:00:38 | 2019-01-11 08:13:24 | 2019-01-11 09:13:24 | 1:00:00 | 0:06:37 | 0:53:23 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_workunit_loadgen_mix.yaml} | 3 | |
Failure Reason:
Command failed on ovh049 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446877 | 2019-01-11 07:00:38 | 2019-01-11 08:14:51 | 2019-01-11 09:12:51 | 0:58:00 | 0:07:13 | 0:50:47 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh065 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446878 | 2019-01-11 07:00:39 | 2019-01-11 08:15:10 | 2019-01-11 09:09:10 | 0:54:00 | 0:06:46 | 0:47:14 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} | 3 | |
Failure Reason:
Command failed on ovh001 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446879 | 2019-01-11 07:00:40 | 2019-01-11 08:17:17 | 2019-01-11 09:25:17 | 1:08:00 | 0:06:44 | 1:01:16 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_fsx.yaml} | 3 | |
Failure Reason:
Command failed on ovh053 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446880 | 2019-01-11 07:00:41 | 2019-01-11 08:17:18 | 2019-01-11 09:11:19 | 0:54:01 | 0:07:26 | 0:46:35 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_python_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh067 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446881 | 2019-01-11 07:00:41 | 2019-01-11 08:17:22 | 2019-01-11 09:23:22 | 1:06:00 | 0:06:43 | 0:59:17 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh060 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446882 | 2019-01-11 07:00:42 | 2019-01-11 08:19:00 | 2019-01-11 09:19:00 | 1:00:00 | 0:07:09 | 0:52:51 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_ec_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh099 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446883 | 2019-01-11 07:00:43 | 2019-01-11 08:23:58 | 2019-01-11 09:33:59 | 1:10:01 | 0:06:47 | 1:03:14 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh039 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3446884 | 2019-01-11 07:00:44 | 2019-01-11 08:25:17 | 2019-01-11 09:13:17 | 0:48:00 | 0:06:43 | 0:41:17 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_swift.yaml} | 3 | |
Failure Reason:
Command failed on ovh058 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
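Nearly all 28 failures in this run collapse to three root causes: `sudo yum -y install ceph-radosgw` exiting with status 1 on every RHEL 7.5 job, the ansible error `Group kvm does not exist` on the Ubuntu 16.04 jobs, and a single `ceph-deploy: Failed during gather keys` on CentOS 7.5. One quick way to confirm such a pattern is to normalize the failure reasons (masking the per-job host names) and tally the distinct causes. The sketch below is a hypothetical triage helper, not part of teuthology; the `reasons` list is a small sample of the values from the table above.

```python
import re
from collections import Counter

# A sample of the failure reasons from this run (host names vary per job).
reasons = [
    "Command failed on ovh004 with status 1: sudo yum -y install ceph-radosgw",
    "Command failed on ovh093 with status 1: sudo yum -y install ceph-radosgw",
    "ceph-deploy: Failed during gather keys",
    "Group kvm does not exist",
]

def normalize(reason: str) -> str:
    # Mask per-host noise (ovh004, ovh093, ...) so identical root
    # causes group together regardless of which machine was hit.
    return re.sub(r"ovh\d+", "ovh*", reason)

tally = Counter(normalize(r) for r in reasons)
for reason, count in tally.most_common():
    print(f"{count:3d}  {reason}")
```

Run against the full column, this would show the `ceph-radosgw` install failure dominating, which points at a repository or packaging problem on the RHEL workers rather than 24 independent test bugs.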