User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail |
---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-03-03 00:00:03 | 2019-03-03 00:04:28 | 2019-03-03 04:13:03 | 4:08:35 | smoke | master | ovh | e117299 | 28 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes | Links |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3661132 | 2019-03-03 00:00:23 | 2019-03-03 00:04:28 | 2019-03-03 00:16:27 | 0:11:59 | 0:02:47 | 0:09:12 | ovh | master | ubuntu | 16.04 | smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} | 1 | |
Failure Reason:
{'ovh067.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh067', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
fail | 3661133 | 2019-03-03 00:00:24 | 2019-03-03 00:04:53 | 2019-03-03 00:56:53 | 0:52:00 | 0:06:25 | 0:45:35 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh083 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661134 | 2019-03-03 00:00:24 | 2019-03-03 00:10:36 | 2019-03-03 01:32:36 | 1:22:00 | 0:18:33 | 1:03:27 | ovh | master | centos | 7.5 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
ceph-deploy: Failed during gather keys
fail | 3661135 | 2019-03-03 00:00:25 | 2019-03-03 00:12:23 | 2019-03-03 01:08:23 | 0:56:00 | 0:05:56 | 0:50:04 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh039 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661136 | 2019-03-03 00:00:26 | 2019-03-03 00:16:31 | 2019-03-03 01:20:31 | 1:04:00 | 0:06:04 | 0:57:56 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh059 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661137 | 2019-03-03 00:00:27 | 2019-03-03 00:20:36 | 2019-03-03 01:08:36 | 0:48:00 | 0:06:18 | 0:41:42 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh048 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661138 | 2019-03-03 00:00:28 | 2019-03-03 00:24:26 | 2019-03-03 01:20:26 | 0:56:00 | 0:06:01 | 0:49:59 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} | 3 | |
Failure Reason:
Command failed on ovh081 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661139 | 2019-03-03 00:00:28 | 2019-03-03 00:32:26 | 2019-03-03 01:30:26 | 0:58:00 | 0:06:05 | 0:51:55 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh057 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661140 | 2019-03-03 00:00:29 | 2019-03-03 00:56:55 | 2019-03-03 01:46:55 | 0:50:00 | 0:06:17 | 0:43:43 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh083 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661141 | 2019-03-03 00:00:30 | 2019-03-03 01:08:40 | 2019-03-03 01:56:40 | 0:48:00 | 0:06:05 | 0:41:55 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh047 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661142 | 2019-03-03 00:00:31 | 2019-03-03 01:08:40 | 2019-03-03 02:00:40 | 0:52:00 | 0:06:13 | 0:45:47 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/libcephfs_interface_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh080 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661143 | 2019-03-03 00:00:32 | 2019-03-03 01:20:39 | 2019-03-03 02:18:39 | 0:58:00 | 0:05:54 | 0:52:06 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} | 3 | |
Failure Reason:
Command failed on ovh081 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661144 | 2019-03-03 00:00:32 | 2019-03-03 01:20:39 | 2019-03-03 02:14:39 | 0:54:00 | 0:06:00 | 0:48:00 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh063 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661145 | 2019-03-03 00:00:33 | 2019-03-03 01:30:40 | 2019-03-03 02:26:40 | 0:56:00 | 0:05:53 | 0:50:07 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_bench.yaml} | 3 | |
Failure Reason:
Command failed on ovh023 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661146 | 2019-03-03 00:00:34 | 2019-03-03 01:32:50 | 2019-03-03 02:26:50 | 0:54:00 | 0:06:00 | 0:48:00 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cache_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh029 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661147 | 2019-03-03 00:00:35 | 2019-03-03 01:46:58 | 2019-03-03 02:06:58 | 0:20:00 | 0:02:54 | 0:17:06 | ovh | master | ubuntu | 16.04 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh054.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh054', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh001.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh001', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh042.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh042', 'group': None, 'name': 'ubuntu', 'local': 
None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh083.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh083', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
fail | 3661148 | 2019-03-03 00:00:36 | 2019-03-03 01:56:42 | 2019-03-03 02:46:42 | 0:50:00 | 0:06:12 | 0:43:48 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cls_all.yaml} | 3 | |
Failure Reason:
Command failed on ovh055 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661149 | 2019-03-03 00:00:37 | 2019-03-03 02:00:53 | 2019-03-03 02:52:53 | 0:52:00 | 0:06:18 | 0:45:42 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_ec_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh076 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661150 | 2019-03-03 00:00:37 | 2019-03-03 02:07:01 | 2019-03-03 02:53:01 | 0:46:00 | 0:05:56 | 0:40:04 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_python.yaml} | 3 | |
Failure Reason:
Command failed on ovh042 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661151 | 2019-03-03 00:00:38 | 2019-03-03 02:14:50 | 2019-03-03 03:04:50 | 0:50:00 | 0:05:47 | 0:44:13 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_workunit_loadgen_mix.yaml} | 3 | |
Failure Reason:
Command failed on ovh067 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661152 | 2019-03-03 00:00:39 | 2019-03-03 02:18:53 | 2019-03-03 03:18:53 | 1:00:00 | 0:06:06 | 0:53:54 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh081 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661153 | 2019-03-03 00:00:40 | 2019-03-03 02:26:43 | 2019-03-03 03:22:43 | 0:56:00 | 0:06:06 | 0:49:54 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} | 3 | |
Failure Reason:
Command failed on ovh057 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661154 | 2019-03-03 00:00:40 | 2019-03-03 02:26:52 | 2019-03-03 03:20:51 | 0:53:59 | 0:05:54 | 0:48:05 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_fsx.yaml} | 3 | |
Failure Reason:
Command failed on ovh063 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661155 | 2019-03-03 00:00:41 | 2019-03-03 02:46:58 | 2019-03-03 03:32:58 | 0:46:00 | 0:06:06 | 0:39:54 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_python_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh048 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661156 | 2019-03-03 00:00:42 | 2019-03-03 02:52:56 | 2019-03-03 03:46:56 | 0:54:00 | 0:06:04 | 0:47:56 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh042 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661157 | 2019-03-03 00:00:43 | 2019-03-03 02:53:02 | 2019-03-03 03:41:02 | 0:48:00 | 0:06:12 | 0:41:48 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_ec_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh076 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661158 | 2019-03-03 00:00:43 | 2019-03-03 03:04:52 | 2019-03-03 04:00:52 | 0:56:00 | 0:05:53 | 0:50:07 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh068 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3661159 | 2019-03-03 00:00:44 | 2019-03-03 03:19:03 | 2019-03-03 04:13:03 | 0:54:00 | 0:06:00 | 0:48:00 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_swift.yaml} | 3 | |
Failure Reason:
Command failed on ovh023 with status 1: '\n sudo yum -y install ceph-radosgw\n '
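Every failure in this run reduces to one of two root causes: the RHEL 7.4 jobs fail installing `ceph-radosgw` via yum, and the Ubuntu 16.04 jobs fail in ansible's user module with "Group kvm does not exist". A minimal sketch for tallying recurring reasons from a saved plain-text copy of a report like this (the normalization of `ovhNNN` hostnames is an assumption so identical errors group together; this is not part of teuthology itself):

```python
import re
from collections import Counter

def tally_failures(text):
    """Count distinct failure reasons, ignoring which ovh host they hit."""
    reasons = Counter()
    lines = iter(text.splitlines())
    for line in lines:
        if line.strip() == "Failure Reason:":
            # The reason is on the line after the marker; strip table-cell pipes.
            reason = next(lines, "").strip(" |")
            # Collapse host-specific details so identical errors group together.
            reason = re.sub(r"ovh\d+", "ovh*", reason)
            reasons[reason[:80]] += 1
    return reasons
```

Running this over the report above would show the yum install failure dominating (24 of 28 jobs), with the remaining jobs split between the ansible group error and the ceph-deploy gather-keys failure.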