User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail | Dead |
---|---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-04-28 00:00:02 | 2019-04-28 18:48:52 | 2019-04-29 00:21:57 | 5:33:05 | smoke | master | ovh | ab1c804 | 27 | 1 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes | Failure Reason |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3901320 | 2019-04-28 00:00:25 | 2019-04-28 10:51:15 | 2019-04-28 11:43:14 | 0:51:59 | 0:04:06 | 0:47:53 | ovh | master | ubuntu | 18.04 | smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} | 1 | |
Failure Reason: {'ovh066.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh066', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}}
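The ansible failure above reports `Group kvm does not exist`: the `user` module is asked to append the `ubuntu` user to the groups `fuse`, `kvm`, and `disk`, and fails because `kvm` is absent on the node. A minimal pre-flight sketch (a hypothetical helper, not part of the suite) that lists which of those groups are missing on a host:

```shell
# List which of the given groups are missing on this host;
# on the failing node above, this would print "kvm".
missing_groups() {
    for grp in "$@"; do
        # getent succeeds silently when the group exists
        getent group "$grp" >/dev/null || echo "$grp"
    done
}

# Usage: missing_groups fuse kvm disk
```

Creating the missing group (e.g. `groupadd kvm`, or installing the package that provides it) before the ansible run would avoid this class of failure.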
fail | 3901321 | 2019-04-28 00:00:26 | 2019-04-28 11:18:08 | 2019-04-28 12:36:08 | 1:18:00 | 0:06:04 | 1:11:56 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason: Command failed on ovh095 with status 1: '\n sudo yum -y install ceph-radosgw\n '
dead | 3901322 | 2019-04-28 00:00:27 | 2019-04-28 11:39:01 | 2019-04-28 23:41:18 | 12:02:17 | | | ovh | master | centos | 7.6 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | — | |
fail | 3901323 | 2019-04-28 00:00:27 | 2019-04-28 11:43:29 | 2019-04-28 13:17:30 | 1:34:01 | 0:06:23 | 1:27:38 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on ovh013 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901324 | 2019-04-28 00:00:28 | 2019-04-28 12:26:31 | 2019-04-28 13:58:32 | 1:32:01 | 0:06:27 | 1:25:34 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_iozone.yaml} | 3 | |
Failure Reason: Command failed on ovh012 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901325 | 2019-04-28 00:00:29 | 2019-04-28 12:35:06 | 2019-04-28 13:43:11 | 1:08:05 | 0:06:16 | 1:01:49 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason: Command failed on ovh016 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901326 | 2019-04-28 00:00:30 | 2019-04-28 12:36:24 | 2019-04-28 14:48:25 | 2:12:01 | 0:06:09 | 2:05:52 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} | 3 | |
Failure Reason: Command failed on ovh085 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901327 | 2019-04-28 00:00:30 | 2019-04-28 13:05:32 | 2019-04-28 14:47:33 | 1:42:01 | 0:06:10 | 1:35:51 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_dbench.yaml} | 3 | |
Failure Reason: Command failed on ovh087 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901328 | 2019-04-28 00:00:31 | 2019-04-28 13:17:41 | 2019-04-28 15:53:42 | 2:36:01 | 0:06:18 | 2:29:43 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on ovh090 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901329 | 2019-04-28 00:00:32 | 2019-04-28 13:43:30 | 2019-04-28 15:35:31 | 1:52:01 | 0:06:27 | 1:45:34 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_pjd.yaml} | 3 | |
Failure Reason: Command failed on ovh095 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901330 | 2019-04-28 00:00:32 | 2019-04-28 13:58:35 | 2019-04-28 17:22:43 | 3:24:08 | 0:05:57 | 3:18:11 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/libcephfs_interface_tests.yaml} | 3 | |
Failure Reason: Command failed on ovh095 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901331 | 2019-04-28 00:00:33 | 2019-04-28 14:13:56 | 2019-04-28 15:35:57 | 1:22:01 | 0:07:09 | 1:14:52 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} | 3 | |
Failure Reason: Command failed on ovh012 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901332 | 2019-04-28 00:00:34 | 2019-04-28 14:45:32 | 2019-04-28 16:31:33 | 1:46:01 | 0:06:39 | 1:39:22 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_api_tests.yaml} | 3 | |
Failure Reason: Command failed on ovh090 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901333 | 2019-04-28 00:00:35 | 2019-04-28 14:47:37 | 2019-04-28 16:59:38 | 2:12:01 | 0:06:14 | 2:05:47 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_bench.yaml} | 3 | |
Failure Reason: Command failed on ovh012 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901334 | 2019-04-28 00:00:35 | 2019-04-28 14:48:40 | 2019-04-28 16:20:40 | 1:32:00 | 0:06:14 | 1:25:46 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cache_snaps.yaml} | 3 | |
Failure Reason: Command failed on ovh016 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901335 | 2019-04-28 00:00:36 | 2019-04-28 15:35:49 | 2019-04-29 00:21:57 | 8:46:08 | 0:04:10 | 8:41:58 | ovh | master | ubuntu | 18.04 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason: {'ovh090.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh090', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh013.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh013', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh095.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh095', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh041.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh041', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}}
fail | 3901336 | 2019-04-28 00:00:37 | 2019-04-28 15:35:58 | 2019-04-28 17:55:59 | 2:20:01 | 0:06:09 | 2:13:52 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cls_all.yaml} | 3 | |
Failure Reason: Command failed on ovh085 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901337 | 2019-04-28 00:00:37 | 2019-04-28 15:53:58 | 2019-04-28 17:03:58 | 1:10:00 | 0:06:07 | 1:03:53 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_ec_snaps.yaml} | 3 | |
Failure Reason: Command failed on ovh085 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901338 | 2019-04-28 00:00:38 | 2019-04-28 16:20:46 | 2019-04-28 18:06:47 | 1:46:01 | 0:06:24 | 1:39:37 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_python.yaml} | 3 | |
Failure Reason: Command failed on ovh013 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901339 | 2019-04-28 00:00:39 | 2019-04-28 16:31:49 | 2019-04-28 21:43:53 | 5:12:04 | 0:06:14 | 5:05:50 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_workunit_loadgen_mix.yaml} | 3 | |
Failure Reason: Command failed on ovh041 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901340 | 2019-04-28 00:00:39 | 2019-04-28 16:59:53 | 2019-04-28 20:17:56 | 3:18:03 | 0:06:05 | 3:11:58 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_api_tests.yaml} | 3 | |
Failure Reason: Command failed on ovh095 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901341 | 2019-04-28 00:00:40 | 2019-04-28 17:04:02 | 2019-04-28 18:12:02 | 1:08:00 | 0:05:59 | 1:02:01 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} | 3 | |
Failure Reason: Command failed on ovh090 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901342 | 2019-04-28 00:00:41 | 2019-04-28 17:22:48 | 2019-04-28 18:42:49 | 1:20:01 | 0:05:57 | 1:14:04 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_fsx.yaml} | 3 | |
Failure Reason: Command failed on ovh085 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901343 | 2019-04-28 00:00:41 | 2019-04-28 17:56:14 | 2019-04-28 19:10:14 | 1:14:00 | 0:06:36 | 1:07:24 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_python_api_tests.yaml} | 3 | |
Failure Reason: Command failed on ovh087 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901344 | 2019-04-28 00:00:42 | 2019-04-28 18:06:50 | 2019-04-28 20:58:52 | 2:52:02 | 0:06:14 | 2:45:48 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_workunit_suites_iozone.yaml} | 3 | |
Failure Reason: Command failed on ovh012 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901345 | 2019-04-28 00:00:43 | 2019-04-28 18:12:10 | 2019-04-28 19:32:10 | 1:20:00 | 0:06:14 | 1:13:46 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_ec_s3tests.yaml} | 3 | |
Failure Reason: Command failed on ovh090 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901346 | 2019-04-28 00:00:43 | 2019-04-28 18:42:50 | 2019-04-28 19:42:50 | 1:00:00 | 0:06:22 | 0:53:38 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_s3tests.yaml} | 3 | |
Failure Reason: Command failed on ovh013 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3901347 | 2019-04-28 00:00:44 | 2019-04-28 18:48:52 | 2019-04-29 00:14:56 | 5:26:04 | 0:06:15 | 5:19:49 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_swift.yaml} | 3 | |
Failure Reason: Command failed on ovh095 with status 1: '\n sudo yum -y install ceph-radosgw\n '
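Every rhel 7.4 job in this run fails with the same `sudo yum -y install ceph-radosgw` error, spread across a handful of ovh hosts. When triaging a run like this, a quick tally of failures per host helps distinguish a repo/package problem (all hosts affected) from a bad machine (one host dominates). A small sketch (hypothetical helper; reads a report like this one on stdin):

```shell
# Count "Command failed on <host>" occurrences per host, most frequent first.
# The grep -o pattern extracts just the host-bearing prefix of each reason.
tally_failed_hosts() {
    grep -o 'Command failed on [a-z0-9]*' | sort | uniq -c | sort -rn
}

# Usage: tally_failed_hosts < report.txt
```

Here the failures are distributed across all the rhel hosts, which points at the package source (the install of `ceph-radosgw` itself) rather than any single machine.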