User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail |
---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-01-08 07:00:03 | 2019-01-08 07:10:13 | 2019-01-08 10:16:03 | 3:05:50 | smoke | master | ovh | d49e2e9 | 28 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes | Links |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3436574 | 2019-01-08 07:00:23 | 2019-01-08 07:03:58 | 2019-01-08 07:11:57 | 0:07:59 | 0:02:39 | 0:05:20 | ovh | master | ubuntu | 16.04 | smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} | 1 | |
Failure Reason:
{'ovh076.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh076', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
fail | 3436575 | 2019-01-08 07:00:24 | 2019-01-08 07:08:07 | 2019-01-08 08:10:07 | 1:02:00 | 0:07:16 | 0:54:44 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh002 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436576 | 2019-01-08 07:00:25 | 2019-01-08 07:08:21 | 2019-01-08 09:54:23 | 2:46:02 | 0:19:48 | 2:26:14 | ovh | master | centos | 7.5 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
ceph-deploy: Failed during gather keys |
fail | 3436577 | 2019-01-08 07:00:25 | 2019-01-08 07:10:13 | 2019-01-08 08:04:13 | 0:54:00 | 0:06:56 | 0:47:04 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh093 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436578 | 2019-01-08 07:00:26 | 2019-01-08 07:11:57 | 2019-01-08 08:09:57 | 0:58:00 | 0:06:54 | 0:51:06 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh009 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436579 | 2019-01-08 07:00:27 | 2019-01-08 07:11:58 | 2019-01-08 08:21:58 | 1:10:00 | 0:07:03 | 1:02:57 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh068 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436580 | 2019-01-08 07:00:27 | 2019-01-08 07:12:01 | 2019-01-08 08:34:01 | 1:22:00 | 0:07:30 | 1:14:30 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} | 3 | |
Failure Reason:
Command failed on ovh075 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436581 | 2019-01-08 07:00:28 | 2019-01-08 07:16:09 | 2019-01-08 08:34:09 | 1:18:00 | 0:06:54 | 1:11:06 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh063 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436582 | 2019-01-08 07:00:29 | 2019-01-08 07:17:01 | 2019-01-08 08:23:01 | 1:06:00 | 0:06:42 | 0:59:18 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh010 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436583 | 2019-01-08 07:00:30 | 2019-01-08 07:22:02 | 2019-01-08 08:40:02 | 1:18:00 | 0:07:08 | 1:10:52 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh037 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436584 | 2019-01-08 07:00:30 | 2019-01-08 07:23:53 | 2019-01-08 08:33:53 | 1:10:00 | 0:07:12 | 1:02:48 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/libcephfs_interface_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh048 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436585 | 2019-01-08 07:00:31 | 2019-01-08 07:25:10 | 2019-01-08 08:45:10 | 1:20:00 | 0:06:32 | 1:13:28 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} | 3 | |
Failure Reason:
Command failed on ovh004 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436586 | 2019-01-08 07:00:32 | 2019-01-08 07:26:05 | 2019-01-08 08:26:05 | 1:00:00 | 0:06:48 | 0:53:12 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh069 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436587 | 2019-01-08 07:00:32 | 2019-01-08 07:28:08 | 2019-01-08 08:26:08 | 0:58:00 | 0:07:00 | 0:51:00 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_bench.yaml} | 3 | |
Failure Reason:
Command failed on ovh089 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436588 | 2019-01-08 07:00:33 | 2019-01-08 07:28:09 | 2019-01-08 08:30:08 | 1:01:59 | 0:06:42 | 0:55:17 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cache_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh066 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436589 | 2019-01-08 07:00:34 | 2019-01-08 07:30:01 | 2019-01-08 10:16:03 | 2:46:02 | 0:02:46 | 2:43:16 | ovh | master | ubuntu | 16.04 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh010.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh010', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh022.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh022', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh086.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh086', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh098.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh098', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
fail | 3436590 | 2019-01-08 07:00:35 | 2019-01-08 07:35:42 | 2019-01-08 08:29:42 | 0:54:00 | 0:06:33 | 0:47:27 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cls_all.yaml} | 3 | |
Failure Reason:
Command failed on ovh091 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436591 | 2019-01-08 07:00:35 | 2019-01-08 07:36:16 | 2019-01-08 08:54:17 | 1:18:01 | 0:06:58 | 1:11:03 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_ec_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh032 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436592 | 2019-01-08 07:00:36 | 2019-01-08 07:41:49 | 2019-01-08 08:45:49 | 1:04:00 | 0:07:03 | 0:56:57 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_python.yaml} | 3 | |
Failure Reason:
Command failed on ovh031 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436593 | 2019-01-08 07:00:37 | 2019-01-08 07:44:01 | 2019-01-08 08:38:01 | 0:54:00 | 0:07:03 | 0:46:57 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_workunit_loadgen_mix.yaml} | 3 | |
Failure Reason:
Command failed on ovh011 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436594 | 2019-01-08 07:00:37 | 2019-01-08 07:45:51 | 2019-01-08 08:43:51 | 0:58:00 | 0:06:41 | 0:51:19 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh050 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436595 | 2019-01-08 07:00:38 | 2019-01-08 07:45:59 | 2019-01-08 08:56:00 | 1:10:01 | 0:06:51 | 1:03:10 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} | 3 | |
Failure Reason:
Command failed on ovh038 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436596 | 2019-01-08 07:00:39 | 2019-01-08 07:47:34 | 2019-01-08 09:19:34 | 1:32:00 | 0:06:58 | 1:25:02 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_fsx.yaml} | 3 | |
Failure Reason:
Command failed on ovh026 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436597 | 2019-01-08 07:00:39 | 2019-01-08 07:48:33 | 2019-01-08 09:14:33 | 1:26:00 | 0:07:01 | 1:18:59 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_python_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh020 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436598 | 2019-01-08 07:00:40 | 2019-01-08 07:48:54 | 2019-01-08 09:04:54 | 1:16:00 | 0:06:58 | 1:09:02 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh007 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436599 | 2019-01-08 07:00:41 | 2019-01-08 07:53:53 | 2019-01-08 08:57:53 | 1:04:00 | 0:06:46 | 0:57:14 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_ec_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh088 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436600 | 2019-01-08 07:00:42 | 2019-01-08 08:03:19 | 2019-01-08 10:01:20 | 1:58:01 | 0:06:59 | 1:51:02 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh048 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3436601 | 2019-01-08 07:00:42 | 2019-01-08 08:03:19 | 2019-01-08 08:57:19 | 0:54:00 | 0:06:53 | 0:47:07 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_swift.yaml} | 3 | |
Failure Reason:
Command failed on ovh092 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |