User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail | Dead |
---|---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-03-31 00:00:03 | 2019-03-31 00:00:31 | 2019-03-31 13:49:32 | 13:49:01 | smoke | master | ovh | fc46584 | 27 | 1 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes | Links |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3790745 | 2019-03-31 00:00:27 | 2019-03-31 00:00:31 | 2019-03-31 00:14:30 | 0:13:59 | 0:04:03 | 0:09:56 | ovh | master | ubuntu | 16.04 | smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} | 1 | |
Failure Reason:
{'ovh099.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh099', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
fail | 3790746 | 2019-03-31 00:00:28 | 2019-03-31 00:00:32 | 2019-03-31 00:58:32 | 0:58:00 | 0:06:58 | 0:51:02 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh047 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790747 | 2019-03-31 00:00:28 | 2019-03-31 00:00:31 | 2019-03-31 01:20:32 | 1:20:01 | 0:21:02 | 0:58:59 | ovh | master | centos | 7.5 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
ceph-deploy: Failed during gather keys |
fail | 3790748 | 2019-03-31 00:00:29 | 2019-03-31 00:00:32 | 2019-03-31 00:58:37 | 0:58:05 | 0:06:40 | 0:51:25 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh016 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790749 | 2019-03-31 00:00:30 | 2019-03-31 00:00:31 | 2019-03-31 00:52:31 | 0:52:00 | 0:06:33 | 0:45:27 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh039 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790750 | 2019-03-31 00:00:31 | 2019-03-31 00:00:32 | 2019-03-31 00:10:31 | 0:09:59 | | | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_pjd.yaml} | — | |
Failure Reason:
'ssh_keyscan ovh048.front.sepia.ceph.com' reached maximum tries (5) after waiting for 5 seconds |
fail | 3790751 | 2019-03-31 00:00:31 | 2019-03-31 00:00:33 | 2019-03-31 01:04:33 | 1:04:00 | 0:06:39 | 0:57:21 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} | 3 | |
Failure Reason:
Command failed on ovh058 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790752 | 2019-03-31 00:00:32 | 2019-03-31 00:00:33 | 2019-03-31 01:02:33 | 1:02:00 | 0:06:45 | 0:55:15 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh078 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790753 | 2019-03-31 00:00:33 | 2019-03-31 00:00:34 | 2019-03-31 00:54:34 | 0:54:00 | 0:06:45 | 0:47:15 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh076 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790754 | 2019-03-31 00:00:34 | 2019-03-31 00:00:35 | 2019-03-31 00:58:35 | 0:58:00 | 0:06:58 | 0:51:02 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh093 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790755 | 2019-03-31 00:00:34 | 2019-03-31 00:10:36 | 2019-03-31 01:08:36 | 0:58:00 | 0:06:39 | 0:51:21 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/libcephfs_interface_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh065 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790756 | 2019-03-31 00:00:35 | 2019-03-31 00:14:34 | 2019-03-31 01:18:34 | 1:04:00 | 0:06:52 | 0:57:08 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} | 3 | |
Failure Reason:
Command failed on ovh027 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790757 | 2019-03-31 00:00:36 | 2019-03-31 00:52:36 | 2019-03-31 01:44:36 | 0:52:00 | 0:06:29 | 0:45:31 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh039 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790758 | 2019-03-31 00:00:36 | 2019-03-31 00:54:49 | 2019-03-31 01:40:49 | 0:46:00 | 0:06:15 | 0:39:45 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_bench.yaml} | 3 | |
Failure Reason:
Command failed on ovh016 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790759 | 2019-03-31 00:00:37 | 2019-03-31 00:58:48 | 2019-03-31 01:42:47 | 0:43:59 | 0:06:17 | 0:37:42 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cache_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh042 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790760 | 2019-03-31 00:00:38 | 2019-03-31 00:58:48 | 2019-03-31 01:22:47 | 0:23:59 | 0:03:26 | 0:20:33 | ovh | master | ubuntu | 16.04 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh035.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh035', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh057.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh057', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh094.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh094', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh059.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh059', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
fail | 3790761 | 2019-03-31 00:00:39 | 2019-03-31 00:58:48 | 2019-03-31 01:46:48 | 0:48:00 | 0:06:29 | 0:41:31 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cls_all.yaml} | 3 | |
Failure Reason:
Command failed on ovh055 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790762 | 2019-03-31 00:00:39 | 2019-03-31 01:02:47 | 2019-03-31 01:56:47 | 0:54:00 | 0:06:27 | 0:47:33 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_ec_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh089 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790763 | 2019-03-31 00:00:40 | 2019-03-31 01:04:36 | 2019-03-31 02:00:36 | 0:56:00 | 0:06:27 | 0:49:33 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_python.yaml} | 3 | |
Failure Reason:
Command failed on ovh054 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790764 | 2019-03-31 00:00:41 | 2019-03-31 01:08:39 | 2019-03-31 02:06:39 | 0:58:00 | 0:06:26 | 0:51:34 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_workunit_loadgen_mix.yaml} | 3 | |
Failure Reason:
Command failed on ovh049 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790765 | 2019-03-31 00:00:42 | 2019-03-31 01:18:52 | 2019-03-31 02:04:52 | 0:46:00 | 0:06:21 | 0:39:39 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh061 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790766 | 2019-03-31 00:00:42 | 2019-03-31 01:20:35 | 2019-03-31 02:10:34 | 0:49:59 | 0:06:21 | 0:43:38 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} | 3 | |
Failure Reason:
Command failed on ovh001 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790767 | 2019-03-31 00:00:43 | 2019-03-31 01:22:51 | 2019-03-31 02:18:51 | 0:56:00 | 0:06:29 | 0:49:31 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_fsx.yaml} | 3 | |
Failure Reason:
Command failed on ovh045 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790768 | 2019-03-31 00:00:44 | 2019-03-31 01:40:54 | 2019-03-31 02:24:53 | 0:43:59 | 0:06:29 | 0:37:30 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_python_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh044 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790769 | 2019-03-31 00:00:45 | 2019-03-31 01:43:04 | 2019-03-31 02:41:04 | 0:58:00 | 0:06:20 | 0:51:40 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh039 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3790770 | 2019-03-31 00:00:45 | 2019-03-31 01:44:40 | 2019-03-31 02:28:39 | 0:43:59 | 0:06:42 | 0:37:17 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_ec_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh042 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
dead | 3790771 | 2019-03-31 00:00:46 | 2019-03-31 01:47:07 | 2019-03-31 13:49:32 | 12:02:25 | | | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_s3tests.yaml} | 3 | |
fail | 3790772 | 2019-03-31 00:00:47 | 2019-03-31 01:57:03 | 2019-03-31 02:51:03 | 0:54:00 | 0:06:10 | 0:47:50 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_swift.yaml} | 3 | |
Failure Reason:
Command failed on ovh089 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |