User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail | Dead |
---|---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-01-20 00:00:03 | 2019-01-20 00:01:18 | 2019-01-20 13:49:53 | 13:48:35 | smoke | master | ovh | 92ea0a6 | 27 | 1 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3484854 | 2019-01-20 00:01:04 | 2019-01-20 00:01:17 | 2019-01-20 00:09:16 | 0:07:59 | 0:03:04 | 0:04:55 | ovh | master | ubuntu | 16.04 | smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} | 1 | |
Failure Reason:
{'ovh097.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh097', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}}
fail | 3484855 | 2019-01-20 00:01:05 | 2019-01-20 00:01:18 | 2019-01-20 01:07:18 | 1:06:00 | 0:06:12 | 0:59:48 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh083 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484856 | 2019-01-20 00:01:06 | 2019-01-20 00:01:18 | 2019-01-20 01:21:18 | 1:20:00 | 0:18:59 | 1:01:01 | ovh | master | centos | 7.5 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
ceph-deploy: Failed to zap osds
fail | 3484857 | 2019-01-20 00:01:06 | 2019-01-20 00:01:17 | 2019-01-20 01:03:17 | 1:02:00 | 0:06:03 | 0:55:57 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh064 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484858 | 2019-01-20 00:01:07 | 2019-01-20 00:01:18 | 2019-01-20 00:57:18 | 0:56:00 | 0:06:24 | 0:49:36 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh051 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484859 | 2019-01-20 00:01:08 | 2019-01-20 00:01:19 | 2019-01-20 00:57:18 | 0:55:59 | 0:06:06 | 0:49:53 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh061 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484860 | 2019-01-20 00:01:09 | 2019-01-20 00:01:18 | 2019-01-20 00:59:18 | 0:58:00 | 0:06:16 | 0:51:44 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} | 3 | |
Failure Reason:
Command failed on ovh024 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484861 | 2019-01-20 00:01:09 | 2019-01-20 00:01:18 | 2019-01-20 01:01:17 | 0:59:59 | 0:06:12 | 0:53:47 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh084 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484862 | 2019-01-20 00:01:10 | 2019-01-20 00:01:18 | 2019-01-20 01:07:17 | 1:05:59 | 0:06:12 | 0:59:47 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh001 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484863 | 2019-01-20 00:01:11 | 2019-01-20 00:01:18 | 2019-01-20 01:03:18 | 1:02:00 | 0:06:06 | 0:55:54 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh044 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484864 | 2019-01-20 00:01:12 | 2019-01-20 00:09:21 | 2019-01-20 00:57:26 | 0:48:05 | 0:06:22 | 0:41:43 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/libcephfs_interface_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh045 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484865 | 2019-01-20 00:01:12 | 2019-01-20 00:57:27 | 2019-01-20 01:43:26 | 0:45:59 | 0:06:15 | 0:39:44 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} | 3 | |
Failure Reason:
Command failed on ovh097 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484866 | 2019-01-20 00:01:13 | 2019-01-20 00:57:27 | 2019-01-20 01:47:27 | 0:50:00 | 0:06:11 | 0:43:49 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh068 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484867 | 2019-01-20 00:01:14 | 2019-01-20 00:57:27 | 2019-01-20 01:45:27 | 0:48:00 | 0:06:13 | 0:41:47 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_bench.yaml} | 3 | |
Failure Reason:
Command failed on ovh061 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484868 | 2019-01-20 00:01:15 | 2019-01-20 00:59:30 | 2019-01-20 01:47:30 | 0:48:00 | 0:06:09 | 0:41:51 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cache_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh044 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484869 | 2019-01-20 00:01:16 | 2019-01-20 01:01:30 | 2019-01-20 01:19:29 | 0:17:59 | 0:02:54 | 0:15:05 | ovh | master | ubuntu | 16.04 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh047.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh047', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh037.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh037', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh070.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh070', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh024.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh024', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}}
fail | 3484870 | 2019-01-20 00:01:16 | 2019-01-20 01:03:20 | 2019-01-20 01:57:20 | 0:54:00 | 0:06:07 | 0:47:53 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cls_all.yaml} | 3 | |
Failure Reason:
Command failed on ovh039 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484871 | 2019-01-20 00:01:17 | 2019-01-20 01:03:20 | 2019-01-20 01:55:20 | 0:52:00 | 0:06:07 | 0:45:53 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_ec_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh093 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484872 | 2019-01-20 00:01:18 | 2019-01-20 01:07:28 | 2019-01-20 01:59:28 | 0:52:00 | 0:06:16 | 0:45:44 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_python.yaml} | 3 | |
Failure Reason:
Command failed on ovh045 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484873 | 2019-01-20 00:01:19 | 2019-01-20 01:07:29 | 2019-01-20 02:11:29 | 1:04:00 | 0:06:04 | 0:57:56 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_workunit_loadgen_mix.yaml} | 3 | |
Failure Reason:
Command failed on ovh028 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484874 | 2019-01-20 00:01:20 | 2019-01-20 01:19:43 | 2019-01-20 02:07:43 | 0:48:00 | 0:06:23 | 0:41:37 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh070 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484875 | 2019-01-20 00:01:20 | 2019-01-20 01:21:31 | 2019-01-20 02:15:31 | 0:54:00 | 0:05:56 | 0:48:04 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} | 3 | |
Failure Reason:
Command failed on ovh029 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484876 | 2019-01-20 00:01:21 | 2019-01-20 01:43:41 | 2019-01-20 02:31:41 | 0:48:00 | 0:06:35 | 0:41:25 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_fsx.yaml} | 3 | |
Failure Reason:
Command failed on ovh063 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484877 | 2019-01-20 00:01:22 | 2019-01-20 01:45:39 | 2019-01-20 02:33:39 | 0:48:00 | 0:06:14 | 0:41:46 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_python_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh003 with status 1: '\n sudo yum -y install ceph-radosgw\n '
dead | 3484878 | 2019-01-20 00:01:23 | 2019-01-20 01:47:29 | 2019-01-20 13:49:53 | 12:02:24 | | | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_workunit_suites_iozone.yaml} | 3 |
fail | 3484879 | 2019-01-20 00:01:23 | 2019-01-20 01:47:31 | 2019-01-20 02:33:31 | 0:46:00 | 0:06:17 | 0:39:43 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_ec_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh051 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484880 | 2019-01-20 00:01:24 | 2019-01-20 01:55:32 | 2019-01-20 02:47:32 | 0:52:00 | 0:06:18 | 0:45:42 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh047 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3484881 | 2019-01-20 00:01:25 | 2019-01-20 01:57:32 | 2019-01-20 02:51:32 | 0:54:00 | 0:06:09 | 0:47:51 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_swift.yaml} | 3 | |
Failure Reason:
Command failed on ovh039 with status 1: '\n sudo yum -y install ceph-radosgw\n '