User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail |
---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-02-10 00:00:03 | 2019-02-10 14:52:33 | 2019-02-10 17:36:38 | 2:44:05 | smoke | master | ovh | b4fa473 | 28 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3569615 | 2019-02-10 00:00:19 | 2019-02-10 14:52:33 | 2019-02-10 15:04:32 | 0:11:59 | 0:02:33 | 0:09:26 | ovh | master | ubuntu | 16.04 | smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} | 1 | |
Failure Reason:
{'ovh039.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh039', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}}
fail | 3569616 | 2019-02-10 00:00:19 | 2019-02-10 14:52:34 | 2019-02-10 15:50:34 | 0:58:00 | 0:05:52 | 0:52:08 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh068 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569617 | 2019-02-10 00:00:20 | 2019-02-10 14:52:33 | 2019-02-10 16:14:34 | 1:22:01 | 0:19:47 | 1:02:14 | ovh | master | centos | 7.5 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
ceph-deploy: Failed during gather keys
fail | 3569618 | 2019-02-10 00:00:21 | 2019-02-10 14:52:33 | 2019-02-10 16:02:33 | 1:10:00 | 0:06:04 | 1:03:56 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh053 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569619 | 2019-02-10 00:00:21 | 2019-02-10 14:52:33 | 2019-02-10 15:52:33 | 1:00:00 | 0:06:00 | 0:54:00 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh099 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569620 | 2019-02-10 00:00:22 | 2019-02-10 14:52:34 | 2019-02-10 15:46:33 | 0:53:59 | 0:05:52 | 0:48:07 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh043 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569621 | 2019-02-10 00:00:22 | 2019-02-10 14:52:33 | 2019-02-10 15:50:33 | 0:58:00 | 0:05:55 | 0:52:05 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} | 3 | |
Failure Reason:
Command failed on ovh083 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569622 | 2019-02-10 00:00:23 | 2019-02-10 14:52:33 | 2019-02-10 15:50:33 | 0:58:00 | 0:05:59 | 0:52:01 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh055 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569623 | 2019-02-10 00:00:23 | 2019-02-10 14:52:34 | 2019-02-10 15:50:34 | 0:58:00 | 0:05:52 | 0:52:08 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh066 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569624 | 2019-02-10 00:00:24 | 2019-02-10 14:52:34 | 2019-02-10 15:42:34 | 0:50:00 | 0:05:50 | 0:44:10 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh073 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569625 | 2019-02-10 00:00:25 | 2019-02-10 15:04:33 | 2019-02-10 15:56:33 | 0:52:00 | 0:06:17 | 0:45:43 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/libcephfs_interface_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh082 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569626 | 2019-02-10 00:00:25 | 2019-02-10 15:42:35 | 2019-02-10 16:28:35 | 0:46:00 | 0:05:54 | 0:40:06 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} | 3 | |
Failure Reason:
Command failed on ovh073 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569627 | 2019-02-10 00:00:26 | 2019-02-10 15:46:36 | 2019-02-10 16:32:35 | 0:45:59 | 0:05:42 | 0:40:17 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh066 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569628 | 2019-02-10 00:00:26 | 2019-02-10 15:50:35 | 2019-02-10 16:42:35 | 0:52:00 | 0:06:10 | 0:45:50 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_bench.yaml} | 3 | |
Failure Reason:
Command failed on ovh095 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569629 | 2019-02-10 00:00:27 | 2019-02-10 15:50:36 | 2019-02-10 16:36:35 | 0:45:59 | 0:05:54 | 0:40:05 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cache_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh042 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569630 | 2019-02-10 00:00:27 | 2019-02-10 15:50:36 | 2019-02-10 16:06:35 | 0:15:59 | 0:02:50 | 0:13:09 | ovh | master | ubuntu | 16.04 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh044.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh044', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh043.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh043', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh051.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh051', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh016.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh016', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}}
fail | 3569631 | 2019-02-10 00:00:28 | 2019-02-10 15:50:36 | 2019-02-10 16:38:35 | 0:47:59 | 0:06:07 | 0:41:52 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cls_all.yaml} | 3 | |
Failure Reason:
Command failed on ovh080 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569632 | 2019-02-10 00:00:29 | 2019-02-10 15:52:35 | 2019-02-10 16:44:35 | 0:52:00 | 0:06:00 | 0:46:00 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_ec_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh029 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569633 | 2019-02-10 00:00:29 | 2019-02-10 15:56:37 | 2019-02-10 16:50:37 | 0:54:00 | 0:05:57 | 0:48:03 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_python.yaml} | 3 | |
Failure Reason:
Command failed on ovh039 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569634 | 2019-02-10 00:00:30 | 2019-02-10 16:02:45 | 2019-02-10 17:04:45 | 1:02:00 | 0:06:07 | 0:55:53 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_workunit_loadgen_mix.yaml} | 3 | |
Failure Reason:
Command failed on ovh082 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569635 | 2019-02-10 00:00:30 | 2019-02-10 16:06:38 | 2019-02-10 16:54:38 | 0:48:00 | 0:06:19 | 0:41:41 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh051 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569636 | 2019-02-10 00:00:31 | 2019-02-10 16:14:46 | 2019-02-10 17:00:45 | 0:45:59 | 0:06:18 | 0:39:41 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} | 3 | |
Failure Reason:
Command failed on ovh076 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569637 | 2019-02-10 00:00:32 | 2019-02-10 16:28:37 | 2019-02-10 17:14:37 | 0:46:00 | 0:05:53 | 0:40:07 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_fsx.yaml} | 3 | |
Failure Reason:
Command failed on ovh061 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569638 | 2019-02-10 00:00:32 | 2019-02-10 16:32:38 | 2019-02-10 17:18:38 | 0:46:00 | 0:05:41 | 0:40:19 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_python_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh066 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569639 | 2019-02-10 00:00:33 | 2019-02-10 16:36:38 | 2019-02-10 17:24:38 | 0:48:00 | 0:06:04 | 0:41:56 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh083 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569640 | 2019-02-10 00:00:33 | 2019-02-10 16:38:38 | 2019-02-10 17:24:38 | 0:46:00 | 0:05:48 | 0:40:12 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_ec_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh009 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569641 | 2019-02-10 00:00:34 | 2019-02-10 16:42:47 | 2019-02-10 17:32:47 | 0:50:00 | 0:05:43 | 0:44:17 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh089 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3569642 | 2019-02-10 00:00:34 | 2019-02-10 16:44:38 | 2019-02-10 17:36:38 | 0:52:00 | 0:06:05 | 0:45:55 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_swift.yaml} | 3 | |
Failure Reason:
Command failed on ovh099 with status 1: '\n sudo yum -y install ceph-radosgw\n '
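
Two failure modes dominate this run: the rhel 7.4 jobs all fail on the same `sudo yum -y install ceph-radosgw` step, while the ubuntu 16.04 jobs fail during ansible user setup with `Group kvm does not exist` (the `user` module was asked to add `ubuntu` to `fuse`, `kvm`, and `disk`). A minimal standalone check for the latter, as a hypothetical diagnostic run directly on a node rather than anything the teuthology run itself does:

```shell
# Check which of the groups ansible tried to add the 'ubuntu' user to
# (per the 'groups' key in the module_args above) actually exist here.
for g in fuse kvm disk; do
  if getent group "$g" >/dev/null 2>&1; then
    echo "group $g: present"
  else
    echo "group $g: missing"
  fi
done
```

On the failing ovh nodes this would presumably report `kvm` as missing; creating the group beforehand (`sudo groupadd kvm`) or dropping it from the ansible `groups` list would be the usual ways to clear that error.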