User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail |
---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-01-13 00:00:03 | 2019-01-13 00:00:38 | 2019-01-13 01:36:40 | 1:36:02 | smoke | master | ovh | 82994c4 | 28 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes | Links |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3456498 | 2019-01-13 00:00:23 | 2019-01-13 00:00:38 | 2019-01-13 00:12:37 | 0:11:59 | 0:02:39 | 0:09:20 | ovh | master | ubuntu | 16.04 | smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} | 1 | |
Failure Reason:
{'ovh054.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh054', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
fail | 3456499 | 2019-01-13 00:00:24 | 2019-01-13 00:00:39 | 2019-01-13 01:10:39 | 1:10:00 | 0:06:06 | 1:03:54 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh039 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456500 | 2019-01-13 00:00:25 | 2019-01-13 00:00:40 | 2019-01-13 01:36:40 | 1:36:00 | 0:18:37 | 1:17:23 | ovh | master | centos | 7.5 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
ceph-deploy: Failed to zap osds |
fail | 3456501 | 2019-01-13 00:00:25 | 2019-01-13 00:00:40 | 2019-01-13 01:06:40 | 1:06:00 | 0:06:19 | 0:59:41 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh074 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456502 | 2019-01-13 00:00:26 | 2019-01-13 00:00:40 | 2019-01-13 01:14:40 | 1:14:00 | 0:06:09 | 1:07:51 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh049 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456503 | 2019-01-13 00:00:27 | 2019-01-13 00:00:41 | 2019-01-13 01:16:41 | 1:16:00 | 0:06:29 | 1:09:31 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh063 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456504 | 2019-01-13 00:00:28 | 2019-01-13 00:00:39 | 2019-01-13 01:06:38 | 1:05:59 | 0:06:05 | 0:59:54 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} | 3 | |
Failure Reason:
Command failed on ovh053 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456505 | 2019-01-13 00:00:29 | 2019-01-13 00:00:39 | 2019-01-13 01:14:39 | 1:14:00 | 0:06:09 | 1:07:51 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh081 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456506 | 2019-01-13 00:00:29 | 2019-01-13 00:00:39 | 2019-01-13 01:08:39 | 1:08:00 | 0:06:03 | 1:01:57 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh099 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456507 | 2019-01-13 00:00:30 | 2019-01-13 00:00:42 | 2019-01-13 01:08:41 | 1:07:59 | 0:06:17 | 1:01:42 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh038 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456508 | 2019-01-13 00:00:31 | 2019-01-13 00:00:41 | 2019-01-13 01:08:41 | 1:08:00 | 0:06:26 | 1:01:34 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/libcephfs_interface_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh048 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456509 | 2019-01-13 00:00:32 | 2019-01-13 00:00:39 | 2019-01-13 01:12:39 | 1:12:00 | 0:06:18 | 1:05:42 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} | 3 | |
Failure Reason:
Command failed on ovh003 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456510 | 2019-01-13 00:00:32 | 2019-01-13 00:00:41 | 2019-01-13 01:14:46 | 1:14:05 | 0:06:15 | 1:07:50 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh029 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456511 | 2019-01-13 00:00:33 | 2019-01-13 00:00:39 | 2019-01-13 01:04:39 | 1:04:00 | 0:06:09 | 0:57:51 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_bench.yaml} | 3 | |
Failure Reason:
Command failed on ovh020 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456512 | 2019-01-13 00:00:34 | 2019-01-13 00:00:41 | 2019-01-13 01:12:42 | 1:12:01 | 0:05:55 | 1:06:06 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cache_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh089 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456513 | 2019-01-13 00:00:35 | 2019-01-13 00:00:39 | 2019-01-13 00:20:38 | 0:19:59 | 0:02:56 | 0:17:03 | ovh | master | ubuntu | 16.04 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh072.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh072', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh001.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh001', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh044.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh044', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh059.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh059', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
fail | 3456514 | 2019-01-13 00:00:35 | 2019-01-13 00:00:41 | 2019-01-13 01:04:41 | 1:04:00 | 0:06:02 | 0:57:58 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cls_all.yaml} | 3 | |
Failure Reason:
Command failed on ovh006 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456515 | 2019-01-13 00:00:36 | 2019-01-13 00:00:40 | 2019-01-13 01:02:40 | 1:02:00 | 0:06:17 | 0:55:43 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_ec_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh023 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456516 | 2019-01-13 00:00:37 | 2019-01-13 00:00:42 | 2019-01-13 01:08:42 | 1:08:00 | 0:06:41 | 1:01:19 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_python.yaml} | 3 | |
Failure Reason:
Command failed on ovh075 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456517 | 2019-01-13 00:00:38 | 2019-01-13 00:00:40 | 2019-01-13 01:06:39 | 1:05:59 | 0:06:38 | 0:59:21 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_workunit_loadgen_mix.yaml} | 3 | |
Failure Reason:
Command failed on ovh055 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456518 | 2019-01-13 00:00:38 | 2019-01-13 00:00:41 | 2019-01-13 01:14:40 | 1:13:59 | 0:06:09 | 1:07:50 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh028 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456519 | 2019-01-13 00:00:39 | 2019-01-13 00:00:41 | 2019-01-13 01:16:41 | 1:16:00 | 0:06:22 | 1:09:38 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} | 3 | |
Failure Reason:
Command failed on ovh027 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456520 | 2019-01-13 00:00:40 | 2019-01-13 00:00:42 | 2019-01-13 01:02:42 | 1:02:00 | 0:05:57 | 0:56:03 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_fsx.yaml} | 3 | |
Failure Reason:
Command failed on ovh033 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456521 | 2019-01-13 00:00:41 | 2019-01-13 00:00:42 | 2019-01-13 01:10:43 | 1:10:01 | 0:06:14 | 1:03:47 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_python_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh046 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456522 | 2019-01-13 00:00:42 | 2019-01-13 00:00:43 | 2019-01-13 01:14:43 | 1:14:00 | 0:06:13 | 1:07:47 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh096 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456523 | 2019-01-13 00:00:43 | 2019-01-13 00:00:44 | 2019-01-13 01:24:44 | 1:24:00 | 0:06:15 | 1:17:45 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_ec_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh087 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456524 | 2019-01-13 00:00:43 | 2019-01-13 00:00:45 | 2019-01-13 01:14:45 | 1:14:00 | 0:06:19 | 1:07:41 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh056 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3456525 | 2019-01-13 00:00:44 | 2019-01-13 00:00:45 | 2019-01-13 01:06:46 | 1:06:01 | 0:06:07 | 0:59:54 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_swift.yaml} | 3 | |
Failure Reason:
Command failed on ovh001 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |