User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail | Dead |
---|---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-04-21 00:00:04 | 2019-04-21 00:20:28 | 2019-04-21 12:23:51 | 12:03:23 | smoke | master | ovh | 755e8c4 | 27 | 1 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3871511 | 2019-04-21 00:00:32 | 2019-04-21 00:12:25 | 2019-04-21 00:20:24 | 0:07:59 | 0:03:34 | 0:04:25 | ovh | master | ubuntu | 18.04 | smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} | 1 | |
Failure Reason:
{'ovh048.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh048', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}}
fail | 3871512 | 2019-04-21 00:00:33 | 2019-04-21 00:20:28 | 2019-04-21 01:08:28 | 0:48:00 | 0:06:02 | 0:41:58 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh011 with status 1: '\n sudo yum -y install ceph-radosgw\n '
dead | 3871513 | 2019-04-21 00:00:34 | 2019-04-21 00:21:40 | 2019-04-21 12:23:51 | 12:02:11 | | | ovh | master | centos | 7.6 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | — | |
fail | 3871514 | 2019-04-21 00:00:34 | 2019-04-21 00:22:05 | 2019-04-21 01:22:05 | 1:00:00 | 0:06:19 | 0:53:41 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh030 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871515 | 2019-04-21 00:00:35 | 2019-04-21 00:22:16 | 2019-04-21 01:16:16 | 0:54:00 | 0:06:22 | 0:47:38 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh078 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871516 | 2019-04-21 00:00:36 | 2019-04-21 00:26:11 | 2019-04-21 01:12:11 | 0:46:00 | 0:05:57 | 0:40:03 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh061 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871517 | 2019-04-21 00:00:36 | 2019-04-21 00:29:51 | 2019-04-21 01:21:51 | 0:52:00 | 0:06:00 | 0:46:00 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} | 3 | |
Failure Reason:
Command failed on ovh044 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871518 | 2019-04-21 00:00:37 | 2019-04-21 01:00:08 | 2019-04-21 01:54:08 | 0:54:00 | 0:06:34 | 0:47:26 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh023 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871519 | 2019-04-21 00:00:38 | 2019-04-21 01:08:31 | 2019-04-21 02:02:31 | 0:54:00 | 0:05:56 | 0:48:04 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh048 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871520 | 2019-04-21 00:00:39 | 2019-04-21 01:12:14 | 2019-04-21 02:02:14 | 0:50:00 | 0:06:10 | 0:43:50 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh003 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871521 | 2019-04-21 00:00:39 | 2019-04-21 01:16:20 | 2019-04-21 02:08:20 | 0:52:00 | 0:06:13 | 0:45:47 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/libcephfs_interface_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh030 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871522 | 2019-04-21 00:00:40 | 2019-04-21 01:22:06 | 2019-04-21 02:12:06 | 0:50:00 | 0:06:10 | 0:43:50 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} | 3 | |
Failure Reason:
Command failed on ovh078 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871523 | 2019-04-21 00:00:41 | 2019-04-21 01:22:06 | 2019-04-21 02:18:06 | 0:56:00 | 0:06:02 | 0:49:58 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh049 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871524 | 2019-04-21 00:00:42 | 2019-04-21 01:44:13 | 2019-04-21 02:36:13 | 0:52:00 | 0:06:13 | 0:45:47 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_bench.yaml} | 3 | |
Failure Reason:
Command failed on ovh090 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871525 | 2019-04-21 00:00:43 | 2019-04-21 01:54:17 | 2019-04-21 02:44:16 | 0:49:59 | 0:05:59 | 0:44:00 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cache_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh048 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871526 | 2019-04-21 00:00:44 | 2019-04-21 02:02:18 | 2019-04-21 02:28:17 | 0:25:59 | 0:04:02 | 0:21:57 | ovh | master | ubuntu | 18.04 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh010.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh010', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh079.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh079', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh069.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh069', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh024.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh024', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}}
fail | 3871527 | 2019-04-21 00:00:44 | 2019-04-21 02:02:32 | 2019-04-21 02:52:32 | 0:50:00 | 0:06:19 | 0:43:41 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cls_all.yaml} | 3 | |
Failure Reason:
Command failed on ovh003 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871528 | 2019-04-21 00:00:45 | 2019-04-21 02:08:24 | 2019-04-21 02:58:24 | 0:50:00 | 0:06:05 | 0:43:55 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_ec_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh030 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871529 | 2019-04-21 00:00:46 | 2019-04-21 02:12:18 | 2019-04-21 03:02:18 | 0:50:00 | 0:06:08 | 0:43:52 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_python.yaml} | 3 | |
Failure Reason:
Command failed on ovh054 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871530 | 2019-04-21 00:00:47 | 2019-04-21 02:18:10 | 2019-04-21 03:12:10 | 0:54:00 | 0:06:07 | 0:47:53 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_workunit_loadgen_mix.yaml} | 3 | |
Failure Reason:
Command failed on ovh098 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871531 | 2019-04-21 00:00:47 | 2019-04-21 02:28:21 | 2019-04-21 03:28:22 | 1:00:01 | 0:06:27 | 0:53:34 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh079 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871532 | 2019-04-21 00:00:48 | 2019-04-21 02:36:22 | 2019-04-21 03:28:27 | 0:52:05 | 0:06:02 | 0:46:03 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} | 3 | |
Failure Reason:
Command failed on ovh051 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871533 | 2019-04-21 00:00:49 | 2019-04-21 02:44:20 | 2019-04-21 03:38:20 | 0:54:00 | 0:06:03 | 0:47:57 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_fsx.yaml} | 3 | |
Failure Reason:
Command failed on ovh088 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871534 | 2019-04-21 00:00:50 | 2019-04-21 02:52:38 | 2019-04-21 03:44:43 | 0:52:05 | 0:06:03 | 0:46:02 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_python_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh044 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871535 | 2019-04-21 00:00:51 | 2019-04-21 02:58:29 | 2019-04-21 03:50:29 | 0:52:00 | 0:06:20 | 0:45:40 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh055 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871536 | 2019-04-21 00:00:51 | 2019-04-21 03:02:21 | 2019-04-21 03:54:21 | 0:52:00 | 0:06:33 | 0:45:27 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_ec_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh054 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871537 | 2019-04-21 00:00:52 | 2019-04-21 03:12:25 | 2019-04-21 04:06:25 | 0:54:00 | 0:06:06 | 0:47:54 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh049 with status 1: '\n sudo yum -y install ceph-radosgw\n '
fail | 3871538 | 2019-04-21 00:00:53 | 2019-04-21 03:28:27 | 2019-04-21 04:26:27 | 0:58:00 | 0:06:17 | 0:51:43 | ovh | master | rhel | 7.4 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_swift.yaml} | 3 | |
Failure Reason:
Command failed on ovh010 with status 1: '\n sudo yum -y install ceph-radosgw\n '
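Most jobs in this run failed the same way. A dump in the format above can be summarized mechanically; the sketch below is a hypothetical helper (`tally_failure_reasons` is not part of teuthology or pulpito) that assumes each reason sits on the single line following a `Failure Reason:` marker, as in this report.

```python
from collections import Counter


def tally_failure_reasons(text: str) -> Counter:
    """Count identical failure-reason lines in a pulpito-style text dump.

    Assumes each reason is the one line immediately after a line that
    reads exactly "Failure Reason:"; trailing table-cell pipes are stripped.
    """
    reasons = Counter()
    lines = text.splitlines()
    for i, line in enumerate(lines):
        if line.strip() == "Failure Reason:" and i + 1 < len(lines):
            reason = lines[i + 1].strip().rstrip("| ").strip()
            reasons[reason] += 1
    return reasons


# Minimal sample in the same shape as the table above (hypothetical job rows).
sample = """fail | 3871512 | ... | smoke/basic | 3 | |
Failure Reason:
Command failed on ovh011 with status 1: 'sudo yum -y install ceph-radosgw' |
fail | 3871514 | ... | smoke/basic | 3 | |
Failure Reason:
Command failed on ovh011 with status 1: 'sudo yum -y install ceph-radosgw' |
"""

counts = tally_failure_reasons(sample)
```

Sorting `counts.most_common()` surfaces the dominant failure (here, the repeated `yum -y install ceph-radosgw` error) without reading every row.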