User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail |
---|---|---|---|---|---|---|---|---|---|
teuthology | 2018-12-01 07:00:03 | 2018-12-01 07:07:51 | 2018-12-01 09:24:14 | 2:16:23 | smoke | master | ovh | 6d64cf5 | 28 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes | Links |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3300507 | 2018-12-01 07:00:28 | 2018-12-01 07:04:15 | 2018-12-01 07:16:14 | 0:11:59 | 0:02:43 | 0:09:16 | ovh | master | ubuntu | 16.04 | smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} | 1 | |
Failure Reason:
{'ovh040.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh040', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
||||||||||||||
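The Ansible failure reasons in this report are rendered as Python-literal dicts keyed by hostname. A minimal sketch for pulling out just the per-host error message (the abbreviated `raw` string below is illustrative; the real cell also carries the full `module_args`):

```python
import ast

# Illustrative, abbreviated version of the failure-reason cell above;
# the real dict also includes the Ansible module_args.
raw = ("{'ovh040.front.sepia.ceph.com': "
       "{'changed': False, 'msg': 'Group kvm does not exist'}}")

result = ast.literal_eval(raw)  # the cell is a Python repr, not JSON
msgs = {host: info['msg'] for host, info in result.items()}
print(msgs)
# → {'ovh040.front.sepia.ceph.com': 'Group kvm does not exist'}
```

`ast.literal_eval` is used rather than `json.loads` because the cells use single quotes, `True`/`False`, and `None`, which are Python literals, not JSON.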
fail | 3300508 | 2018-12-01 07:00:29 | 2018-12-01 07:07:51 | 2018-12-01 08:05:51 | 0:58:00 | 0:06:14 | 0:51:46 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh092 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
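Most of the jobs below fail with the same `yum -y install ceph-radosgw` error, differing only in which ovh node ran the job. A small sketch (the `normalize` helper is hypothetical, not part of teuthology) for collapsing per-host variation so identical failures group into one bucket:

```python
import re

# Hypothetical helper: strip the node name so identical failures on
# different ovh hosts compare equal.
def normalize(reason: str) -> str:
    return re.sub(r'\bovh\d+\b', '<host>', reason).strip()

r1 = "Command failed on ovh092 with status 1: 'sudo yum -y install ceph-radosgw'"
r2 = "Command failed on ovh012 with status 1: 'sudo yum -y install ceph-radosgw'"
print(normalize(r1) == normalize(r2))
# → True
```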
fail | 3300509 | 2018-12-01 07:00:30 | 2018-12-01 07:09:43 | 2018-12-01 08:11:43 | 1:02:00 | 0:03:02 | 0:58:58 | ovh | master | centos | 7.5 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh001.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'directory_mode': None, 'force': True, 'remote_src': None, 'dest': '/usr/lib64/nagios/plugins/check_mem.sh', 'selevel': None, 'original_basename': 'check_mem.sh', 'regexp': None, 'owner': 'root', 'follow': False, 'validate': None, 'local_follow': None, 'src': '/home/ubuntu/.ansible/tmp/ansible-tmp-1543651748.83-114421667578191/source', 'group': 'root', 'unsafe_writes': None, 'delimiter': None, 'seuser': None, 'serole': None, 'content': None, 'setype': None, 'mode': 493, 'attributes': None, 'backup': False}}, '_ansible_no_log': False, 'diff': [], 'msg': 'Destination directory /usr/lib64/nagios/plugins does not exist', 'checksum': '39df629b10ded370443e2e4c84d690332c95104d', 'changed': False}, 'ovh076.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'directory_mode': None, 'force': True, 'remote_src': None, 'dest': '/usr/lib64/nagios/plugins/check_mem.sh', 'selevel': None, 'original_basename': 'check_mem.sh', 'regexp': None, 'owner': 'root', 'follow': False, 'validate': None, 'local_follow': None, 'src': '/home/ubuntu/.ansible/tmp/ansible-tmp-1543651764.81-159846218807621/source', 'group': 'root', 'unsafe_writes': None, 'delimiter': None, 'seuser': None, 'serole': None, 'content': None, 'setype': None, 'mode': 493, 'attributes': None, 'backup': False}}, '_ansible_no_log': False, 'diff': [], 'msg': 'Destination directory /usr/lib64/nagios/plugins does not exist', 'checksum': '39df629b10ded370443e2e4c84d690332c95104d', 'changed': False}, 'ovh051.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'directory_mode': None, 'force': True, 'remote_src': None, 'dest': '/usr/lib64/nagios/plugins/check_mem.sh', 'selevel': None, 'original_basename': 'check_mem.sh', 'regexp': None, 'owner': 'root', 'follow': False, 'validate': None, 'local_follow': None, 'src': '/home/ubuntu/.ansible/tmp/ansible-tmp-1543651766.14-39567716091701/source', 'group': 'root', 'unsafe_writes': None, 'delimiter': None, 'seuser': None, 'serole': None, 'content': None, 'setype': None, 'mode': 493, 'attributes': None, 'backup': False}}, '_ansible_no_log': False, 'diff': [], 'msg': 'Destination directory /usr/lib64/nagios/plugins does not exist', 'checksum': '39df629b10ded370443e2e4c84d690332c95104d', 'changed': False}, 'ovh063.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'directory_mode': None, 'force': True, 'remote_src': None, 'dest': '/usr/lib64/nagios/plugins/check_mem.sh', 'selevel': None, 'original_basename': 'check_mem.sh', 'regexp': None, 'owner': 'root', 'follow': False, 'validate': None, 'local_follow': None, 'src': '/home/ubuntu/.ansible/tmp/ansible-tmp-1543651768.83-12480548697761/source', 'group': 'root', 'unsafe_writes': None, 'delimiter': None, 'seuser': None, 'serole': None, 'content': None, 'setype': None, 'mode': 493, 'attributes': None, 'backup': False}}, '_ansible_no_log': False, 'diff': [], 'msg': 'Destination directory /usr/lib64/nagios/plugins does not exist', 'checksum': '39df629b10ded370443e2e4c84d690332c95104d', 'changed': False}} |
||||||||||||||
fail | 3300510 | 2018-12-01 07:00:30 | 2018-12-01 07:12:18 | 2018-12-01 08:04:18 | 0:52:00 | 0:06:31 | 0:45:29 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh012 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300511 | 2018-12-01 07:00:31 | 2018-12-01 07:13:37 | 2018-12-01 08:01:37 | 0:48:00 | 0:06:08 | 0:41:52 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/cfuse_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh072 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300512 | 2018-12-01 07:00:32 | 2018-12-01 07:15:59 | 2018-12-01 08:07:59 | 0:52:00 | 0:06:18 | 0:45:42 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh022 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300513 | 2018-12-01 07:00:33 | 2018-12-01 07:15:59 | 2018-12-01 08:13:59 | 0:58:00 | 0:06:34 | 0:51:26 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/kclient_workunit_direct_io.yaml} | 3 | |
Failure Reason:
Command failed on ovh009 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300514 | 2018-12-01 07:00:34 | 2018-12-01 07:16:15 | 2018-12-01 08:20:16 | 1:04:01 | 0:06:18 | 0:57:43 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/kclient_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh011 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300515 | 2018-12-01 07:00:35 | 2018-12-01 07:18:03 | 2018-12-01 08:14:03 | 0:56:00 | 0:06:17 | 0:49:43 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/kclient_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh083 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300516 | 2018-12-01 07:00:36 | 2018-12-01 07:21:58 | 2018-12-01 08:35:58 | 1:14:00 | 0:06:24 | 1:07:36 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/kclient_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh045 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300517 | 2018-12-01 07:00:37 | 2018-12-01 07:27:14 | 2018-12-01 08:23:14 | 0:56:00 | 0:06:19 | 0:49:41 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/libcephfs_interface_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh084 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300518 | 2018-12-01 07:00:38 | 2018-12-01 07:32:59 | 2018-12-01 08:38:59 | 1:06:00 | 0:06:24 | 0:59:36 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/mon_thrash.yaml} | 3 | |
Failure Reason:
Command failed on ovh015 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300519 | 2018-12-01 07:00:39 | 2018-12-01 07:38:28 | 2018-12-01 08:32:28 | 0:54:00 | 0:06:23 | 0:47:37 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh064 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300520 | 2018-12-01 07:00:39 | 2018-12-01 07:40:31 | 2018-12-01 08:44:31 | 1:04:00 | 0:06:28 | 0:57:32 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_bench.yaml} | 3 | |
Failure Reason:
Command failed on ovh100 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300521 | 2018-12-01 07:00:40 | 2018-12-01 07:42:22 | 2018-12-01 09:00:22 | 1:18:00 | 0:06:06 | 1:11:54 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_cache_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh029 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300522 | 2018-12-01 07:00:41 | 2018-12-01 07:44:22 | 2018-12-01 08:12:22 | 0:28:00 | 0:02:51 | 0:25:09 | ovh | master | ubuntu | 16.04 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh031.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh031', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh072.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh072', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh070.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh070', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh069.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh069', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
||||||||||||||
fail | 3300523 | 2018-12-01 07:00:42 | 2018-12-01 07:51:50 | 2018-12-01 08:45:50 | 0:54:00 | 0:06:12 | 0:47:48 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_cls_all.yaml} | 3 | |
Failure Reason:
Command failed on ovh048 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300524 | 2018-12-01 07:00:43 | 2018-12-01 08:01:50 | 2018-12-01 08:53:50 | 0:52:00 | 0:06:26 | 0:45:34 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_ec_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh088 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300525 | 2018-12-01 07:00:43 | 2018-12-01 08:01:52 | 2018-12-01 08:59:52 | 0:58:00 | 0:06:10 | 0:51:50 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_python.yaml} | 3 | |
Failure Reason:
Command failed on ovh033 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300526 | 2018-12-01 07:00:44 | 2018-12-01 08:04:30 | 2018-12-01 08:56:30 | 0:52:00 | 0:06:28 | 0:45:32 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_workunit_loadgen_mix.yaml} | 3 | |
Failure Reason:
Command failed on ovh009 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300527 | 2018-12-01 07:00:45 | 2018-12-01 08:06:05 | 2018-12-01 09:06:05 | 1:00:00 | 0:06:20 | 0:53:40 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh037 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300528 | 2018-12-01 07:00:46 | 2018-12-01 08:08:13 | 2018-12-01 09:12:14 | 1:04:01 | 0:06:35 | 0:57:26 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_cli_import_export.yaml} | 3 | |
Failure Reason:
Command failed on ovh053 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300529 | 2018-12-01 07:00:46 | 2018-12-01 08:09:53 | 2018-12-01 08:57:53 | 0:48:00 | 0:06:13 | 0:41:47 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_fsx.yaml} | 3 | |
Failure Reason:
Command failed on ovh063 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300530 | 2018-12-01 07:00:47 | 2018-12-01 08:10:26 | 2018-12-01 08:58:26 | 0:48:00 | 0:06:14 | 0:41:46 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_python_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh041 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300531 | 2018-12-01 07:00:48 | 2018-12-01 08:11:57 | 2018-12-01 09:05:57 | 0:54:00 | 0:06:17 | 0:47:43 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh031 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300532 | 2018-12-01 07:00:49 | 2018-12-01 08:11:57 | 2018-12-01 09:01:57 | 0:50:00 | 0:06:18 | 0:43:42 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rgw_ec_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh069 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300533 | 2018-12-01 07:00:50 | 2018-12-01 08:12:23 | 2018-12-01 09:22:23 | 1:10:00 | 0:06:53 | 1:03:07 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rgw_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh082 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
||||||||||||||
fail | 3300534 | 2018-12-01 07:00:51 | 2018-12-01 08:14:14 | 2018-12-01 09:24:14 | 1:10:00 | 0:06:19 | 1:03:41 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rgw_swift.yaml} | 3 | |
Failure Reason:
Command failed on ovh067 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
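Across the run, the 28 failures reduce to three patterns. A quick tally (counts read off the table above; the one-line labels paraphrase the full failure reasons):

```python
from collections import Counter

# Counts taken from the jobs table above; labels paraphrase the reasons.
failures = (
    ["yum -y install ceph-radosgw exited 1 (RHEL 7.5)"] * 25
    + ["Ansible: 'Group kvm does not exist' (Ubuntu 16.04)"] * 2
    + ["Ansible: /usr/lib64/nagios/plugins missing (CentOS 7.5)"] * 1
)
counts = Counter(failures)
for reason, n in counts.most_common():
    print(n, reason)
# → 25 yum -y install ceph-radosgw exited 1 (RHEL 7.5)
# → 2 Ansible: 'Group kvm does not exist' (Ubuntu 16.04)
# → 1 Ansible: /usr/lib64/nagios/plugins missing (CentOS 7.5)
```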