User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail |
---|---|---|---|---|---|---|---|---|---|
teuthology | 2018-11-29 07:00:03 | 2018-11-29 07:14:57 | 2018-11-29 10:21:12 | 3:06:15 | smoke | master | ovh | 590babe | 28 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes | Links |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3292177 | 2018-11-29 07:00:27 | 2018-11-29 07:01:07 | 2018-11-29 08:11:07 | 1:10:00 | 0:02:39 | 1:07:21 | ovh | master | ubuntu | 16.04 | smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} | 1 | |
Failure Reason:
{'ovh082.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh082', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
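Jobs 3292177 and 3292192 die in the ansible pre-flight with `Group kvm does not exist`: the playbook tries to append the `ubuntu` user to the `fuse`, `kvm` and `disk` groups, and fails when one is absent on the node. A minimal pre-flight sketch (group list copied from the failure above; this only reports — actually creating a group needs root, e.g. `sudo groupadd kvm`):

```shell
# Report which of the groups ansible expects are present on this host.
check_groups() {
    for g in fuse kvm disk; do
        if getent group "$g" >/dev/null 2>&1; then
            echo "$g: present"
        else
            echo "$g: missing"   # fix on the node: sudo groupadd $g
        fi
    done
}
check_groups
```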
fail | 3292178 | 2018-11-29 07:00:28 | 2018-11-29 07:03:25 | 2018-11-29 08:25:26 | 1:22:01 | 0:06:20 | 1:15:41 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh080 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
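This is the dominant failure: 25 of the 28 jobs stop at the same `sudo yum -y install ceph-radosgw` step on rhel 7.5. The report only records exit status 1, not the yum output, so the cause (missing/misconfigured ceph repo, mirror outage, dependency conflict) cannot be read off here. A dry-run sketch of the first manual diagnostics on a failing node — the commands are general yum checks, not teuthology-specific, and need root when run for real:

```shell
# Print (not run) the manual diagnostics for the ceph-radosgw install failure.
radosgw_diag() {
    echo 'yum repolist enabled                     # is a ceph repo configured at all?'
    echo 'yum --showduplicates list ceph-radosgw   # is the package resolvable?'
    echo 'yum -y install ceph-radosgw              # re-run the failing step for full output'
}
radosgw_diag
```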
fail | 3292179 | 2018-11-29 07:00:28 | 2018-11-29 07:06:29 | 2018-11-29 09:18:30 | 2:12:01 | 0:03:15 | 2:08:46 | ovh | master | centos | 7.5 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh053.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'directory_mode': None, 'force': True, 'remote_src': None, 'dest': '/usr/lib64/nagios/plugins/check_mem.sh', 'selevel': None, 'original_basename': 'check_mem.sh', 'regexp': None, 'owner': 'root', 'follow': False, 'validate': None, 'local_follow': None, 'src': '/home/ubuntu/.ansible/tmp/ansible-tmp-1543482154.56-277407753503752/source', 'group': 'root', 'unsafe_writes': None, 'delimiter': None, 'seuser': None, 'serole': None, 'content': None, 'setype': None, 'mode': 493, 'attributes': None, 'backup': False}}, '_ansible_no_log': False, 'diff': [], 'msg': 'Destination directory /usr/lib64/nagios/plugins does not exist', 'checksum': '39df629b10ded370443e2e4c84d690332c95104d', 'changed': False}, 'ovh054.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'directory_mode': None, 'force': True, 'remote_src': None, 'dest': '/usr/lib64/nagios/plugins/check_mem.sh', 'selevel': None, 'original_basename': 'check_mem.sh', 'regexp': None, 'owner': 'root', 'follow': False, 'validate': None, 'local_follow': None, 'src': '/home/ubuntu/.ansible/tmp/ansible-tmp-1543482150.09-249165465496188/source', 'group': 'root', 'unsafe_writes': None, 'delimiter': None, 'seuser': None, 'serole': None, 'content': None, 'setype': None, 'mode': 493, 'attributes': None, 'backup': False}}, '_ansible_no_log': False, 'diff': [], 'msg': 'Destination directory /usr/lib64/nagios/plugins does not exist', 'checksum': '39df629b10ded370443e2e4c84d690332c95104d', 'changed': False}, 'ovh045.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'directory_mode': None, 'force': True, 'remote_src': None, 'dest': '/usr/lib64/nagios/plugins/check_mem.sh', 'selevel': None, 'original_basename': 'check_mem.sh', 'regexp': None, 'owner': 'root', 'follow': False, 'validate': None, 'local_follow': None, 'src': '/home/ubuntu/.ansible/tmp/ansible-tmp-1543482151.41-218543591496396/source', 'group': 'root', 'unsafe_writes': None, 'delimiter': None, 'seuser': None, 'serole': None, 'content': None, 'setype': None, 'mode': 493, 'attributes': None, 'backup': False}}, '_ansible_no_log': False, 'diff': [], 'msg': 'Destination directory /usr/lib64/nagios/plugins does not exist', 'checksum': '39df629b10ded370443e2e4c84d690332c95104d', 'changed': False}, 'ovh012.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'directory_mode': None, 'force': True, 'remote_src': None, 'dest': '/usr/lib64/nagios/plugins/check_mem.sh', 'selevel': None, 'original_basename': 'check_mem.sh', 'regexp': None, 'owner': 'root', 'follow': False, 'validate': None, 'local_follow': None, 'src': '/home/ubuntu/.ansible/tmp/ansible-tmp-1543482164.77-10881780049557/source', 'group': 'root', 'unsafe_writes': None, 'delimiter': None, 'seuser': None, 'serole': None, 'content': None, 'setype': None, 'mode': 493, 'attributes': None, 'backup': False}}, '_ansible_no_log': False, 'diff': [], 'msg': 'Destination directory /usr/lib64/nagios/plugins does not exist', 'checksum': '39df629b10ded370443e2e4c84d690332c95104d', 'changed': False}} |
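Job 3292179 fails identically on all four nodes: ansible copies `check_mem.sh` to `/usr/lib64/nagios/plugins/check_mem.sh`, and the destination directory does not exist. A small guard sketch (path copied from the failure; creating it for real needs root, and is usually better done by installing the nagios plugins package or adding a `file: state=directory` task before the copy):

```shell
# Report whether a copy destination directory exists before copying into it.
dir_status() {
    if [ -d "$1" ]; then
        echo "exists"
    else
        echo "missing"   # fix on the node: sudo install -d -m 0755 "$1"
    fi
}
dir_status /usr/lib64/nagios/plugins
```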
fail | 3292180 | 2018-11-29 07:00:29 | 2018-11-29 07:10:52 | 2018-11-29 08:34:53 | 1:24:01 | 0:06:20 | 1:17:41 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh096 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292181 | 2018-11-29 07:00:30 | 2018-11-29 07:14:49 | 2018-11-29 08:12:49 | 0:58:00 | 0:06:11 | 0:51:49 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/cfuse_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh028 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292182 | 2018-11-29 07:00:31 | 2018-11-29 07:14:57 | 2018-11-29 08:08:57 | 0:54:00 | 0:06:10 | 0:47:50 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh088 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292183 | 2018-11-29 07:00:31 | 2018-11-29 07:16:35 | 2018-11-29 08:24:36 | 1:08:01 | 0:06:20 | 1:01:41 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/kclient_workunit_direct_io.yaml} | 3 | |
Failure Reason:
Command failed on ovh070 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292184 | 2018-11-29 07:00:32 | 2018-11-29 07:22:14 | 2018-11-29 08:46:14 | 1:24:00 | 0:06:22 | 1:17:38 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/kclient_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on ovh050 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292185 | 2018-11-29 07:00:33 | 2018-11-29 07:24:47 | 2018-11-29 09:04:47 | 1:40:00 | 0:06:07 | 1:33:53 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/kclient_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh009 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292186 | 2018-11-29 07:00:34 | 2018-11-29 07:46:37 | 2018-11-29 08:54:37 | 1:08:00 | 0:06:16 | 1:01:44 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/kclient_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh087 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292187 | 2018-11-29 07:00:34 | 2018-11-29 07:51:01 | 2018-11-29 08:47:01 | 0:56:00 | 0:06:25 | 0:49:35 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/libcephfs_interface_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh023 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292188 | 2018-11-29 07:00:35 | 2018-11-29 07:54:17 | 2018-11-29 08:52:17 | 0:58:00 | 0:06:17 | 0:51:43 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/mon_thrash.yaml} | 3 | |
Failure Reason:
Command failed on ovh085 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292189 | 2018-11-29 07:00:36 | 2018-11-29 07:54:32 | 2018-11-29 09:42:33 | 1:48:01 | 0:06:20 | 1:41:41 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh039 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292190 | 2018-11-29 07:00:37 | 2018-11-29 08:09:11 | 2018-11-29 09:05:11 | 0:56:00 | 0:06:10 | 0:49:50 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_bench.yaml} | 3 | |
Failure Reason:
Command failed on ovh089 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292191 | 2018-11-29 07:00:37 | 2018-11-29 08:10:40 | 2018-11-29 09:22:41 | 1:12:01 | 0:06:25 | 1:05:36 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_cache_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh032 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292192 | 2018-11-29 07:00:38 | 2018-11-29 08:11:08 | 2018-11-29 08:45:08 | 0:34:00 | 0:03:32 | 0:30:28 | ovh | master | ubuntu | 16.04 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh034.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh034', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh070.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh070', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh057.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh057', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh095.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh095', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
fail | 3292193 | 2018-11-29 07:00:39 | 2018-11-29 08:12:59 | 2018-11-29 09:23:00 | 1:10:01 | 0:06:13 | 1:03:48 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_cls_all.yaml} | 3 | |
Failure Reason:
Command failed on ovh083 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292194 | 2018-11-29 07:00:40 | 2018-11-29 08:17:50 | 2018-11-29 09:29:51 | 1:12:01 | 0:06:12 | 1:05:49 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_ec_snaps.yaml} | 3 | |
Failure Reason:
Command failed on ovh055 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292195 | 2018-11-29 07:00:40 | 2018-11-29 08:24:50 | 2018-11-29 09:58:51 | 1:34:01 | 0:06:16 | 1:27:45 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_python.yaml} | 3 | |
Failure Reason:
Command failed on ovh090 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292196 | 2018-11-29 07:00:41 | 2018-11-29 08:25:27 | 2018-11-29 09:33:27 | 1:08:00 | 0:06:23 | 1:01:37 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_workunit_loadgen_mix.yaml} | 3 | |
Failure Reason:
Command failed on ovh031 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292197 | 2018-11-29 07:00:42 | 2018-11-29 08:26:37 | 2018-11-29 09:48:38 | 1:22:01 | 0:06:15 | 1:15:46 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh038 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292198 | 2018-11-29 07:00:42 | 2018-11-29 08:28:51 | 2018-11-29 09:32:51 | 1:04:00 | 0:06:31 | 0:57:29 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_cli_import_export.yaml} | 3 | |
Failure Reason:
Command failed on ovh076 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292199 | 2018-11-29 07:00:43 | 2018-11-29 08:35:07 | 2018-11-29 09:39:07 | 1:04:00 | 0:05:43 | 0:58:17 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_fsx.yaml} | 3 | |
Failure Reason:
Command failed on ovh063 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292200 | 2018-11-29 07:00:44 | 2018-11-29 08:45:11 | 2018-11-29 10:21:12 | 1:36:01 | 0:06:34 | 1:29:27 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_python_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh098 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292201 | 2018-11-29 07:00:45 | 2018-11-29 08:46:29 | 2018-11-29 10:10:29 | 1:24:00 | 0:06:18 | 1:17:42 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh048 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292202 | 2018-11-29 07:00:45 | 2018-11-29 08:47:02 | 2018-11-29 09:41:02 | 0:54:00 | 0:05:52 | 0:48:08 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rgw_ec_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh061 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292203 | 2018-11-29 07:00:46 | 2018-11-29 08:47:49 | 2018-11-29 09:49:49 | 1:02:00 | 0:06:12 | 0:55:48 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rgw_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh010 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
fail | 3292204 | 2018-11-29 07:00:47 | 2018-11-29 08:52:30 | 2018-11-29 09:56:31 | 1:04:01 | 0:06:14 | 0:57:47 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rgw_swift.yaml} | 3 | |
Failure Reason:
Command failed on ovh006 with status 1: '\n sudo yum -y install ceph-radosgw\n ' |
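Taken together, the 28 failures reduce to three root causes: 25 × the `ceph-radosgw` yum install on rhel 7.5, 2 × the missing `kvm` group (jobs 3292177 and 3292192, both ubuntu 16.04), and 1 × the missing nagios plugins directory (job 3292179, centos 7.5). A small triage sketch that buckets a raw failure reason by substring match (patterns copied from the reasons above; heuristic, not a teuthology tool):

```shell
# Bucket a raw teuthology failure reason into a coarse root cause.
classify() {
    case "$1" in
        *ceph-radosgw*)                    echo "yum install ceph-radosgw failed" ;;
        *"Group kvm does not exist"*)      echo "ansible: kvm group missing" ;;
        *"nagios/plugins does not exist"*) echo "ansible: nagios plugins dir missing" ;;
        *)                                 echo "other" ;;
    esac
}
classify "Command failed on ovh006 with status 1: sudo yum -y install ceph-radosgw"
```

Piping each Failure Reason line through `classify` and into `sort | uniq -c` reproduces the 25/2/1 split.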