User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Pass | Fail |
---|---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-04-05 07:00:03 | 2019-04-05 07:57:47 | 2019-04-05 12:55:55 | 4:58:08 | smoke | master | ovh | 9a60ae4 | 13 | 15 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes | Links |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3812965 | 2019-04-05 07:01:22 | 2019-04-05 07:49:45 | 2019-04-05 07:57:44 | 0:07:59 | 0:03:05 | 0:04:54 | ovh | master | ubuntu | 16.04 | smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} | 1 | |
Failure Reason:
{'ovh048.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh048', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
pass | 3812966 | 2019-04-05 07:01:23 | 2019-04-05 07:57:47 | 2019-04-05 09:15:47 | 1:18:00 | 0:36:20 | 0:41:40 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
fail | 3812967 | 2019-04-05 07:01:23 | 2019-04-05 07:57:48 | 2019-04-05 09:25:48 | 1:28:00 | 0:21:21 | 1:06:39 | ovh | master | centos | 7.5 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
ceph-deploy: Failed during gather keys |
fail | 3812968 | 2019-04-05 07:01:24 | 2019-04-05 07:59:44 | 2019-04-05 08:15:43 | 0:15:59 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on ovh045 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.1.0_rc1_ceph_g44dc3c3e1f1b-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://3.chacra.ceph.com/r/kernel/testing/44dc3c3e1f1b96551e80bc7929d9fec749364c68/centos/7/flavors/default/x86_64/ --input-file=-' |
fail | 3812969 | 2019-04-05 07:01:25 | 2019-04-05 08:05:42 | 2019-04-05 08:11:41 | 0:05:59 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed on ovh045 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.1.0_rc1_ceph_g44dc3c3e1f1b-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://3.chacra.ceph.com/r/kernel/testing/44dc3c3e1f1b96551e80bc7929d9fec749364c68/centos/7/flavors/default/x86_64/ --input-file=-' |
fail | 3812970 | 2019-04-05 07:01:26 | 2019-04-05 08:11:57 | 2019-04-05 09:25:57 | 1:14:00 | 0:29:48 | 0:44:12 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/pjd.sh) on ovh023 with status 2: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=736d396ee0c9df05ba6120d16b1892a87d9d3d71 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh' |
pass | 3812971 | 2019-04-05 07:01:26 | 2019-04-05 08:15:47 | 2019-04-05 09:41:47 | 1:26:00 | 0:31:10 | 0:54:50 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} | 3 | |
fail | 3812972 | 2019-04-05 07:01:27 | 2019-04-05 08:39:45 | 2019-04-05 10:01:46 | 1:22:01 | 0:28:37 | 0:53:24 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/dbench.sh) on ovh094 with status 127: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=736d396ee0c9df05ba6120d16b1892a87d9d3d71 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/dbench.sh' |
pass | 3812973 | 2019-04-05 07:01:28 | 2019-04-05 08:41:21 | 2019-04-05 10:13:21 | 1:32:00 | 0:44:17 | 0:47:43 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_fsstress.yaml} | 3 | |
fail | 3812974 | 2019-04-05 07:01:29 | 2019-04-05 08:53:45 | 2019-04-05 10:09:45 | 1:16:00 | 0:30:02 | 0:45:58 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/pjd.sh) on ovh078 with status 2: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=736d396ee0c9df05ba6120d16b1892a87d9d3d71 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh' |
fail | 3812975 | 2019-04-05 07:01:30 | 2019-04-05 09:01:47 | 2019-04-05 09:13:46 | 0:11:59 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/libcephfs_interface_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh080 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.1.0_rc1_ceph_g44dc3c3e1f1b-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://3.chacra.ceph.com/r/kernel/testing/44dc3c3e1f1b96551e80bc7929d9fec749364c68/centos/7/flavors/default/x86_64/ --input-file=-' |
fail | 3812976 | 2019-04-05 07:01:30 | 2019-04-05 09:14:00 | 2019-04-05 09:23:59 | 0:09:59 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} | 3 | |
Failure Reason:
Command failed on ovh099 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.1.0_rc1_ceph_g44dc3c3e1f1b-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://3.chacra.ceph.com/r/kernel/testing/44dc3c3e1f1b96551e80bc7929d9fec749364c68/centos/7/flavors/default/x86_64/ --input-file=-' |
fail | 3812977 | 2019-04-05 07:01:31 | 2019-04-05 09:15:52 | 2019-04-05 09:21:51 | 0:05:59 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_api_tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh093 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.1.0_rc1_ceph_g44dc3c3e1f1b-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://3.chacra.ceph.com/r/kernel/testing/44dc3c3e1f1b96551e80bc7929d9fec749364c68/centos/7/flavors/default/x86_64/ --input-file=-' |
pass | 3812978 | 2019-04-05 07:01:32 | 2019-04-05 09:19:55 | 2019-04-05 11:53:57 | 2:34:02 | 1:14:37 | 1:19:25 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_bench.yaml} | 3 | |
fail | 3812979 | 2019-04-05 07:01:33 | 2019-04-05 09:21:54 | 2019-04-05 11:39:56 | 2:18:02 | 1:34:37 | 0:43:25 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cache_snaps.yaml} | 3 | |
Failure Reason:
"2019-04-05 11:22:42.902911 mon.a (mon.0) 5391 : cluster [WRN] daemon mds.a is not responding, replacing it as rank 0 with standby daemon mds.b" in cluster log |
fail | 3812980 | 2019-04-05 07:01:33 | 2019-04-05 09:24:12 | 2019-04-05 09:44:12 | 0:20:00 | 0:03:19 | 0:16:41 | ovh | master | ubuntu | 16.04 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh047.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh047', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh066.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh066', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh054.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh054', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh093.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh093', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
pass | 3812981 | 2019-04-05 07:01:34 | 2019-04-05 09:25:51 | 2019-04-05 10:41:52 | 1:16:01 | 0:33:15 | 0:42:46 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cls_all.yaml} | 3 | |
pass | 3812982 | 2019-04-05 07:01:35 | 2019-04-05 09:25:59 | 2019-04-05 10:57:59 | 1:32:00 | 0:39:36 | 0:52:24 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_ec_snaps.yaml} | 3 | |
pass | 3812983 | 2019-04-05 07:01:36 | 2019-04-05 09:35:50 | 2019-04-05 10:55:51 | 1:20:01 | 0:36:05 | 0:43:56 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_python.yaml} | 3 | |
pass | 3812984 | 2019-04-05 07:01:36 | 2019-04-05 09:41:50 | 2019-04-05 11:09:51 | 1:28:01 | 0:44:02 | 0:43:59 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_workunit_loadgen_mix.yaml} | 3 | |
pass | 3812985 | 2019-04-05 07:01:37 | 2019-04-05 09:44:14 | 2019-04-05 12:06:16 | 2:22:02 | 1:31:18 | 0:50:44 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_api_tests.yaml} | 3 | |
pass | 3812986 | 2019-04-05 07:01:38 | 2019-04-05 10:01:49 | 2019-04-05 11:25:49 | 1:24:00 | 0:30:31 | 0:53:29 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} | 3 | |
pass | 3812987 | 2019-04-05 07:01:39 | 2019-04-05 10:09:48 | 2019-04-05 11:29:48 | 1:20:00 | 0:35:26 | 0:44:34 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_fsx.yaml} | 3 | |
pass | 3812988 | 2019-04-05 07:01:40 | 2019-04-05 10:13:35 | 2019-04-05 12:07:36 | 1:54:01 | 0:44:34 | 1:09:27 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_python_api_tests.yaml} | 3 | |
fail | 3812989 | 2019-04-05 07:01:41 | 2019-04-05 10:42:05 | 2019-04-05 11:56:05 | 1:14:00 | 0:29:39 | 0:44:21 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/iozone.sh) on ovh023 with status 127: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=736d396ee0c9df05ba6120d16b1892a87d9d3d71 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/iozone.sh' |
pass | 3812990 | 2019-04-05 07:01:42 | 2019-04-05 10:55:54 | 2019-04-05 12:55:55 | 2:00:01 | 1:16:48 | 0:43:13 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_ec_s3tests.yaml} | 3 | |
fail | 3812991 | 2019-04-05 07:01:42 | 2019-04-05 10:58:03 | 2019-04-05 11:10:02 | 0:11:59 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh089 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.1.0_rc1_ceph_g44dc3c3e1f1b-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://3.chacra.ceph.com/r/kernel/testing/44dc3c3e1f1b96551e80bc7929d9fec749364c68/centos/7/flavors/default/x86_64/ --input-file=-' |
fail | 3812992 | 2019-04-05 07:01:43 | 2019-04-05 11:09:54 | 2019-04-05 12:19:54 | 1:10:00 | 0:27:18 | 0:42:42 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_swift.yaml} | 3 | |
Failure Reason:
Command failed on ovh093 with status 1: 'cd /home/ubuntu/cephtest/swift && ./bootstrap' |