User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Pass | Fail |
---|---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-03-19 07:00:03 | 2019-03-19 08:11:29 | 2019-03-19 13:29:47 | 5:18:18 | smoke | master | ovh | 01abe61 | 16 | 12 |
Status | Job ID | Links | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3746707 | 2019-03-19 07:00:24 | 2019-03-19 08:09:28 | 2019-03-19 08:15:27 | 0:05:59 | 0:02:44 | 0:03:15 | ovh | master | ubuntu | 16.04 | smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} | 1 | |
Failure Reason:
{'ovh001.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh001', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
pass | 3746708 | 2019-03-19 07:00:24 | 2019-03-19 08:11:29 | 2019-03-19 09:37:30 | 1:26:01 | 0:35:04 | 0:50:57 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
fail | 3746709 | 2019-03-19 07:00:25 | 2019-03-19 08:15:41 | 2019-03-19 09:33:41 | 1:18:00 | 0:21:20 | 0:56:40 | ovh | master | centos | 7.5 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
ceph-deploy: Failed during gather keys |
pass | 3746710 | 2019-03-19 07:00:26 | 2019-03-19 08:19:47 | 2019-03-19 09:45:47 | 1:26:00 | 0:41:54 | 0:44:06 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 3746711 | 2019-03-19 07:00:26 | 2019-03-19 08:27:34 | 2019-03-19 09:51:35 | 1:24:01 | 0:27:28 | 0:56:33 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/iozone.sh) on ovh074 with status 127: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=01abe61539e7f86cce59115c30c1ca63113ceb68 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/iozone.sh' |
fail | 3746712 | 2019-03-19 07:00:27 | 2019-03-19 08:31:16 | 2019-03-19 09:45:16 | 1:14:00 | 0:28:44 | 0:45:16 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/pjd.sh) on ovh058 with status 2: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=01abe61539e7f86cce59115c30c1ca63113ceb68 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh' |
pass | 3746713 | 2019-03-19 07:00:28 | 2019-03-19 08:55:39 | 2019-03-19 10:17:40 | 1:22:01 | 0:29:25 | 0:52:36 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} | 3 | |
fail | 3746714 | 2019-03-19 07:00:28 | 2019-03-19 09:13:34 | 2019-03-19 10:25:34 | 1:12:00 | 0:26:19 | 0:45:41 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/dbench.sh) on ovh055 with status 127: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=01abe61539e7f86cce59115c30c1ca63113ceb68 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/dbench.sh' |
fail | 3746715 | 2019-03-19 07:00:29 | 2019-03-19 09:25:46 | 2019-03-19 10:49:47 | 1:24:01 | 0:42:14 | 0:41:47 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
"2019-03-19 10:31:15.453345 mon.a (mon.0) 122 : cluster [WRN] Health check failed: 1 MDSs report slow metadata IOs (MDS_SLOW_METADATA_IO)" in cluster log |
fail | 3746716 | 2019-03-19 07:00:30 | 2019-03-19 09:29:37 | 2019-03-19 09:41:37 | 0:12:00 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on ovh029 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.0.0_ceph_g963185290e83-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://3.chacra.ceph.com/r/kernel/testing/963185290e83bd83e553a6022953e579619deef7/centos/7/flavors/default/x86_64/ --input-file=-' |
pass | 3746717 | 2019-03-19 07:00:30 | 2019-03-19 09:33:45 | 2019-03-19 10:53:45 | 1:20:00 | 0:34:16 | 0:45:44 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/libcephfs_interface_tests.yaml} | 3 | |
pass | 3746718 | 2019-03-19 07:00:31 | 2019-03-19 09:37:44 | 2019-03-19 11:01:44 | 1:24:00 | 0:40:08 | 0:43:52 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} | 3 | |
pass | 3746719 | 2019-03-19 07:00:32 | 2019-03-19 09:41:40 | 2019-03-19 11:11:40 | 1:30:00 | 0:40:58 | 0:49:02 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_api_tests.yaml} | 3 | |
pass | 3746720 | 2019-03-19 07:00:32 | 2019-03-19 09:45:31 | 2019-03-19 11:15:31 | 1:30:00 | 0:48:27 | 0:41:33 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_bench.yaml} | 3 | |
pass | 3746721 | 2019-03-19 07:00:33 | 2019-03-19 09:45:48 | 2019-03-19 12:07:50 | 2:22:02 | 1:37:11 | 0:44:51 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cache_snaps.yaml} | 3 | |
fail | 3746722 | 2019-03-19 07:00:34 | 2019-03-19 09:51:38 | 2019-03-19 10:17:37 | 0:25:59 | 0:03:17 | 0:22:42 | ovh | master | ubuntu | 16.04 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh028.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh028', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh081.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh081', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh068.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh068', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh074.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh074', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
fail | 3746723 | 2019-03-19 07:00:34 | 2019-03-19 10:09:24 | 2019-03-19 10:21:23 | 0:11:59 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cls_all.yaml} | 3 | |
Failure Reason:
Command failed on ovh016 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.0.0_ceph_g963185290e83-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://3.chacra.ceph.com/r/kernel/testing/963185290e83bd83e553a6022953e579619deef7/centos/7/flavors/default/x86_64/ --input-file=-' |
pass | 3746724 | 2019-03-19 07:00:35 | 2019-03-19 10:17:40 | 2019-03-19 11:59:41 | 1:42:01 | 0:41:23 | 1:00:38 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_ec_snaps.yaml} | 3 | |
pass | 3746725 | 2019-03-19 07:00:36 | 2019-03-19 10:17:41 | 2019-03-19 11:37:41 | 1:20:00 | 0:33:02 | 0:46:58 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_python.yaml} | 3 | |
fail | 3746726 | 2019-03-19 07:00:36 | 2019-03-19 10:21:37 | 2019-03-19 10:37:37 | 0:16:00 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_workunit_loadgen_mix.yaml} | 3 | |
Failure Reason:
Command failed on ovh055 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.0.0_ceph_g963185290e83-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://3.chacra.ceph.com/r/kernel/testing/963185290e83bd83e553a6022953e579619deef7/centos/7/flavors/default/x86_64/ --input-file=-' |
pass | 3746727 | 2019-03-19 07:00:37 | 2019-03-19 10:25:40 | 2019-03-19 12:39:41 | 2:14:01 | 1:21:20 | 0:52:41 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_api_tests.yaml} | 3 | |
pass | 3746728 | 2019-03-19 07:00:38 | 2019-03-19 10:37:40 | 2019-03-19 11:57:40 | 1:20:00 | 0:27:14 | 0:52:46 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} | 3 | |
pass | 3746729 | 2019-03-19 07:00:38 | 2019-03-19 10:49:55 | 2019-03-19 12:17:55 | 1:28:00 | 0:42:33 | 0:45:27 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_fsx.yaml} | 3 | |
pass | 3746730 | 2019-03-19 07:00:39 | 2019-03-19 10:53:56 | 2019-03-19 12:23:57 | 1:30:01 | 0:44:00 | 0:46:01 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_python_api_tests.yaml} | 3 | |
fail | 3746731 | 2019-03-19 07:00:40 | 2019-03-19 11:01:57 | 2019-03-19 12:09:58 | 1:08:01 | 0:28:17 | 0:39:44 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/iozone.sh) on ovh052 with status 127: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=01abe61539e7f86cce59115c30c1ca63113ceb68 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/iozone.sh' |
pass | 3746732 | 2019-03-19 07:00:41 | 2019-03-19 11:11:54 | 2019-03-19 13:19:55 | 2:08:01 | 1:17:19 | 0:50:42 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_ec_s3tests.yaml} | 3 | |
pass | 3746733 | 2019-03-19 07:00:41 | 2019-03-19 11:15:45 | 2019-03-19 13:29:47 | 2:14:02 | 1:18:38 | 0:55:24 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_s3tests.yaml} | 3 | |
fail | 3746734 | 2019-03-19 07:00:42 | 2019-03-19 11:37:45 | 2019-03-19 12:51:45 | 1:14:00 | 0:27:53 | 0:46:07 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_swift.yaml} | 3 | |
Failure Reason:
Command failed on ovh063 with status 1: 'cd /home/ubuntu/cephtest/swift && ./bootstrap' |