User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Pass | Fail |
---|---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-02-19 07:00:03 | 2019-02-19 08:33:51 | 2019-02-19 13:53:18 | 5:19:27 | smoke | master | ovh | f6ca8e0 | 16 | 12 |
Status | Job ID | Links | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3611173 | 2019-02-19 07:00:23 | 2019-02-19 08:33:51 | 2019-02-19 08:45:50 | 0:11:59 | 0:02:55 | 0:09:04 | ovh | master | ubuntu | 16.04 | smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} | 1 | |
Failure Reason:
{'ovh024.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh024', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
pass | 3611174 | 2019-02-19 07:00:24 | 2019-02-19 08:36:50 | 2019-02-19 10:04:51 | 1:28:01 | 0:34:04 | 0:53:57 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
fail | 3611175 | 2019-02-19 07:00:25 | 2019-02-19 08:46:04 | 2019-02-19 10:02:04 | 1:16:00 | 0:19:25 | 0:56:35 | ovh | master | centos | 7.5 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
ceph-deploy: Failed during gather keys |
pass | 3611176 | 2019-02-19 07:00:26 | 2019-02-19 08:46:50 | 2019-02-19 10:14:51 | 1:28:01 | 0:38:36 | 0:49:25 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 3611177 | 2019-02-19 07:00:26 | 2019-02-19 08:47:00 | 2019-02-19 10:05:00 | 1:18:00 | 0:25:42 | 0:52:18 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/iozone.sh) on ovh044 with status 127: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=f6ca8e0c4d3d9a77c655c45be979b032200b24f3 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/iozone.sh' |
fail | 3611178 | 2019-02-19 07:00:27 | 2019-02-19 08:54:35 | 2019-02-19 10:10:41 | 1:16:06 | 0:27:06 | 0:49:00 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/pjd.sh) on ovh063 with status 2: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=f6ca8e0c4d3d9a77c655c45be979b032200b24f3 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh' |
pass | 3611179 | 2019-02-19 07:00:28 | 2019-02-19 09:02:43 | 2019-02-19 10:32:44 | 1:30:01 | 0:28:15 | 1:01:46 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} | 3 | |
fail | 3611180 | 2019-02-19 07:00:28 | 2019-02-19 09:22:54 | 2019-02-19 10:42:54 | 1:20:00 | 0:26:04 | 0:53:56 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/dbench.sh) on ovh081 with status 127: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=f6ca8e0c4d3d9a77c655c45be979b032200b24f3 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/dbench.sh' |
pass | 3611181 | 2019-02-19 07:00:29 | 2019-02-19 09:28:49 | 2019-02-19 10:56:49 | 1:28:00 | 0:37:45 | 0:50:15 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_fsstress.yaml} | 3 | |
fail | 3611182 | 2019-02-19 07:00:30 | 2019-02-19 09:41:21 | 2019-02-19 10:53:22 | 1:12:01 | 0:25:11 | 0:46:50 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/pjd.sh) on ovh068 with status 2: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=f6ca8e0c4d3d9a77c655c45be979b032200b24f3 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh' |
pass | 3611183 | 2019-02-19 07:00:30 | 2019-02-19 10:00:54 | 2019-02-19 11:30:55 | 1:30:01 | 0:33:25 | 0:56:36 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/libcephfs_interface_tests.yaml} | 3 | |
fail | 3611184 | 2019-02-19 07:00:31 | 2019-02-19 10:02:17 | 2019-02-19 10:08:16 | 0:05:59 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} | 3 | |
Failure Reason:
Command failed on ovh016 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.0.0_rc3_ceph_gdaa39f567df3-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://4.chacra.ceph.com/r/kernel/testing/daa39f567df37d21c329f4a51de4ff2ad4bdd567/centos/7/flavors/default/x86_64/ --input-file=-' |
pass | 3611185 | 2019-02-19 07:00:32 | 2019-02-19 10:05:04 | 2019-02-19 11:29:04 | 1:24:00 | 0:39:33 | 0:44:27 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_api_tests.yaml} | 3 | |
pass | 3611186 | 2019-02-19 07:00:32 | 2019-02-19 10:05:04 | 2019-02-19 12:29:05 | 2:24:01 | 1:28:05 | 0:55:56 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_bench.yaml} | 3 | |
pass | 3611187 | 2019-02-19 07:00:33 | 2019-02-19 10:08:29 | 2019-02-19 11:48:30 | 1:40:01 | 0:54:21 | 0:45:40 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cache_snaps.yaml} | 3 | |
fail | 3611188 | 2019-02-19 07:00:34 | 2019-02-19 10:10:53 | 2019-02-19 10:30:53 | 0:20:00 | 0:02:52 | 0:17:08 | ovh | master | ubuntu | 16.04 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh039.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh039', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh048.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh048', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh070.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh070', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh047.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh047', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}} |
fail | 3611189 | 2019-02-19 07:00:34 | 2019-02-19 10:14:55 | 2019-02-19 10:28:54 | 0:13:59 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cls_all.yaml} | 3 | |
Failure Reason:
Command failed on ovh074 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.0.0_rc3_ceph_gdaa39f567df3-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://4.chacra.ceph.com/r/kernel/testing/daa39f567df37d21c329f4a51de4ff2ad4bdd567/centos/7/flavors/default/x86_64/ --input-file=-' |
pass | 3611190 | 2019-02-19 07:00:35 | 2019-02-19 10:29:01 | 2019-02-19 12:03:02 | 1:34:01 | 0:38:47 | 0:55:14 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_ec_snaps.yaml} | 3 | |
pass | 3611191 | 2019-02-19 07:00:36 | 2019-02-19 10:31:04 | 2019-02-19 11:53:04 | 1:22:00 | 0:32:00 | 0:50:00 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_python.yaml} | 3 | |
pass | 3611192 | 2019-02-19 07:00:36 | 2019-02-19 10:32:56 | 2019-02-19 12:08:57 | 1:36:01 | 0:39:59 | 0:56:02 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_workunit_loadgen_mix.yaml} | 3 | |
pass | 3611193 | 2019-02-19 07:00:37 | 2019-02-19 10:42:58 | 2019-02-19 12:46:59 | 2:04:01 | 1:01:04 | 1:02:57 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_api_tests.yaml} | 3 | |
pass | 3611194 | 2019-02-19 07:00:38 | 2019-02-19 10:53:24 | 2019-02-19 12:11:24 | 1:18:00 | 0:26:41 | 0:51:19 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} | 3 | |
pass | 3611195 | 2019-02-19 07:00:39 | 2019-02-19 10:56:57 | 2019-02-19 12:16:57 | 1:20:00 | 0:28:08 | 0:51:52 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_fsx.yaml} | 3 | |
pass | 3611196 | 2019-02-19 07:00:39 | 2019-02-19 11:29:07 | 2019-02-19 13:05:07 | 1:36:00 | 0:36:22 | 0:59:38 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_python_api_tests.yaml} | 3 | |
fail | 3611197 | 2019-02-19 07:00:40 | 2019-02-19 11:31:08 | 2019-02-19 12:41:08 | 1:10:00 | 0:27:11 | 0:42:49 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/iozone.sh) on ovh061 with status 127: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=f6ca8e0c4d3d9a77c655c45be979b032200b24f3 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/iozone.sh' |
fail | 3611198 | 2019-02-19 07:00:41 | 2019-02-19 11:48:43 | 2019-02-19 12:06:42 | 0:17:59 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_ec_s3tests.yaml} | 3 | |
Failure Reason:
Command failed on ovh042 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.0.0_rc3_ceph_gdaa39f567df3-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://4.chacra.ceph.com/r/kernel/testing/daa39f567df37d21c329f4a51de4ff2ad4bdd567/centos/7/flavors/default/x86_64/ --input-file=-' |
pass | 3611199 | 2019-02-19 07:00:41 | 2019-02-19 11:53:17 | 2019-02-19 13:53:18 | 2:00:01 | 1:09:48 | 0:50:13 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_s3tests.yaml} | 3 | |
fail | 3611200 | 2019-02-19 07:00:42 | 2019-02-19 12:03:04 | 2019-02-19 13:23:05 | 1:20:01 | 0:29:12 | 0:50:49 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_swift.yaml} | 3 | |
Failure Reason:
Command failed on ovh029 with status 1: 'cd /home/ubuntu/cephtest/swift && ./bootstrap' |