User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Pass | Fail |
---|---|---|---|---|---|---|---|---|---|---|
teuthology | 2019-02-21 07:00:03 | 2019-02-21 08:25:53 | 2019-02-21 14:18:31 | 5:52:38 | smoke | master | ovh | 106c28f | 11 | 17 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 3621357 | 2019-02-21 07:00:57 | 2019-02-21 08:09:36 | 2019-02-21 08:17:35 | 0:07:59 | 0:02:53 | 0:05:06 | ovh | master | ubuntu | 16.04 | smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} | 1 | |
Failure Reason:
{'ovh080.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh080', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}}
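The Ansible failure above comes from the `user` module appending the `ubuntu` user to the `fuse`, `kvm`, and `disk` groups on a node where the `kvm` group was never created. A minimal pre-flight sketch (the `group_exists` helper is ours, not part of teuthology or its ansible roles) that would surface this before provisioning:

```shell
# Check that every group the ansible `user` task will reference actually
# exists on the host; `getent group` exits non-zero for unknown groups.
group_exists() {
    getent group "$1" >/dev/null 2>&1
}

missing=""
for g in fuse kvm disk; do
    group_exists "$g" || missing="$missing $g"
done

# On the failed OVH nodes this would have reported kvm as missing;
# `groupadd kvm` (as root) is the usual fix.
[ -z "$missing" ] && echo "all groups present" || echo "missing:$missing"
```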
fail | 3621358 | 2019-02-21 07:00:58 | 2019-02-21 08:17:46 | 2019-02-21 08:25:46 | 0:08:00 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 |
Failure Reason:
Command failed on ovh080 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.0.0_rc3_ceph_gdaa39f567df3-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://4.chacra.ceph.com/r/kernel/testing/daa39f567df37d21c329f4a51de4ff2ad4bdd567/centos/7/flavors/default/x86_64/ --input-file=-'
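Seven jobs in this run (3621358, 3621363, 3621364, 3621368, 3621376, 3621378, 3621379) died with this same status-127 kernel-download command. 127 is the shell's "command not found" exit code, which points at a missing binary on the freshly provisioned node (most plausibly `wget`) rather than at the download itself. A hedged sketch of a pre-flight check (the `check_tools` helper is hypothetical, not a teuthology utility):

```shell
# Exit status 127 means the shell could not find a command in the pipeline.
# check_tools reports the first missing binary and mirrors that status.
check_tools() {
    for t in "$@"; do
        command -v "$t" >/dev/null 2>&1 || { echo "missing: $t"; return 127; }
    done
    echo "all tools present"
}

# The failing pipeline needs rm, echo, and wget on the target host
# (wget may legitimately be absent here, so don't abort on failure):
check_tools rm echo wget || true
```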
fail | 3621359 | 2019-02-21 07:00:58 | 2019-02-21 08:25:53 | 2019-02-21 09:41:59 | 1:16:06 | 0:19:33 | 0:56:33 | ovh | master | centos | 7.5 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
ceph-deploy: Failed during gather keys
pass | 3621360 | 2019-02-21 07:00:59 | 2019-02-21 08:41:27 | 2019-02-21 10:09:27 | 1:28:00 | 0:39:31 | 0:48:29 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 3621361 | 2019-02-21 07:01:00 | 2019-02-21 08:41:58 | 2019-02-21 09:53:58 | 1:12:00 | 0:27:05 | 0:44:55 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/iozone.sh) on ovh068 with status 127: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=6c433736878fc074624cb1795958ef6f3065786e TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/iozone.sh'
fail | 3621362 | 2019-02-21 07:01:00 | 2019-02-21 08:45:31 | 2019-02-21 10:03:32 | 1:18:01 | 0:28:33 | 0:49:28 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/pjd.sh) on ovh058 with status 2: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=6c433736878fc074624cb1795958ef6f3065786e TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh'
fail | 3621363 | 2019-02-21 07:01:01 | 2019-02-21 08:51:53 | 2019-02-21 09:05:53 | 0:14:00 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} | 3 |
Failure Reason:
Command failed on ovh061 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.0.0_rc3_ceph_gdaa39f567df3-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://4.chacra.ceph.com/r/kernel/testing/daa39f567df37d21c329f4a51de4ff2ad4bdd567/centos/7/flavors/default/x86_64/ --input-file=-'
fail | 3621364 | 2019-02-21 07:01:02 | 2019-02-21 09:06:03 | 2019-02-21 09:18:02 | 0:11:59 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_dbench.yaml} | 3 |
Failure Reason:
Command failed on ovh061 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.0.0_rc3_ceph_gdaa39f567df3-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://4.chacra.ceph.com/r/kernel/testing/daa39f567df37d21c329f4a51de4ff2ad4bdd567/centos/7/flavors/default/x86_64/ --input-file=-'
pass | 3621365 | 2019-02-21 07:01:02 | 2019-02-21 09:18:07 | 2019-02-21 10:48:07 | 1:30:00 | 0:39:48 | 0:50:12 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_fsstress.yaml} | 3 | |
fail | 3621366 | 2019-02-21 07:01:03 | 2019-02-21 09:23:24 | 2019-02-21 10:37:24 | 1:14:00 | 0:25:37 | 0:48:23 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/pjd.sh) on ovh063 with status 2: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=6c433736878fc074624cb1795958ef6f3065786e TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh'
pass | 3621367 | 2019-02-21 07:01:04 | 2019-02-21 09:29:32 | 2019-02-21 10:55:32 | 1:26:00 | 0:31:10 | 0:54:50 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/libcephfs_interface_tests.yaml} | 3 | |
fail | 3621368 | 2019-02-21 07:01:05 | 2019-02-21 09:42:17 | 2019-02-21 09:50:18 | 0:08:01 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} | 3 |
Failure Reason:
Command failed on ovh080 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.0.0_rc3_ceph_gdaa39f567df3-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://4.chacra.ceph.com/r/kernel/testing/daa39f567df37d21c329f4a51de4ff2ad4bdd567/centos/7/flavors/default/x86_64/ --input-file=-'
fail | 3621369 | 2019-02-21 07:01:05 | 2019-02-21 09:50:23 | 2019-02-21 14:18:31 | 4:28:08 | 3:47:05 | 0:41:03 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_api_tests.yaml} | 3 | |
Failure Reason:
Command failed (workunit test rados/test.sh) on ovh016 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=6c433736878fc074624cb1795958ef6f3065786e TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rados/test.sh'
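The `rados_api_tests` job above is a different failure mode from the status-127 jobs: 124 is GNU `timeout`'s exit code when it kills the wrapped command, so `rados/test.sh` ran past its `timeout 3h` wrapper (matching the job's 3:47:05 duration) rather than crashing. The convention can be reproduced in miniature (assuming GNU coreutils `timeout`):

```shell
# GNU timeout exits with 124 when it kills the wrapped command for
# exceeding its limit -- the same code job 3621369 reported after 3h.
status=0
timeout 1 sleep 2 || status=$?
echo "sleep 2 under timeout 1 exited with status $status"
```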
pass | 3621370 | 2019-02-21 07:01:06 | 2019-02-21 09:54:01 | 2019-02-21 11:32:02 | 1:38:01 | 0:52:22 | 0:45:39 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_bench.yaml} | 3 | |
pass | 3621371 | 2019-02-21 07:01:07 | 2019-02-21 10:03:45 | 2019-02-21 12:03:46 | 2:00:01 | 1:03:35 | 0:56:26 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cache_snaps.yaml} | 3 | |
fail | 3621372 | 2019-02-21 07:01:07 | 2019-02-21 10:09:40 | 2019-02-21 10:29:39 | 0:19:59 | 0:03:11 | 0:16:48 | ovh | master | ubuntu | 16.04 | smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} | 4 | |
Failure Reason:
{'ovh039.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh039', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh091.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh091', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh070.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh070', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh044.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh044', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}}
fail | 3621373 | 2019-02-21 07:01:08 | 2019-02-21 10:29:35 | 2019-02-21 11:51:35 | 1:22:00 | 0:27:17 | 0:54:43 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cls_all.yaml} | 3 | |
Failure Reason:
Command failed (workunit test cls/test_cls_lock.sh) on ovh028 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=6c433736878fc074624cb1795958ef6f3065786e TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/cls/test_cls_lock.sh'
pass | 3621374 | 2019-02-21 07:01:09 | 2019-02-21 10:29:41 | 2019-02-21 11:53:41 | 1:24:00 | 0:38:56 | 0:45:04 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_ec_snaps.yaml} | 3 | |
pass | 3621375 | 2019-02-21 07:01:10 | 2019-02-21 10:37:38 | 2019-02-21 12:05:38 | 1:28:00 | 0:31:20 | 0:56:40 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_python.yaml} | 3 | |
fail | 3621376 | 2019-02-21 07:01:10 | 2019-02-21 10:38:12 | 2019-02-21 10:50:11 | 0:11:59 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_workunit_loadgen_mix.yaml} | 3 |
Failure Reason:
Command failed on ovh063 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.0.0_rc3_ceph_gdaa39f567df3-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://4.chacra.ceph.com/r/kernel/testing/daa39f567df37d21c329f4a51de4ff2ad4bdd567/centos/7/flavors/default/x86_64/ --input-file=-'
pass | 3621377 | 2019-02-21 07:01:11 | 2019-02-21 10:48:10 | 2019-02-21 12:42:11 | 1:54:01 | 1:04:36 | 0:49:25 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_api_tests.yaml} | 3 | |
fail | 3621378 | 2019-02-21 07:01:12 | 2019-02-21 10:50:15 | 2019-02-21 11:02:14 | 0:11:59 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} | 3 |
Failure Reason:
Command failed on ovh063 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.0.0_rc3_ceph_gdaa39f567df3-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://4.chacra.ceph.com/r/kernel/testing/daa39f567df37d21c329f4a51de4ff2ad4bdd567/centos/7/flavors/default/x86_64/ --input-file=-'
fail | 3621379 | 2019-02-21 07:01:12 | 2019-02-21 10:55:45 | 2019-02-21 11:07:44 | 0:11:59 | | | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_fsx.yaml} | 3 |
Failure Reason:
Command failed on ovh074 with status 127: u'rm -f /tmp/kernel.x86_64.rpm && echo kernel-5.0.0_rc3_ceph_gdaa39f567df3-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://4.chacra.ceph.com/r/kernel/testing/daa39f567df37d21c329f4a51de4ff2ad4bdd567/centos/7/flavors/default/x86_64/ --input-file=-'
pass | 3621380 | 2019-02-21 07:01:13 | 2019-02-21 11:02:16 | 2019-02-21 12:26:17 | 1:24:01 | 0:36:47 | 0:47:14 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_python_api_tests.yaml} | 3 | |
fail | 3621381 | 2019-02-21 07:01:14 | 2019-02-21 11:07:58 | 2019-02-21 12:25:58 | 1:18:00 | 0:27:26 | 0:50:34 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_workunit_suites_iozone.yaml} | 3 | |
Failure Reason:
Command failed (workunit test suites/iozone.sh) on ovh074 with status 127: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=6c433736878fc074624cb1795958ef6f3065786e TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/iozone.sh'
pass | 3621382 | 2019-02-21 07:01:15 | 2019-02-21 11:32:05 | 2019-02-21 13:30:06 | 1:58:01 | 1:10:23 | 0:47:38 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_ec_s3tests.yaml} | 3 | |
pass | 3621383 | 2019-02-21 07:01:15 | 2019-02-21 11:51:52 | 2019-02-21 14:01:53 | 2:10:01 | 1:11:27 | 0:58:34 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_s3tests.yaml} | 3 | |
fail | 3621384 | 2019-02-21 07:01:16 | 2019-02-21 11:53:54 | 2019-02-21 13:03:54 | 1:10:00 | 0:26:31 | 0:43:29 | ovh | master | rhel | 7.5 | smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_swift.yaml} | 3 | |
Failure Reason:
Command failed on ovh044 with status 1: 'cd /home/ubuntu/cephtest/swift && ./bootstrap'