Status Job ID Posted Started Updated Runtime Duration In Waiting Machine Teuthology Branch OS Type OS Version Description Nodes
fail 1778562 2017-10-27 00:44:34 2017-10-27 00:44:35 2017-10-27 01:16:35 0:32:00 0:17:29 0:14:31 ovh wip-teuth-systemd ubuntu 16.04 smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} 1
Failure Reason:

Command failed on ovh040 with status 32: 'sudo umount /dev/sdb1'

pass 1778563 2017-10-27 00:44:35 2017-10-27 00:44:36 2017-10-27 03:56:41 3:12:05 0:29:06 2:42:59 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/cfuse_workunit_suites_blogbench.yaml} 5
fail 1778564 2017-10-27 00:44:36 2017-10-27 00:44:37 2017-10-27 01:54:38 1:10:01 0:10:54 0:59:07 ovh wip-teuth-systemd centos 7.4 smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} 4
Failure Reason:

{'ovh070.front.sepia.ceph.com': {'_ansible_parsed': True, 'failed': True, 'exception': 'Traceback (most recent call last):\n File "/tmp/ansible_84h809/ansible_modlib.zip/ansible/module_utils/basic.py", line 2502, in atomic_move\n os.rename(b_tmp_dest_name, b_dest)\nOSError: [Errno 1] Operation not permitted\n', '_ansible_no_log': False, 'invocation': {'module_args': {'directory_mode': None, 'force': None, 'remote_src': None, 'dest': '/etc/resolv.conf', 'selevel': None, 'backrefs': False, 'owner': None, 'state': 'present', 'follow': False, 'insertafter': None, 'path': '/etc/resolv.conf', 'validate': None, 'line': 'domain front.sepia.ceph.com', 'src': None, 'group': None, 'insertbefore': None, 'unsafe_writes': None, 'delimiter': None, 'create': False, 'seuser': None, 'serole': None, 'regexp': '^domain .*', 'content': None, 'setype': None, 'mode': None, 'attributes': None, 'backup': False}}, 'changed': False, 'msg': 'Unable to rename file: /tmp/tmp_5prWT to /etc/resolv.conf: [Errno 1] Operation not permitted'}}

(ovh079.front.sepia.ceph.com, ovh032.front.sepia.ceph.com, and ovh061.front.sepia.ceph.com reported the identical failure, differing only in the temporary file name.)

pass 1778565 2017-10-27 00:44:36 2017-10-27 00:44:38 2017-10-27 02:18:39 1:34:01 0:28:51 1:05:10 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/cfuse_workunit_suites_fsstress.yaml} 5
dead 1778566 2017-10-27 00:44:37 2017-10-27 00:44:38 2017-10-27 12:47:19 12:02:41 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/cfuse_workunit_suites_iozone.yaml} 5
fail 1778567 2017-10-27 00:44:38 2017-10-27 00:44:39 2017-10-27 02:02:40 1:18:01 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/cfuse_workunit_suites_pjd.yaml} 5
Failure Reason:

Command failed on ovh070 with status 128: 'cd /lib/firmware/updates && sudo git fetch origin && sudo git reset --hard origin/master'

pass 1778568 2017-10-27 00:44:39 2017-10-27 00:44:40 2017-10-27 03:00:43 2:16:03 0:27:12 1:48:51 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/kclient_workunit_direct_io.yaml} 5
pass 1778569 2017-10-27 00:44:40 2017-10-27 00:44:41 2017-10-27 02:42:44 1:58:03 0:48:48 1:09:15 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/kclient_workunit_suites_dbench.yaml} 5
pass 1778570 2017-10-27 00:44:41 2017-10-27 00:44:42 2017-10-27 09:23:04 8:38:22 0:27:58 8:10:24 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/kclient_workunit_suites_fsstress.yaml} 5
pass 1778571 2017-10-27 00:44:41 2017-10-27 00:44:43 2017-10-27 02:34:45 1:50:02 0:24:25 1:25:37 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/kclient_workunit_suites_pjd.yaml} 5
pass 1778572 2017-10-27 00:44:42 2017-10-27 00:44:44 2017-10-27 02:20:45 1:36:01 0:24:25 1:11:36 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/libcephfs_interface_tests.yaml} 5
pass 1778573 2017-10-27 00:44:43 2017-10-27 00:44:45 2017-10-27 02:16:46 1:32:01 0:33:05 0:58:56 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/mon_thrash.yaml} 5
pass 1778574 2017-10-27 00:44:44 2017-10-27 00:44:45 2017-10-27 03:16:49 2:32:04 0:49:13 1:42:51 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_api_tests.yaml} 5
pass 1778575 2017-10-27 00:44:45 2017-10-27 00:44:46 2017-10-27 03:20:54 2:36:08 0:44:13 1:51:55 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_bench.yaml} 5
pass 1778576 2017-10-27 00:44:46 2017-10-27 00:44:47 2017-10-27 03:08:51 2:24:04 0:51:36 1:32:28 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_cache_snaps.yaml} 5
pass 1778577 2017-10-27 00:44:47 2017-10-27 00:44:48 2017-10-27 02:24:49 1:40:01 0:47:26 0:52:35 ovh wip-teuth-systemd ubuntu 16.04 smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} 4
pass 1778578 2017-10-27 00:44:47 2017-10-27 01:08:33 2017-10-27 03:28:35 2:20:02 0:24:26 1:55:36 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_cls_all.yaml} 5
pass 1778579 2017-10-27 00:44:48 2017-10-27 01:16:57 2017-10-27 02:40:57 1:24:00 0:37:34 0:46:26 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_ec_snaps.yaml} 5
fail 1778580 2017-10-27 00:44:49 2017-10-27 01:22:26 2017-10-27 02:06:26 0:44:00 0:25:35 0:18:25 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_python.yaml} 5
Failure Reason:

Command failed (workunit test rados/test_python.sh) on ovh094 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=wip-qa-systemd TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rados/test_python.sh'

pass 1778581 2017-10-27 00:44:50 2017-10-27 01:28:32 2017-10-27 02:32:32 1:04:00 0:37:18 0:26:42 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rados_workunit_loadgen_mix.yaml} 5
pass 1778582 2017-10-27 00:44:50 2017-10-27 01:28:33 2017-10-27 03:12:34 1:44:01 0:34:28 1:09:33 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_api_tests.yaml} 5
fail 1778583 2017-10-27 00:44:51 2017-10-27 01:28:33 2017-10-27 02:52:33 1:24:00 0:24:57 0:59:03 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_cli_import_export.yaml} 5
Failure Reason:

Command failed (workunit test rbd/import_export.sh) on ovh005 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=wip-qa-systemd TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 RBD_CREATE_ARGS=--new-format adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rbd/import_export.sh'

pass 1778584 2017-10-27 00:44:52 2017-10-27 01:30:39 2017-10-27 13:27:02 11:56:23 0:29:34 11:26:49 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_fsx.yaml} 5
pass 1778585 2017-10-27 00:44:53 2017-10-27 01:34:29 2017-10-27 13:26:50 11:52:21 0:28:36 11:23:45 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_python_api_tests.yaml} 5
dead 1778586 2017-10-27 00:44:53 2017-10-27 01:36:30 2017-10-27 13:39:13 12:02:43 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rbd_workunit_suites_iozone.yaml} 5
fail 1778587 2017-10-27 00:44:54 2017-10-27 01:40:29 2017-10-27 13:26:54 11:46:25 0:26:35 11:19:50 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rgw_ec_s3tests.yaml} 5
Failure Reason:

HTTPConnectionPool(host='ovh032.front.sepia.ceph.com', port=7280): Max retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fdb35978150>: Failed to establish a new connection: [Errno 111] Connection refused',))

fail 1778588 2017-10-27 00:44:55 2017-10-27 01:42:27 2017-10-27 02:26:27 0:44:00 0:22:54 0:21:06 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rgw_s3tests.yaml} 5
Failure Reason:

Command failed on ubuntu@ovh038.front.sepia.ceph.com with status 5: 'sudo systemctl start ceph-radosgw@0'

fail 1778589 2017-10-27 00:44:55 2017-10-27 01:42:27 2017-10-27 09:32:42 7:50:15 0:24:50 7:25:25 ovh wip-teuth-systemd smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore.yaml tasks/rgw_swift.yaml} 5
Failure Reason:

HTTPConnectionPool(host='ovh016.front.sepia.ceph.com', port=7280): Max retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7ff8ae9ba550>: Failed to establish a new connection: [Errno 111] Connection refused',))