Name Machine Type Up Locked Locked Since Locked By OS Type OS Version Arch Description
ovh070.front.sepia.ceph.com ovh True True 2019-02-21 21:57:43.203848 scheduled_trociny@teuthology ubuntu 16.04 x86_64 /home/teuthworker/archive/trociny-2019-02-21_07:42:54-rbd-wip-mgolub-testing-distro-basic-ovh/3621966
Status  Job ID  Links  Posted  Started  Updated  Runtime  Duration  In Waiting  Machine  Teuthology Branch  OS Type  OS Version
running 3621966 2019-02-21 07:43:16 2019-02-21 21:57:43 2019-02-21 22:05:43 0:08:33 ovh master
fail 3621955 2019-02-21 07:43:08 2019-02-21 21:26:14 2019-02-21 21:52:13 0:25:59 0:14:41 0:11:18 ovh master
Failure Reason:

Command failed on ovh042 with status 128: 'rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/ceph/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 69454404fc5d1954419663246c470d3326916bc9'
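
Jobs 3621955 and 3621946 fail the same way: git exits with status 128 while cloning ceph.git and checking out 69454404fc5d1954419663246c470d3326916bc9, which usually means either the clone itself failed (network or DNS trouble on the node) or the requested SHA is not reachable in the fresh clone. A minimal sketch for reproducing the workunit clone step by hand on the test node; the paths and SHA are copied from the failure reason, and the git cat-file check is an extra step we add to confirm the commit exists before checkout:

    # Redo the workunit clone exactly as teuthology does, then verify the commit.
    rm -rf /home/ubuntu/cephtest/clone.client.0
    git clone https://github.com/ceph/ceph.git /home/ubuntu/cephtest/clone.client.0
    cd /home/ubuntu/cephtest/clone.client.0
    # Confirm the requested commit is present in the clone before checking it out.
    git cat-file -e 69454404fc5d1954419663246c470d3326916bc9^{commit} && echo "commit present"
    git checkout 69454404fc5d1954419663246c470d3326916bc9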

fail 3621946 2019-02-21 07:43:02 2019-02-21 20:58:13 2019-02-21 21:26:12 0:27:59 0:16:16 0:11:43 ovh master
Failure Reason:

Command failed on ovh016 with status 128: 'rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/ceph/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 69454404fc5d1954419663246c470d3326916bc9'

pass 3621714 2019-02-21 07:03:04 2019-02-21 15:38:45 2019-02-21 16:12:44 0:33:59 0:19:35 0:14:24 ovh master centos 7.4
fail 3621384 2019-02-21 07:01:16 2019-02-21 11:53:54 2019-02-21 13:03:54 1:10:00 0:26:31 0:43:29 ovh master rhel 7.5
Failure Reason:

Command failed on ovh044 with status 1: 'cd /home/ubuntu/cephtest/swift && ./bootstrap'
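
For job 3621384 the swift test bootstrap exits with status 1, and teuthology records only the exit status, so the useful detail is in the node's own output. A minimal sketch for re-running the bootstrap by hand and keeping that output (the log filename is our choice):

    # Re-run the swift bootstrap and capture everything it prints.
    cd /home/ubuntu/cephtest/swift
    ./bootstrap > bootstrap.log 2>&1
    echo "bootstrap exited with $?"
    tail -n 50 bootstrap.log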

pass 3621374 2019-02-21 07:01:09 2019-02-21 10:29:41 2019-02-21 11:53:41 1:24:00 0:38:56 0:45:04 ovh master rhel 7.5
fail 3621372 2019-02-21 07:01:07 2019-02-21 10:09:40 2019-02-21 10:29:39 0:19:59 0:03:11 0:16:48 ovh master ubuntu 16.04
Failure Reason:

Ansible's user module returned the same error on every node in the job (ovh039, ovh044, ovh070, and ovh091, all in front.sepia.ceph.com) while appending the ubuntu user to the groups fuse, kvm, and disk (append=True, state=present): 'Group kvm does not exist'.
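
The error means the node image simply has no kvm group for the user task to append to; on Ubuntu that group is normally created by the qemu/kvm packages. A minimal shell sketch for checking and working around this by hand on an affected node (creating the group manually is our workaround, not something the playbook does):

    # Check whether the group exists; create it if missing, then retry the membership change.
    getent group kvm || sudo groupadd kvm
    sudo usermod -aG fuse,kvm,disk ubuntu
    id ubuntu   # should now list fuse, kvm and disk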

pass 3621360 2019-02-21 07:00:59 2019-02-21 08:41:27 2019-02-21 10:09:27 1:28:00 0:39:31 0:48:29 ovh master rhel 7.5
pass 3620544 2019-02-21 05:04:38 2019-02-21 07:33:08 2019-02-21 08:41:08 1:08:00 0:57:04 0:10:56 ovh master
pass 3620528 2019-02-21 05:04:27 2019-02-21 05:50:54 2019-02-21 07:32:55 1:42:01 1:30:40 0:11:21 ovh master
fail 3620505 2019-02-21 05:02:36 2019-02-21 16:12:48 2019-02-21 19:46:51 3:34:03 3:23:32 0:10:31 ovh master ubuntu 16.04
Failure Reason:

Command failed (workunit test rados/test.sh) on ovh044 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=6ca980beffafdbbfbfd8273a27a6e19844c37266 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rados/test.sh'
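
Status 124 on job 3620505 is the exit code timeout(1) uses when it kills its command, so rados/test.sh hit the 3h cap rather than failing a test outright. A minimal sketch for re-running the workunit by hand from the job's existing clone, without the cap, to see where it stalls; the environment is trimmed to the essentials here, and the full variable set is in the failure reason above:

    # Re-run the rados workunit from the job's clone and keep a log of its progress.
    cd /home/ubuntu/cephtest/mnt.0/client.0/tmp
    TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" \
      /home/ubuntu/cephtest/clone.client.0/qa/workunits/rados/test.sh 2>&1 | tee rados-test.log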

pass 3620497 2019-02-21 05:02:31 2019-02-21 05:02:41 2019-02-21 05:50:41 0:48:00 0:34:55 0:13:05 ovh master ubuntu 16.04
fail 3617521 2019-02-20 13:23:01 2019-02-20 13:59:22 2019-02-20 14:31:21 0:31:59 0:13:43 0:18:16 ovh master ubuntu 16.04
Failure Reason:

Command failed on ovh016 with status 2: 'cd ~/ceph-ansible ; virtualenv --system-site-packages venv ; source venv/bin/activate ; pip install --upgrade pip ; pip install setuptools>=11.3 notario>=0.0.13 netaddr ansible==2.5 ; ANSIBLE_STDOUT_CALLBACK=debug ansible-playbook -vv -i inven.yml site.yml'
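
Jobs 3617521, 3616615, 3616595, and 3616580 all fail with status 2 from this same ceph-ansible command chain; because the steps are joined with ';', the reported status is that of the final ansible-playbook run (for Ansible, exit code 2 generally means one or more hosts failed a task). A minimal sketch for reproducing the environment setup by hand; note the pip version pins are quoted here, because pasted verbatim the unquoted '>=' would be parsed by the shell as output redirection:

    # Rebuild the ceph-ansible virtualenv and run the playbook as the task does.
    cd ~/ceph-ansible
    virtualenv --system-site-packages venv
    source venv/bin/activate
    pip install --upgrade pip
    pip install 'setuptools>=11.3' 'notario>=0.0.13' netaddr 'ansible==2.5'
    ANSIBLE_STDOUT_CALLBACK=debug ansible-playbook -vv -i inven.yml site.yml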

pass 3617171 2019-02-20 07:03:17 2019-02-20 13:15:22 2019-02-20 13:59:22 0:44:00 0:25:05 0:18:55 ovh master centos 7.4
pass 3617161 2019-02-20 07:03:09 2019-02-20 12:15:13 2019-02-20 13:15:13 1:00:00 0:30:28 0:29:32 ovh master centos 7.4
pass 3617154 2019-02-20 07:03:04 2019-02-20 11:22:58 2019-02-20 12:14:58 0:52:00 0:25:21 0:26:39 ovh master centos 7.4
pass 3617136 2019-02-20 07:01:38 2019-02-20 09:04:49 2019-02-20 11:22:50 2:18:01 1:24:46 0:53:15 ovh master rhel 7.5
fail 3616615 2019-02-20 05:15:44 2019-02-20 18:34:14 2019-02-20 19:12:13 0:37:59 0:16:18 0:21:41 ovh master centos 7.4
Failure Reason:

Command failed on ovh016 with status 2: 'cd ~/ceph-ansible ; virtualenv --system-site-packages venv ; source venv/bin/activate ; pip install --upgrade pip ; pip install setuptools>=11.3 notario>=0.0.13 netaddr ansible==2.5 ; ANSIBLE_STDOUT_CALLBACK=debug ansible-playbook -vv -i inven.yml site.yml'

fail 3616595 2019-02-20 05:15:36 2019-02-20 17:53:57 2019-02-20 18:33:57 0:40:00 0:18:34 0:21:26 ovh master centos 7.4
Failure Reason:

Command failed on ovh083 with status 2: 'cd ~/ceph-ansible ; virtualenv --system-site-packages venv ; source venv/bin/activate ; pip install --upgrade pip ; pip install setuptools>=11.3 notario>=0.0.13 netaddr ansible==2.5 ; ANSIBLE_STDOUT_CALLBACK=debug ansible-playbook -vv -i inven.yml site.yml'

fail 3616580 2019-02-20 05:15:30 2019-02-20 17:21:54 2019-02-20 17:53:54 0:32:00 0:16:23 0:15:37 ovh master centos 7.4
Failure Reason:

Command failed on ovh083 with status 2: 'cd ~/ceph-ansible ; virtualenv --system-site-packages venv ; source venv/bin/activate ; pip install --upgrade pip ; pip install setuptools>=11.3 notario>=0.0.13 netaddr ansible==2.5 ; ANSIBLE_STDOUT_CALLBACK=debug ansible-playbook -vv -i inven.yml site.yml'