User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Pass | Fail |
---|---|---|---|---|---|---|---|---|---|---|
teuthology | 2018-07-20 17:05:43 | 2018-07-20 17:06:28 | 2018-07-20 17:58:08 | 0:51:40 | ceph-disk | mimic | ovh | bd3b97c | 1 | 2 |
Status | Job ID | Links | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
pass | 2801845 | | 2018-07-20 17:06:01 | 2018-07-20 17:06:08 | 2018-07-20 17:58:08 | 0:52:00 | 0:35:14 | 0:16:46 | ovh | master | centos | 7.4 | ceph-disk/basic/{distros/centos_latest.yaml tasks/ceph-disk.yaml} | 2 |
fail | 2801846 | | 2018-07-20 17:06:01 | 2018-07-20 17:06:16 | 2018-07-20 17:22:15 | 0:15:59 | 0:09:49 | 0:06:10 | ovh | master | rhel | 7.5 | ceph-disk/basic/{distros/rhel_latest.yaml tasks/ceph-disk.yaml} | 2 |
Failure Reason:
Command failed (workunit test ceph-disk/ceph-disk.sh) on ovh090 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=mimic TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-disk/ceph-disk.sh'
fail | 2801847 | | 2018-07-20 17:06:02 | 2018-07-20 17:06:28 | 2018-07-20 17:16:27 | 0:09:59 | 0:02:35 | 0:07:24 | ovh | master | ubuntu | 16.04 | ceph-disk/basic/{distros/ubuntu_latest.yaml tasks/ceph-disk.yaml} | 2 |
Failure Reason:
{'ovh033.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh033', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh085.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh085', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}}
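Both hosts failed identically: the `user` module could not add the `ubuntu` user to the `kvm` group because that group did not exist on the target. A minimal sketch of an Ansible pre-task that would avoid this by creating the groups first; the group names come from the failed task's `module_args`, but the task itself is a hypothetical fix, not part of the original playbook:

```yaml
# Hypothetical pre-task: ensure each supplementary group exists before
# the user module tries to assign it. Group names are taken from the
# failed task's module_args above.
- name: Ensure supplementary groups exist
  group:
    name: "{{ item }}"
    state: present
  loop:
    - fuse
    - kvm
    - disk

- name: Add the ubuntu user to the supplementary groups
  user:
    name: ubuntu
    groups: fuse,kvm,disk
    append: yes
```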