User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Pass | Fail |
---|---|---|---|---|---|---|---|---|---|---|
teuthology | 2018-08-08 05:10:02 | 2018-08-09 14:51:41 | 2018-08-09 15:33:41 | 0:42:00 | ceph-disk | mimic | ovh | 35dbe36 | 1 | 2 |
Status | Job ID | Links | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
pass | 2880730 | | 2018-08-08 05:10:15 | 2018-08-09 14:51:41 | 2018-08-09 15:33:41 | 0:42:00 | 0:34:42 | 0:07:18 | ovh | master | centos | 7.4 | ceph-disk/basic/{distros/centos_latest.yaml tasks/ceph-disk.yaml} | 2 |
fail | 2880731 | | 2018-08-08 05:10:16 | 2018-08-09 14:53:47 | 2018-08-09 15:07:47 | 0:14:00 | 0:08:55 | 0:05:05 | ovh | master | rhel | 7.5 | ceph-disk/basic/{distros/rhel_latest.yaml tasks/ceph-disk.yaml} | 2 |
Failure Reason:
Command failed (workunit test ceph-disk/ceph-disk.sh) on ovh100 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=mimic TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-disk/ceph-disk.sh'
fail | 2880732 | | 2018-08-08 05:10:16 | 2018-08-09 14:53:50 | 2018-08-09 15:19:50 | 0:26:00 | 0:02:24 | 0:23:36 | ovh | master | ubuntu | 16.04 | ceph-disk/basic/{distros/ubuntu_latest.yaml tasks/ceph-disk.yaml} | 2 |
Failure Reason:
{'ovh003.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh003', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}, 'ovh085.front.sepia.ceph.com': {'_ansible_parsed': True, 'invocation': {'module_args': {'comment': None, 'ssh_key_bits': 0, 'update_password': 'always', 'non_unique': False, 'force': False, 'skeleton': None, 'expires': None, 'ssh_key_passphrase': None, 'groups': ['fuse', 'kvm', 'disk'], 'createhome': True, 'home': None, 'move_home': False, 'password': None, 'generate_ssh_key': None, 'append': True, 'uid': None, 'ssh_key_comment': 'ansible-generated on ovh085', 'group': None, 'name': 'ubuntu', 'local': None, 'seuser': None, 'system': False, 'remove': False, 'state': 'present', 'ssh_key_file': None, 'login_class': None, 'shell': None, 'ssh_key_type': 'rsa'}}, 'changed': False, '_ansible_no_log': False, 'msg': 'Group kvm does not exist'}}