User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Pass | Fail |
---|---|---|---|---|---|---|---|---|---|---|
zack | 2016-02-24 18:50:10 | 2016-02-24 18:51:53 | 2016-02-24 19:41:39 | 0:49:46 | ceph-ansible | master | vps | — | 1 | 11 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 24813 | 2016-02-24 18:50:13 | 2016-02-24 18:51:53 | 2016-02-24 19:37:52 | 0:45:59 | 0:42:48 | 0:03:11 | vps | wip-ceph-ansible | centos | 7.2 | ceph-ansible/smoke/{0-clusters/3-node.yaml 1-distros/centos_7.2.yaml 2-setup/ceph_ansible.yaml 3-common/{os_tuning/vm_friendly.yaml source/upstream_stable.yaml} 4-osd/{devices/from_teuthology.yaml journal/collocated.yaml} 5-tasks/cls.yaml} | 3 | |
Failure Reason:
Command failed with status 2: 'ansible-playbook -v --extra-vars \'{"ceph_origin": "upstream", "ansible_ssh_user": "ubuntu", "journal_collocation": true, "osd_auto_discovery": false, "os_tuning_params": "[{\\"name\\": \\"kernel.pid_max\\", \\"value\\": 4194303},{\\"name\\": \\"fs.file-max\\", \\"value\\": 26234859}]", "ceph_stable": true, "journal_size": 1024}\' -i /tmp/teuth_ansible_hosts_BUad4P --limit vpm039.front.sepia.ceph.com,vpm149.front.sepia.ceph.com,vpm060.front.sepia.ceph.com /var/lib/teuthworker/src/ceph-ansible_wip-teuthology/teuth_ansible_playbook_2SYZ1d' |
fail | 24814 | 2016-02-24 18:50:14 | 2016-02-24 18:52:42 | 2016-02-24 19:36:43 | 0:44:01 | 0:41:46 | 0:02:15 | vps | wip-ceph-ansible | centos | 7.2 | ceph-ansible/smoke/{0-clusters/3-node.yaml 1-distros/centos_7.2.yaml 2-setup/ceph_ansible.yaml 3-common/{os_tuning/vm_friendly.yaml source/upstream_stable.yaml} 4-osd/{devices/from_teuthology.yaml journal/collocated.yaml} 5-tasks/rbd_import_export.yaml} | 3 | |
Failure Reason:
Command failed with status 2: 'ansible-playbook -v --extra-vars \'{"ceph_origin": "upstream", "ansible_ssh_user": "ubuntu", "journal_collocation": true, "osd_auto_discovery": false, "os_tuning_params": "[{\\"name\\": \\"kernel.pid_max\\", \\"value\\": 4194303},{\\"name\\": \\"fs.file-max\\", \\"value\\": 26234859}]", "ceph_stable": true, "journal_size": 1024}\' -i /tmp/teuth_ansible_hosts_PXdKXP --limit vpm093.front.sepia.ceph.com,vpm002.front.sepia.ceph.com,vpm038.front.sepia.ceph.com /var/lib/teuthworker/src/ceph-ansible_wip-teuthology/teuth_ansible_playbook_wlhQj4' |
fail | 24815 | 2016-02-24 18:50:15 | 2016-02-24 18:53:52 | 2016-02-24 19:39:53 | 0:46:01 | 0:42:39 | 0:03:22 | vps | wip-ceph-ansible | centos | 7.2 | ceph-ansible/smoke/{0-clusters/3-node.yaml 1-distros/centos_7.2.yaml 2-setup/ceph_ansible.yaml 3-common/{os_tuning/vm_friendly.yaml source/upstream_stable.yaml} 4-osd/{devices/from_teuthology.yaml journal/collocated.yaml} 5-tasks/rbd_cli_tests.yaml} | 3 | |
Failure Reason:
Command failed with status 2: 'ansible-playbook -v --extra-vars \'{"ceph_origin": "upstream", "ansible_ssh_user": "ubuntu", "journal_collocation": true, "osd_auto_discovery": false, "os_tuning_params": "[{\\"name\\": \\"kernel.pid_max\\", \\"value\\": 4194303},{\\"name\\": \\"fs.file-max\\", \\"value\\": 26234859}]", "ceph_stable": true, "journal_size": 1024}\' -i /tmp/teuth_ansible_hosts_FpbwvC --limit vpm177.front.sepia.ceph.com,vpm070.front.sepia.ceph.com,vpm044.front.sepia.ceph.com /var/lib/teuthworker/src/ceph-ansible_wip-teuthology/teuth_ansible_playbook_ivSScJ' |
fail | 24816 | 2016-02-24 18:50:16 | 2016-02-24 18:54:06 | 2016-02-24 19:12:05 | 0:17:59 | 0:14:02 | 0:03:57 | vps | wip-ceph-ansible | centos | 7.2 | ceph-ansible/smoke/{0-clusters/single_mon_osd.yaml 1-distros/centos_7.2.yaml 2-setup/ceph_ansible.yaml 3-common/{os_tuning/vm_friendly.yaml source/upstream_stable.yaml} 4-osd/{devices/osd_auto_discovery.yaml journal/collocated.yaml} 5-tasks/rbd_cli_tests.yaml} | 1 | |
Failure Reason:
Command failed with status 3: 'ansible-playbook -v --extra-vars \'{"ceph_origin": "upstream", "ansible_ssh_user": "ubuntu", "journal_collocation": true, "osd_auto_discovery": true, "os_tuning_params": "[{\\"name\\": \\"kernel.pid_max\\", \\"value\\": 4194303},{\\"name\\": \\"fs.file-max\\", \\"value\\": 26234859}]", "ceph_stable": true, "journal_size": 1024}\' -i /tmp/teuth_ansible_hosts_JIR5FU --limit vpm062.front.sepia.ceph.com /var/lib/teuthworker/src/ceph-ansible_wip-teuthology/teuth_ansible_playbook_ZkIRLd' |
fail | 24817 | 2016-02-24 18:50:17 | 2016-02-24 18:55:10 | 2016-02-24 19:11:10 | 0:16:00 | 0:14:07 | 0:01:53 | vps | wip-ceph-ansible | centos | 7.2 | ceph-ansible/smoke/{0-clusters/single_mon_osd.yaml 1-distros/centos_7.2.yaml 2-setup/ceph_ansible.yaml 3-common/{os_tuning/vm_friendly.yaml source/upstream_stable.yaml} 4-osd/{devices/osd_auto_discovery.yaml journal/collocated.yaml} 5-tasks/cls.yaml} | 1 | |
Failure Reason:
Command failed with status 3: 'ansible-playbook -v --extra-vars \'{"ceph_origin": "upstream", "ansible_ssh_user": "ubuntu", "journal_collocation": true, "osd_auto_discovery": true, "os_tuning_params": "[{\\"name\\": \\"kernel.pid_max\\", \\"value\\": 4194303},{\\"name\\": \\"fs.file-max\\", \\"value\\": 26234859}]", "ceph_stable": true, "journal_size": 1024}\' -i /tmp/teuth_ansible_hosts_uEdo_Y --limit vpm087.front.sepia.ceph.com /var/lib/teuthworker/src/ceph-ansible_wip-teuthology/teuth_ansible_playbook_Ncavoo' |
fail | 24818 | 2016-02-24 18:50:19 | 2016-02-24 18:56:50 | 2016-02-24 19:10:49 | 0:13:59 | 0:11:53 | 0:02:06 | vps | wip-ceph-ansible | centos | 7.2 | ceph-ansible/smoke/{0-clusters/single_mon_osd.yaml 1-distros/centos_7.2.yaml 2-setup/ceph_ansible.yaml 3-common/{os_tuning/vm_friendly.yaml source/upstream_stable.yaml} 4-osd/{devices/osd_auto_discovery.yaml journal/collocated.yaml} 5-tasks/rbd_import_export.yaml} | 1 | |
Failure Reason:
Command failed with status 3: 'ansible-playbook -v --extra-vars \'{"ceph_origin": "upstream", "ansible_ssh_user": "ubuntu", "journal_collocation": true, "osd_auto_discovery": true, "os_tuning_params": "[{\\"name\\": \\"kernel.pid_max\\", \\"value\\": 4194303},{\\"name\\": \\"fs.file-max\\", \\"value\\": 26234859}]", "ceph_stable": true, "journal_size": 1024}\' -i /tmp/teuth_ansible_hosts_s9KTRf --limit vpm091.front.sepia.ceph.com /var/lib/teuthworker/src/ceph-ansible_wip-teuthology/teuth_ansible_playbook_jYlZIn' |
fail | 24819 | 2016-02-24 18:50:20 | 2016-02-24 18:57:03 | 2016-02-24 19:11:02 | 0:13:59 | 0:12:05 | 0:01:54 | vps | wip-ceph-ansible | centos | 7.2 | ceph-ansible/smoke/{0-clusters/single_mon_osd.yaml 1-distros/centos_7.2.yaml 2-setup/ceph_ansible.yaml 3-common/{os_tuning/vm_friendly.yaml source/upstream_stable.yaml} 4-osd/{devices/from_teuthology.yaml journal/collocated.yaml} 5-tasks/cls.yaml} | 1 | |
Failure Reason:
Command failed (workunit test cls/test_cls_hello.sh) on vpm118 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8ff1a8df1c00af8b913cb43c5c72ea49ced27a93 TESTDIR="/home/ubuntu/cephtest" CEPH_ID="0" PATH=$PATH:/usr/sbin adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/workunit.client.0/cls/test_cls_hello.sh' |
fail | 24820 | 2016-02-24 18:50:21 | 2016-02-24 18:58:09 | 2016-02-24 19:14:09 | 0:16:00 | 0:15:00 | 0:01:00 | vps | wip-ceph-ansible | centos | 7.2 | ceph-ansible/smoke/{0-clusters/single_mon_osd.yaml 1-distros/centos_7.2.yaml 2-setup/ceph_ansible.yaml 3-common/{os_tuning/vm_friendly.yaml source/upstream_stable.yaml} 4-osd/{devices/from_teuthology.yaml journal/collocated.yaml} 5-tasks/rbd_import_export.yaml} | 1 | |
Failure Reason:
Command failed (workunit test rbd/import_export.sh) on vpm078 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8ff1a8df1c00af8b913cb43c5c72ea49ced27a93 TESTDIR="/home/ubuntu/cephtest" CEPH_ID="0" PATH=$PATH:/usr/sbin adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/workunit.client.0/rbd/import_export.sh' |
pass | 24821 | 2016-02-24 18:50:23 | 2016-02-24 18:58:32 | 2016-02-24 19:16:32 | 0:18:00 | 0:15:48 | 0:02:12 | vps | wip-ceph-ansible | centos | 7.2 | ceph-ansible/smoke/{0-clusters/single_mon_osd.yaml 1-distros/centos_7.2.yaml 2-setup/ceph_ansible.yaml 3-common/{os_tuning/vm_friendly.yaml source/upstream_stable.yaml} 4-osd/{devices/from_teuthology.yaml journal/collocated.yaml} 5-tasks/rbd_cli_tests.yaml} | 1 | |
fail | 24822 | 2016-02-24 18:50:24 | 2016-02-24 18:58:20 | 2016-02-24 19:38:20 | 0:40:00 | 0:38:35 | 0:01:25 | vps | wip-ceph-ansible | centos | 7.2 | ceph-ansible/smoke/{0-clusters/3-node.yaml 1-distros/centos_7.2.yaml 2-setup/ceph_ansible.yaml 3-common/{os_tuning/vm_friendly.yaml source/upstream_stable.yaml} 4-osd/{devices/osd_auto_discovery.yaml journal/collocated.yaml} 5-tasks/rbd_cli_tests.yaml} | 3 | |
Failure Reason:
Command failed with status 3: 'ansible-playbook -v --extra-vars \'{"ceph_origin": "upstream", "ansible_ssh_user": "ubuntu", "journal_collocation": true, "osd_auto_discovery": true, "os_tuning_params": "[{\\"name\\": \\"kernel.pid_max\\", \\"value\\": 4194303},{\\"name\\": \\"fs.file-max\\", \\"value\\": 26234859}]", "ceph_stable": true, "journal_size": 1024}\' -i /tmp/teuth_ansible_hosts_uoPWya --limit vpm178.front.sepia.ceph.com,vpm041.front.sepia.ceph.com,vpm119.front.sepia.ceph.com /var/lib/teuthworker/src/ceph-ansible_wip-teuthology/teuth_ansible_playbook_xQY4xW' |
fail | 24823 | 2016-02-24 18:50:24 | 2016-02-24 18:58:45 | 2016-02-24 19:40:46 | 0:42:01 | 0:38:27 | 0:03:34 | vps | wip-ceph-ansible | centos | 7.2 | ceph-ansible/smoke/{0-clusters/3-node.yaml 1-distros/centos_7.2.yaml 2-setup/ceph_ansible.yaml 3-common/{os_tuning/vm_friendly.yaml source/upstream_stable.yaml} 4-osd/{devices/osd_auto_discovery.yaml journal/collocated.yaml} 5-tasks/cls.yaml} | 3 | |
Failure Reason:
Command failed with status 3: 'ansible-playbook -v --extra-vars \'{"ceph_origin": "upstream", "ansible_ssh_user": "ubuntu", "journal_collocation": true, "osd_auto_discovery": true, "os_tuning_params": "[{\\"name\\": \\"kernel.pid_max\\", \\"value\\": 4194303},{\\"name\\": \\"fs.file-max\\", \\"value\\": 26234859}]", "ceph_stable": true, "journal_size": 1024}\' -i /tmp/teuth_ansible_hosts_al3Bpa --limit vpm137.front.sepia.ceph.com,vpm005.front.sepia.ceph.com,vpm067.front.sepia.ceph.com /var/lib/teuthworker/src/ceph-ansible_wip-teuthology/teuth_ansible_playbook_mu1Wzd' |
fail | 24824 | 2016-02-24 18:50:26 | 2016-02-24 18:59:38 | 2016-02-24 19:41:39 | 0:42:01 | 0:39:05 | 0:02:56 | vps | wip-ceph-ansible | centos | 7.2 | ceph-ansible/smoke/{0-clusters/3-node.yaml 1-distros/centos_7.2.yaml 2-setup/ceph_ansible.yaml 3-common/{os_tuning/vm_friendly.yaml source/upstream_stable.yaml} 4-osd/{devices/osd_auto_discovery.yaml journal/collocated.yaml} 5-tasks/rbd_import_export.yaml} | 3 | |
Failure Reason:
Command failed with status 3: 'ansible-playbook -v --extra-vars \'{"ceph_origin": "upstream", "ansible_ssh_user": "ubuntu", "journal_collocation": true, "osd_auto_discovery": true, "os_tuning_params": "[{\\"name\\": \\"kernel.pid_max\\", \\"value\\": 4194303},{\\"name\\": \\"fs.file-max\\", \\"value\\": 26234859}]", "ceph_stable": true, "journal_size": 1024}\' -i /tmp/teuth_ansible_hosts_x7fPUp --limit vpm089.front.sepia.ceph.com,vpm043.front.sepia.ceph.com,vpm148.front.sepia.ceph.com /var/lib/teuthworker/src/ceph-ansible_wip-teuthology/teuth_ansible_playbook_gBWZ9T' |