User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail | Dead |
---|---|---|---|---|---|---|---|---|---|---|
teuthology | 2015-06-17 00:18:01 | 2015-06-17 07:28:22 | 2015-06-18 20:21:13 | 1 day, 12:52:51 | upgrade:firefly-x | next | vps | — | 50 | 13 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 936500 | 2015-06-17 00:18:07 | 2015-06-17 07:23:14 | 2015-06-17 09:13:23 | 1:50:09 | 0:09:15 | 1:40:54 | vps | master | centos | 6.5 | upgrade:firefly-x/stress-split/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-mon/mona.yaml 5-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 6-next-mon/monb.yaml 7-workload/{radosbench.yaml rbd_api.yaml} 8-next-mon/monc.yaml 9-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} distros/centos_6.5.yaml} | 2 | |
Failure Reason: list index out of range
fail | 936501 | 2015-06-17 00:18:08 | 2015-06-17 07:26:48 | 2015-06-17 07:56:50 | 0:30:02 | 0:13:55 | 0:16:07 | vps | master | centos | 6.5 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/centos_6.5.yaml} | 3 | |
Failure Reason: Command failed on vpm177 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
fail | 936502 | 2015-06-17 00:18:08 | 2015-06-17 07:27:07 | 2015-06-17 08:09:10 | 0:42:03 | 0:09:39 | 0:32:24 | vps | master | centos | 6.5 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/centos_6.5.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
fail | 936503 | 2015-06-17 00:18:09 | 2015-06-17 07:28:22 | 2015-06-17 07:52:24 | 0:24:02 | 0:11:21 | 0:12:41 | vps | master | centos | 6.5 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/centos_6.5.yaml} | 3 | |
Failure Reason: Command failed on vpm051 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 936504 | 2015-06-17 00:18:09 | 2015-06-17 07:29:15 | 2015-06-17 08:03:17 | 0:34:02 | 0:06:08 | 0:27:54 | vps | master | debian | 7.0 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/debian_7.0.yaml} | 3 | |
Failure Reason: Command failed on vpm101 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 936505 | 2015-06-17 00:18:10 | 2015-06-17 07:31:22 | 2015-06-17 07:49:23 | 0:18:01 | 0:04:23 | 0:13:38 | vps | master | debian | 7.0 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/debian_7.0.yaml} | 3 | |
Failure Reason: Command failed on vpm047 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
fail | 936506 | 2015-06-17 00:18:10 | 2015-06-17 07:31:33 | 2015-06-17 08:03:34 | 0:32:01 | 0:04:52 | 0:27:09 | vps | master | debian | 7.0 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/debian_7.0.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
fail | 936507 | 2015-06-17 00:18:11 | 2015-06-17 07:31:39 | 2015-06-17 09:13:47 | 1:42:08 | 0:12:53 | 1:29:15 | vps | master | rhel | 6.4 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/rhel_6.4.yaml} | 2 | |
Failure Reason: list index out of range
fail | 936508 | 2015-06-17 00:18:11 | 2015-06-17 07:32:39 | 2015-06-17 16:33:22 | 9:00:43 | 0:10:44 | 8:49:59 | vps | master | rhel | 6.5 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/rhel_6.5.yaml} | 3 | |
Failure Reason: Command failed on vpm058 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 936509 | 2015-06-17 00:18:12 | 2015-06-17 07:33:07 | 2015-06-17 08:57:14 | 1:24:07 | 1:17:31 | 0:06:36 | vps | master | debian | 7.0 | upgrade:firefly-x/stress-split/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-mon/mona.yaml 5-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 6-next-mon/monb.yaml 7-workload/{radosbench.yaml rbd_api.yaml} 8-next-mon/monc.yaml 9-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} distros/debian_7.0.yaml} | 3 | |
Failure Reason: Command failed on vpm068 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op read 100 --op write 50 --op write_excl 50 --op delete 50 --pool unique_pool_3'
fail | 936510 | 2015-06-17 00:18:12 | 2015-06-17 07:35:30 | 2015-06-17 08:31:34 | 0:56:04 | 0:16:45 | 0:39:19 | vps | master | rhel | 6.4 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_6.4.yaml} | 3 | |
Failure Reason: Command failed on vpm191 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
fail | 936511 | 2015-06-17 00:18:13 | 2015-06-17 07:37:34 | 2015-06-17 08:03:35 | 0:26:01 | 0:13:32 | 0:12:29 | vps | master | rhel | 6.4 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_6.4.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
dead | 936512 | 2015-06-17 00:18:13 | 2015-06-17 07:39:12 | 2015-06-17 08:49:17 | 1:10:05 | | | vps | master | rhel | 7.0 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/rhel_7.0.yaml} | — |
dead | 936513 | 2015-06-17 00:18:14 | 2015-06-17 07:41:21 | 2015-06-17 08:57:26 | 1:16:05 | | | vps | master | ubuntu | 12.04 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/ubuntu_12.04.yaml} | — |
fail | 936514 | 2015-06-17 00:18:14 | 2015-06-17 07:44:04 | 2015-06-17 08:18:07 | 0:34:03 | 0:15:01 | 0:19:02 | vps | master | rhel | 6.5 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_6.5.yaml} | 3 | |
Failure Reason: Command failed on vpm106 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
fail | 936515 | 2015-06-17 00:18:15 | 2015-06-17 07:44:07 | 2015-06-17 08:44:11 | 1:00:04 | 0:14:38 | 0:45:26 | vps | master | rhel | 6.5 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_6.5.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
fail | 936516 | 2015-06-17 00:18:15 | 2015-06-17 07:49:27 | 2015-06-17 08:17:29 | 0:28:02 | 0:08:32 | 0:19:30 | vps | master | ubuntu | 14.04 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/ubuntu_14.04.yaml} | 3 | |
Failure Reason: Command failed on vpm014 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 936517 | 2015-06-17 00:18:15 | 2015-06-17 07:50:36 | 2015-06-17 08:14:37 | 0:24:01 | 0:10:47 | 0:13:14 | vps | master | centos | 6.5 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/centos_6.5.yaml} | 3 | |
Failure Reason: Command failed on vpm197 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 936518 | 2015-06-17 00:18:16 | 2015-06-17 07:52:25 | 2015-06-17 09:58:35 | 2:06:10 | 1:20:20 | 0:45:50 | vps | master | rhel | 6.4 | upgrade:firefly-x/stress-split/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-mon/mona.yaml 5-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 6-next-mon/monb.yaml 7-workload/{radosbench.yaml rbd_api.yaml} 8-next-mon/monc.yaml 9-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} distros/rhel_6.4.yaml} | 3 | |
Failure Reason: Command failed on vpm082 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op read 100 --op write 50 --op write_excl 50 --op delete 50 --pool unique_pool_3'
fail | 936519 | 2015-06-17 00:18:17 | 2015-06-17 07:52:27 | 2015-06-17 08:12:29 | 0:20:02 | | | vps | master | rhel | 7.0 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_7.0.yaml} | 3 |
Failure Reason: Could not reconnect to ubuntu@vpm168.front.sepia.ceph.com
fail | 936520 | 2015-06-17 00:18:17 | 2015-06-17 07:52:34 | 2015-06-17 08:34:38 | 0:42:04 | 0:09:51 | 0:32:13 | vps | master | rhel | 7.0 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_7.0.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
fail | 936521 | 2015-06-17 00:18:17 | 2015-06-17 07:53:34 | 2015-06-17 09:01:39 | 1:08:05 | 0:09:25 | 0:58:40 | vps | master | debian | 7.0 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/debian_7.0.yaml} | 2 | |
Failure Reason: list index out of range
fail | 936522 | 2015-06-17 00:18:18 | 2015-06-17 07:54:34 | 2015-06-17 08:30:37 | 0:36:03 | 0:14:16 | 0:21:47 | vps | master | rhel | 6.4 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/rhel_6.4.yaml} | 3 | |
Failure Reason: Command failed on vpm056 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 936523 | 2015-06-17 00:18:18 | 2015-06-17 07:55:15 | 2015-06-17 08:31:17 | 0:36:02 | 0:06:04 | 0:29:58 | vps | master | ubuntu | 12.04 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/ubuntu_12.04.yaml} | 3 | |
Failure Reason: Command failed on vpm053 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
fail | 936524 | 2015-06-17 00:18:19 | 2015-06-17 07:56:24 | 2015-06-17 08:26:26 | 0:30:02 | 0:07:39 | 0:22:23 | vps | master | ubuntu | 12.04 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/ubuntu_12.04.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
fail | 936525 | 2015-06-17 00:18:19 | 2015-06-17 07:56:54 | 2015-06-17 08:26:55 | 0:30:01 | 0:15:50 | 0:14:11 | vps | master | rhel | 6.5 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/rhel_6.5.yaml} | 3 | |
Failure Reason: Command failed on vpm120 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
dead | 936526 | 2015-06-17 00:18:20 | 2015-06-17 07:59:35 | 2015-06-17 08:49:39 | 0:50:04 | 0:27:10 | 0:22:54 | vps | master | rhel | 7.0 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/rhel_7.0.yaml} | 3 | |
Failure Reason: Command failed on vpm061 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 936527 | 2015-06-17 00:18:20 | 2015-06-17 08:03:21 | 2015-06-17 10:03:30 | 2:00:09 | 1:20:07 | 0:40:02 | vps | master | rhel | 6.5 | upgrade:firefly-x/stress-split/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-mon/mona.yaml 5-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 6-next-mon/monb.yaml 7-workload/{radosbench.yaml rbd_api.yaml} 8-next-mon/monc.yaml 9-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} distros/rhel_6.5.yaml} | 3 | |
Failure Reason: Command failed on vpm041 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op read 100 --op write 50 --op write_excl 50 --op delete 50 --pool unique_pool_3'
dead | 936528 | 2015-06-17 00:18:21 | 2015-06-17 08:03:38 | 2015-06-17 08:47:41 | 0:44:03 | 0:17:52 | 0:26:11 | vps | master | ubuntu | 14.04 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/ubuntu_14.04.yaml} | 3 | |
Failure Reason: SSH connection to vpm060 was lost: u'sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install ceph=0.80.9-258-g34ba371-1trusty ceph-dbg=0.80.9-258-g34ba371-1trusty ceph-mds=0.80.9-258-g34ba371-1trusty ceph-mds-dbg=0.80.9-258-g34ba371-1trusty ceph-common=0.80.9-258-g34ba371-1trusty ceph-common-dbg=0.80.9-258-g34ba371-1trusty ceph-fuse=0.80.9-258-g34ba371-1trusty ceph-fuse-dbg=0.80.9-258-g34ba371-1trusty ceph-test=0.80.9-258-g34ba371-1trusty ceph-test-dbg=0.80.9-258-g34ba371-1trusty radosgw=0.80.9-258-g34ba371-1trusty radosgw-dbg=0.80.9-258-g34ba371-1trusty python-ceph=0.80.9-258-g34ba371-1trusty libcephfs1=0.80.9-258-g34ba371-1trusty libcephfs1-dbg=0.80.9-258-g34ba371-1trusty libcephfs-java=0.80.9-258-g34ba371-1trusty libcephfs-jni=0.80.9-258-g34ba371-1trusty librados2=0.80.9-258-g34ba371-1trusty librados2-dbg=0.80.9-258-g34ba371-1trusty librbd1=0.80.9-258-g34ba371-1trusty librbd1-dbg=0.80.9-258-g34ba371-1trusty rbd-fuse=0.80.9-258-g34ba371-1trusty librados2=0.80.9-258-g34ba371-1trusty librados2-dbg=0.80.9-258-g34ba371-1trusty librbd1=0.80.9-258-g34ba371-1trusty librbd1-dbg=0.80.9-258-g34ba371-1trusty'
fail | 936529 | 2015-06-17 00:18:21 | 2015-06-17 08:03:41 | 2015-06-17 08:41:44 | 0:38:03 | 0:06:45 | 0:31:18 | vps | master | ubuntu | 14.04 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/ubuntu_14.04.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
fail | 936530 | 2015-06-17 00:18:22 | 2015-06-17 08:06:10 | 2015-06-17 08:32:11 | 0:26:01 | 0:08:20 | 0:17:41 | vps | master | ubuntu | 12.04 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/ubuntu_12.04.yaml} | 3 | |
Failure Reason: Command failed on vpm047 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 936531 | 2015-06-17 00:18:22 | 2015-06-17 08:07:22 | 2015-06-17 08:43:25 | 0:36:03 | 0:10:14 | 0:25:49 | vps | master | ubuntu | 14.04 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/ubuntu_14.04.yaml} | 3 | |
Failure Reason: Command failed on vpm106 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 936532 | 2015-06-17 00:18:23 | 2015-06-17 08:09:13 | 2015-06-17 08:47:16 | 0:38:03 | 0:13:34 | 0:24:29 | vps | master | centos | 6.5 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/centos_6.5.yaml} | 3 | |
Failure Reason: Command failed on vpm014 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
fail | 936533 | 2015-06-17 00:18:23 | 2015-06-17 08:09:39 | 2015-06-17 08:31:40 | 0:22:01 | 0:09:08 | 0:12:53 | vps | master | centos | 6.5 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/centos_6.5.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
fail | 936534 | 2015-06-17 00:18:24 | 2015-06-17 08:12:05 | 2015-06-17 08:54:08 | 0:42:03 | 0:10:41 | 0:31:22 | vps | master | centos | 6.5 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/centos_6.5.yaml} | 3 | |
Failure Reason: Command failed on vpm164 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
dead | 936535 | 2015-06-17 00:18:24 | 2015-06-17 08:12:32 | 2015-06-17 08:54:35 | 0:42:03 | | | vps | master | debian | 7.0 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/debian_7.0.yaml} | — |
fail | 936536 | 2015-06-17 00:18:25 | 2015-06-17 08:13:02 | 2015-06-17 10:15:11 | 2:02:09 | 1:14:54 | 0:47:15 | vps | master | rhel | 7.0 | upgrade:firefly-x/stress-split/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-mon/mona.yaml 5-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 6-next-mon/monb.yaml 7-workload/{radosbench.yaml rbd_api.yaml} 8-next-mon/monc.yaml 9-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} distros/rhel_7.0.yaml} | 3 | |
Failure Reason: Command failed on vpm044 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op read 100 --op write 50 --op write_excl 50 --op delete 50 --pool unique_pool_3'
fail | 936537 | 2015-06-17 00:18:25 | 2015-06-17 08:13:18 | 2015-06-17 09:01:21 | 0:48:03 | 0:14:14 | 0:33:49 | vps | master | debian | 7.0 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/debian_7.0.yaml} | 3 | |
Failure Reason: Command failed on vpm199 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
fail | 936538 | 2015-06-17 00:18:25 | 2015-06-17 08:14:41 | 2015-06-17 08:26:42 | 0:12:01 | 0:05:08 | 0:06:53 | vps | master | debian | 7.0 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/debian_7.0.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
fail | 936539 | 2015-06-17 00:18:26 | 2015-06-17 08:16:26 | 2015-06-17 08:54:29 | 0:38:03 | 0:13:43 | 0:24:20 | vps | master | rhel | 6.4 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/rhel_6.4.yaml} | 3 | |
Failure Reason: Command failed on vpm053 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
dead | 936540 | 2015-06-17 00:18:26 | 2015-06-17 08:16:50 | 2015-06-17 08:50:52 | 0:34:02 | — | — | vps | master | rhel | 6.5 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/rhel_6.5.yaml} | — |
fail | 936541 | 2015-06-17 00:18:27 | 2015-06-17 08:17:32 | 2015-06-17 09:15:36 | 0:58:04 | 0:14:25 | 0:43:39 | vps | master | rhel | 6.4 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_6.4.yaml} | 3 | |
Failure Reason: Command failed on vpm097 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
fail | 936542 | 2015-06-17 00:18:27 | 2015-06-17 08:18:09 | 2015-06-17 09:04:12 | 0:46:03 | 0:09:47 | 0:36:16 | vps | master | rhel | 6.4 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_6.4.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
dead | 936543 | 2015-06-17 00:18:28 | 2015-06-17 08:18:11 | 2015-06-18 20:21:13 | 1 day, 12:03:02 | — | — | vps | master | rhel | 7.0 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/rhel_7.0.yaml} | 3 |
fail | 936544 | 2015-06-17 00:18:28 | 2015-06-17 08:19:20 | 2015-06-17 09:01:23 | 0:42:03 | 0:06:35 | 0:35:28 | vps | master | ubuntu | 12.04 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/ubuntu_12.04.yaml} | 3 | |
Failure Reason: Command failed on vpm104 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 936545 | 2015-06-17 00:18:29 | 2015-06-17 08:21:19 | 2015-06-17 10:13:28 | 1:52:09 | 1:30:00 | 0:22:09 | vps | master | ubuntu | 12.04 | upgrade:firefly-x/stress-split/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-mon/mona.yaml 5-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 6-next-mon/monb.yaml 7-workload/{radosbench.yaml rbd_api.yaml} 8-next-mon/monc.yaml 9-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} distros/ubuntu_12.04.yaml} | 3 | |
Failure Reason: Command failed on vpm191 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op read 100 --op write 50 --op write_excl 50 --op delete 50 --pool unique_pool_3'
fail | 936546 | 2015-06-17 00:18:29 | 2015-06-17 08:26:47 | 2015-06-17 09:14:50 | 0:48:03 | 0:10:57 | 0:37:06 | vps | master | rhel | 6.5 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_6.5.yaml} | 2 | |
Failure Reason: list index out of range
dead | 936547 | 2015-06-17 00:18:30 | 2015-06-17 08:26:47 | 2015-06-17 08:56:49 | 0:30:02 | — | — | vps | master | rhel | 6.5 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_6.5.yaml} | — |
fail | 936548 | 2015-06-17 00:18:30 | 2015-06-17 08:26:59 | 2015-06-17 09:07:02 | 0:40:03 | 0:07:16 | 0:32:47 | vps | master | ubuntu | 14.04 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/ubuntu_14.04.yaml} | 3 | |
Failure Reason: Command failed on vpm098 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 936549 | 2015-06-17 00:18:31 | 2015-06-17 08:29:08 | 2015-06-17 08:59:10 | 0:30:02 | 0:10:30 | 0:19:32 | vps | master | centos | 6.5 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/centos_6.5.yaml} | 3 | |
Failure Reason: Command failed on vpm111 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 936550 | 2015-06-17 00:18:31 | 2015-06-17 08:29:23 | 2015-06-17 09:03:26 | 0:34:03 | 0:15:44 | 0:18:19 | vps | master | rhel | 7.0 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_7.0.yaml} | 3 | |
Failure Reason: Command failed on vpm047 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
fail | 936551 | 2015-06-17 00:18:32 | 2015-06-17 08:30:41 | 2015-06-17 09:02:43 | 0:32:02 | 0:15:05 | 0:16:57 | vps | master | rhel | 7.0 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_7.0.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
fail | 936552 | 2015-06-17 00:18:32 | 2015-06-17 08:31:21 | 2015-06-17 09:07:23 | 0:36:02 | 0:06:13 | 0:29:49 | vps | master | debian | 7.0 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/debian_7.0.yaml} | 3 | |
Failure Reason: Command failed on vpm121 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 936553 | 2015-06-17 00:18:33 | 2015-06-17 08:31:39 | 2015-06-17 09:01:41 | 0:30:02 | 0:14:47 | 0:15:15 | vps | master | rhel | 6.4 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/rhel_6.4.yaml} | 3 | |
Failure Reason: Command failed on vpm110 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 936554 | 2015-06-17 00:18:33 | 2015-06-17 08:31:44 | 2015-06-17 10:11:52 | 1:40:08 | 1:13:34 | 0:26:34 | vps | master | ubuntu | 14.04 | upgrade:firefly-x/stress-split/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-mon/mona.yaml 5-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 6-next-mon/monb.yaml 7-workload/{radosbench.yaml rbd_api.yaml} 8-next-mon/monc.yaml 9-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} distros/ubuntu_14.04.yaml} | 3 | |
Failure Reason: Command failed on vpm146 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op read 100 --op write 50 --op write_excl 50 --op delete 50 --pool unique_pool_3'
fail | 936555 | 2015-06-17 00:18:34 | 2015-06-17 08:32:15 | 2015-06-17 09:10:17 | 0:38:02 | 0:07:56 | 0:30:06 | vps | master | ubuntu | 12.04 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/ubuntu_12.04.yaml} | 3 | |
Failure Reason: Command failed on vpm128 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
dead | 936556 | 2015-06-17 00:18:34 | 2015-06-17 08:32:38 | 2015-06-17 08:56:39 | 0:24:01 | — | — | vps | master | ubuntu | 12.04 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/ubuntu_12.04.yaml} | — |
fail | 936557 | 2015-06-17 00:18:35 | 2015-06-17 08:32:57 | 2015-06-17 09:17:00 | 0:44:03 | 0:12:14 | 0:31:49 | vps | master | rhel | 6.5 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/rhel_6.5.yaml} | 2 | |
Failure Reason: list index out of range
dead | 936558 | 2015-06-17 00:18:35 | 2015-06-17 08:34:41 | 2015-06-17 08:50:42 | 0:16:01 | — | — | vps | master | rhel | 7.0 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/rhel_7.0.yaml} | — |
fail | 936559 | 2015-06-17 00:18:36 | 2015-06-17 08:37:02 | 2015-06-17 09:17:05 | 0:40:03 | 0:05:27 | 0:34:36 | vps | master | ubuntu | 14.04 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/ubuntu_14.04.yaml} | 2 | |
Failure Reason: list index out of range
dead | 936560 | 2015-06-17 00:18:36 | 2015-06-17 08:41:21 | 2015-06-17 08:53:22 | 0:12:01 | — | — | vps | master | ubuntu | 14.04 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/ubuntu_14.04.yaml} | — |
dead | 936561 | 2015-06-17 00:18:37 | 2015-06-17 08:41:48 | 2015-06-17 08:49:48 | 0:08:00 | — | — | vps | master | ubuntu | 12.04 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/ubuntu_12.04.yaml} | — |
dead | 936562 | 2015-06-17 00:18:37 | 2015-06-17 08:42:34 | 2015-06-17 09:00:35 | 0:18:01 | — | — | vps | master | ubuntu | 14.04 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/ubuntu_14.04.yaml} | —
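
For reference, the two short failure reasons that recur above, "need more than 0 values to unpack" and "list index out of range", are standard Python exception messages raised inside the test harness itself rather than by a failed shell command. A minimal sketch of what produces each message (illustrative triggers only, not the actual teuthology code paths):

```python
# Illustrative only: minimal triggers for the two recurring failure
# messages in the table above. Not taken from the teuthology source.

def unpack_empty():
    """ValueError from unpacking an empty sequence. Python 2 phrased this
    'need more than 0 values to unpack'; Python 3 says 'not enough values
    to unpack (expected 2, got 0)'."""
    try:
        a, b = []          # nothing to unpack into a, b
    except ValueError as e:
        return str(e)


def index_empty():
    """IndexError with the message 'list index out of range'."""
    try:
        return [][0]       # indexing into an empty list
    except IndexError as e:
        return str(e)
```

The "need more than 0 values to unpack" wording in particular is the Python 2 form of this ValueError, consistent with the Python 2 teuthology runtime of this era.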