User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Fail |
---|---|---|---|---|---|---|---|---|---|
teuthology | 2015-06-12 00:18:05 | 2015-06-12 02:25:35 | 2015-06-12 05:25:01 | 2:59:26 | upgrade:firefly-x | next | vps | — | 63 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 929656 | 2015-06-12 00:18:36 | 2015-06-12 02:04:14 | 2015-06-12 04:46:29 | 2:42:15 | 1:17:15 | 1:25:00 | vps | master | centos | 6.5 | upgrade:firefly-x/stress-split/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-mon/mona.yaml 5-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 6-next-mon/monb.yaml 7-workload/{radosbench.yaml rbd_api.yaml} 8-next-mon/monc.yaml 9-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} distros/centos_6.5.yaml} | 3 | |
Failure Reason:
Command failed on vpm080 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op read 100 --op write 50 --op write_excl 50 --op delete 50 --pool unique_pool_3' |
fail | 929657 | 2015-06-12 00:18:36 | 2015-06-12 02:04:43 | 2015-06-12 03:20:35 | 1:15:52 | 0:11:50 | 1:04:02 | vps | master | centos | 6.5 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/centos_6.5.yaml} | 3 | |
Failure Reason:
Command failed on vpm059 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0' |
fail | 929658 | 2015-06-12 00:18:37 | 2015-06-12 02:04:46 | 2015-06-12 03:52:55 | 1:48:09 | 0:08:40 | 1:39:29 | vps | master | centos | 6.5 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/centos_6.5.yaml} | 3 | |
Failure Reason:
need more than 0 values to unpack |
fail | 929659 | 2015-06-12 00:18:37 | 2015-06-12 02:07:20 | 2015-06-12 04:05:27 | 1:58:07 | 0:10:40 | 1:47:27 | vps | master | centos | 6.5 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/centos_6.5.yaml} | 3 | |
Failure Reason:
Command failed on vpm156 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase' |
fail | 929660 | 2015-06-12 00:18:38 | 2015-06-12 02:07:30 | 2015-06-12 02:25:31 | 0:18:01 | 0:06:19 | 0:11:42 | vps | master | debian | 7.0 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/debian_7.0.yaml} | 3 | |
Failure Reason:
Command failed on vpm139 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase' |
fail | 929661 | 2015-06-12 00:18:38 | 2015-06-12 02:09:19 | 2015-06-12 03:25:26 | 1:16:07 | 0:08:55 | 1:07:12 | vps | master | debian | 7.0 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/debian_7.0.yaml} | 3 | |
Failure Reason:
Command failed on vpm075 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0' |
fail | 929662 | 2015-06-12 00:18:39 | 2015-06-12 02:10:41 | 2015-06-12 03:26:48 | 1:16:07 | 0:04:59 | 1:11:08 | vps | master | debian | 7.0 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/debian_7.0.yaml} | 3 | |
Failure Reason:
need more than 0 values to unpack |
fail | 929663 | 2015-06-12 00:18:39 | 2015-06-12 02:14:25 | 2015-06-12 03:12:30 | 0:58:05 | 0:13:25 | 0:44:40 | vps | master | rhel | 6.4 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/rhel_6.4.yaml} | 3 | |
Failure Reason:
Command failed on vpm075 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase' |
fail | 929664 | 2015-06-12 00:18:40 | 2015-06-12 02:14:41 | 2015-06-12 04:06:52 | 1:52:11 | 0:14:34 | 1:37:37 | vps | master | rhel | 6.5 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/rhel_6.5.yaml} | 3 | |
Failure Reason:
Command failed on vpm061 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase' |
fail | 929665 | 2015-06-12 00:18:40 | 2015-06-12 02:15:36 | 2015-06-12 04:05:46 | 1:50:10 | 1:15:51 | 0:34:19 | vps | master | debian | 7.0 | upgrade:firefly-x/stress-split/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-mon/mona.yaml 5-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 6-next-mon/monb.yaml 7-workload/{radosbench.yaml rbd_api.yaml} 8-next-mon/monc.yaml 9-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} distros/debian_7.0.yaml} | 3 | |
Failure Reason:
Command failed on vpm044 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op read 100 --op write 50 --op write_excl 50 --op delete 50 --pool unique_pool_3' |
fail | 929666 | 2015-06-12 00:18:41 | 2015-06-12 02:18:31 | 2015-06-12 03:44:39 | 1:26:08 | 0:18:49 | 1:07:19 | vps | master | rhel | 6.4 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_6.4.yaml} | 3 | |
Failure Reason:
Command failed on vpm047 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0' |
fail | 929667 | 2015-06-12 00:18:41 | 2015-06-12 02:21:21 | 2015-06-12 04:09:32 | 1:48:11 | 0:10:10 | 1:38:01 | vps | master | rhel | 6.4 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_6.4.yaml} | 3 | |
Failure Reason:
need more than 0 values to unpack |
fail | 929668 | 2015-06-12 00:18:41 | 2015-06-12 02:25:35 | 2015-06-12 02:59:38 | 0:34:03 | 0:10:02 | 0:24:01 | vps | master | rhel | 7.0 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/rhel_7.0.yaml} | 3 | |
Failure Reason:
Command failed on vpm015 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase' |
fail | 929669 | 2015-06-12 00:18:42 | 2015-06-12 02:29:08 | 2015-06-12 04:17:18 | 1:48:10 | 0:06:45 | 1:41:25 | vps | master | ubuntu | 12.04 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/ubuntu_12.04.yaml} | 3 | |
Failure Reason:
Command failed on vpm014 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase' |
fail | 929670 | 2015-06-12 00:18:43 | 2015-06-12 02:29:22 | 2015-06-12 03:49:30 | 1:20:08 | 0:17:29 | 1:02:39 | vps | master | rhel | 6.5 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_6.5.yaml} | 3 | |
Failure Reason:
Command failed on vpm119 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0' |
fail | 929671 | 2015-06-12 00:18:43 | 2015-06-12 02:31:21 | 2015-06-12 03:09:24 | 0:38:03 | 0:13:29 | 0:24:34 | vps | master | rhel | 6.5 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_6.5.yaml} | 3 | |
Failure Reason:
need more than 0 values to unpack |
fail | 929672 | 2015-06-12 00:18:43 | 2015-06-12 02:33:21 | 2015-06-12 03:09:24 | 0:36:03 | 0:07:12 | 0:28:51 | vps | master | ubuntu | 14.04 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/ubuntu_14.04.yaml} | 3 | |
Failure Reason:
Command failed on vpm158 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase' |
fail | 929673 | 2015-06-12 00:18:44 | 2015-06-12 02:37:26 | 2015-06-12 03:59:33 | 1:22:07 | 0:10:53 | 1:11:14 | vps | master | centos | 6.5 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/centos_6.5.yaml} | 3 | |
Failure Reason:
Command failed on vpm058 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase' |
fail | 929674 | 2015-06-12 00:18:44 | 2015-06-12 02:41:02 | 2015-06-12 05:05:15 | 2:24:13 | 1:20:39 | 1:03:34 | vps | master | rhel | 6.4 | upgrade:firefly-x/stress-split/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-mon/mona.yaml 5-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 6-next-mon/monb.yaml 7-workload/{radosbench.yaml rbd_api.yaml} 8-next-mon/monc.yaml 9-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} distros/rhel_6.4.yaml} | 3 | |
Failure Reason:
Command failed on vpm159 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op read 100 --op write 50 --op write_excl 50 --op delete 50 --pool unique_pool_3' |
fail | 929675 | 2015-06-12 00:18:45 | 2015-06-12 02:46:09 | 2015-06-12 03:28:13 | 0:42:04 | 0:10:16 | 0:31:48 | vps | master | rhel | 7.0 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_7.0.yaml} | 3 | |
Failure Reason:
Command failed on vpm189 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0' |
fail | 929676 | 2015-06-12 00:18:45 | 2015-06-12 02:46:38 | 2015-06-12 03:20:41 | 0:34:03 | 0:08:05 | 0:25:58 | vps | master | rhel | 7.0 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_7.0.yaml} | 3 | |
Failure Reason:
need more than 0 values to unpack |
fail | 929677 | 2015-06-12 00:18:46 | 2015-06-12 02:48:44 | 2015-06-12 04:30:53 | 1:42:09 | 0:07:32 | 1:34:37 | vps | master | debian | 7.0 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/debian_7.0.yaml} | 3 | |
Failure Reason:
Command failed on vpm047 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase' |
fail | 929678 | 2015-06-12 00:18:46 | 2015-06-12 02:49:13 | 2015-06-12 03:37:17 | 0:48:04 | 0:14:59 | 0:33:05 | vps | master | rhel | 6.4 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/rhel_6.4.yaml} | 3 | |
Failure Reason:
Command failed on vpm058 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase' |
fail | 929679 | 2015-06-12 00:18:47 | 2015-06-12 02:53:27 | 2015-06-12 03:37:30 | 0:44:03 | 0:08:12 | 0:35:51 | vps | master | ubuntu | 12.04 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/ubuntu_12.04.yaml} | 3 | |
Failure Reason:
Command failed on vpm061 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0' |
fail | 929680 | 2015-06-12 00:18:47 | 2015-06-12 02:55:56 | 2015-06-12 04:30:05 | 1:34:09 | 0:07:53 | 1:26:16 | vps | master | ubuntu | 12.04 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/ubuntu_12.04.yaml} | 3 | |
Failure Reason:
need more than 0 values to unpack |
fail | 929681 | 2015-06-12 00:18:48 | 2015-06-12 02:57:12 | 2015-06-12 03:47:15 | 0:50:03 | 0:14:30 | 0:35:33 | vps | master | rhel | 6.5 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/rhel_6.5.yaml} | 3 | |
Failure Reason:
Command failed on vpm072 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase' |
fail | 929682 | 2015-06-12 00:18:48 | 2015-06-12 02:58:21 | 2015-06-12 03:26:23 | 0:28:02 | 0:10:50 | 0:17:12 | vps | master | rhel | 7.0 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/rhel_7.0.yaml} | 3 | |
Failure Reason:
Command failed on vpm164 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase' |
fail | 929683 | 2015-06-12 00:18:49 | 2015-06-12 02:59:00 | 2015-06-12 05:15:13 | 2:16:13 | 1:16:01 | 1:00:12 | vps | master | rhel | 6.5 | upgrade:firefly-x/stress-split/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-mon/mona.yaml 5-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 6-next-mon/monb.yaml 7-workload/{radosbench.yaml rbd_api.yaml} 8-next-mon/monc.yaml 9-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} distros/rhel_6.5.yaml} | 3 | |
Failure Reason:
Command failed on vpm119 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op read 100 --op write 50 --op write_excl 50 --op delete 50 --pool unique_pool_3' |
fail | 929684 | 2015-06-12 00:18:49 | 2015-06-12 02:59:26 | 2015-06-12 04:29:34 | 1:30:08 | 0:08:22 | 1:21:46 | vps | master | ubuntu | 14.04 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/ubuntu_14.04.yaml} | 3 | |
Failure Reason:
Command failed on vpm061 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0' |
fail | 929685 | 2015-06-12 00:18:50 | 2015-06-12 02:59:42 | 2015-06-12 04:17:49 | 1:18:07 | 0:10:03 | 1:08:04 | vps | master | ubuntu | 14.04 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/ubuntu_14.04.yaml} | 3 | |
Failure Reason:
need more than 0 values to unpack |
fail | 929686 | 2015-06-12 00:18:50 | 2015-06-12 03:00:55 | 2015-06-12 03:32:58 | 0:32:03 | 0:08:15 | 0:23:48 | vps | master | ubuntu | 12.04 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/ubuntu_12.04.yaml} | 3 | |
Failure Reason:
Command failed on vpm069 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase' |
fail | 929687 | 2015-06-12 00:18:51 | 2015-06-12 03:04:58 | 2015-06-12 03:29:00 | 0:24:02 | 0:07:28 | 0:16:34 | vps | master | ubuntu | 14.04 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/ubuntu_14.04.yaml} | 3 | |
Failure Reason:
Command failed on vpm066 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase' |
fail | 929688 | 2015-06-12 00:18:51 | 2015-06-12 03:06:48 | 2015-06-12 04:40:57 | 1:34:09 | 0:11:57 | 1:22:12 | vps | master | centos | 6.5 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/centos_6.5.yaml} | 3 | |
Failure Reason:
Command failed on vpm142 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0' |
fail | 929689 | 2015-06-12 00:18:52 | 2015-06-12 03:09:28 | 2015-06-12 03:57:32 | 0:48:04 | 0:09:45 | 0:38:19 | vps | master | centos | 6.5 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/centos_6.5.yaml} | 3 | |
Failure Reason:
need more than 0 values to unpack |
fail | 929690 | 2015-06-12 00:18:52 | 2015-06-12 03:09:30 | 2015-06-12 04:45:39 | 1:36:09 | 0:10:58 | 1:25:11 | vps | master | centos | 6.5 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/centos_6.5.yaml} | 3 | |
Failure Reason:
Command failed on vpm014 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase' |
fail | 929691 | 2015-06-12 00:18:53 | 2015-06-12 03:12:34 | 2015-06-12 03:38:35 | 0:26:01 | 0:06:16 | 0:19:45 | vps | master | debian | 7.0 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/debian_7.0.yaml} | 3 | |
Failure Reason:
Command failed on vpm097 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase' |
fail | 929692 | 2015-06-12 00:18:53 | 2015-06-12 03:12:49 | 2015-06-12 05:25:01 | 2:12:12 | 1:12:57 | 0:59:15 | vps | master | rhel | 7.0 | upgrade:firefly-x/stress-split/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-mon/mona.yaml 5-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 6-next-mon/monb.yaml 7-workload/{radosbench.yaml rbd_api.yaml} 8-next-mon/monc.yaml 9-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} distros/rhel_7.0.yaml} | 3 | |
Failure Reason: Command failed on vpm038 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op read 100 --op write 50 --op write_excl 50 --op delete 50 --pool unique_pool_3'
fail | 929693 | 2015-06-12 00:18:54 | 2015-06-12 03:13:21 | 2015-06-12 04:23:27 | 1:10:06 | 0:06:42 | 1:03:24 | vps | master | debian | 7.0 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/debian_7.0.yaml} | 3 | |
Failure Reason: Command failed on vpm138 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
fail | 929694 | 2015-06-12 00:18:54 | 2015-06-12 03:16:33 | 2015-06-12 03:34:33 | 0:18:00 | 0:05:32 | 0:12:28 | vps | master | debian | 7.0 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/debian_7.0.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
fail | 929695 | 2015-06-12 00:18:54 | 2015-06-12 03:17:28 | 2015-06-12 03:43:31 | 0:26:03 | 0:14:09 | 0:11:54 | vps | master | rhel | 6.4 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/rhel_6.4.yaml} | 3 | |
Failure Reason: Command failed on vpm102 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 929696 | 2015-06-12 00:18:55 | 2015-06-12 03:18:44 | 2015-06-12 04:50:52 | 1:32:08 | 0:14:17 | 1:17:51 | vps | master | rhel | 6.5 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/rhel_6.5.yaml} | 3 | |
Failure Reason: Command failed on vpm154 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 929697 | 2015-06-12 00:18:55 | 2015-06-12 03:19:06 | 2015-06-12 04:53:15 | 1:34:09 | 0:17:42 | 1:16:27 | vps | master | rhel | 6.4 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_6.4.yaml} | 3 | |
Failure Reason: Command failed on vpm071 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
fail | 929698 | 2015-06-12 00:18:56 | 2015-06-12 03:20:39 | 2015-06-12 04:10:43 | 0:50:04 | 0:11:15 | 0:38:49 | vps | master | rhel | 6.4 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_6.4.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
fail | 929699 | 2015-06-12 00:18:56 | 2015-06-12 03:20:44 | 2015-06-12 03:50:47 | 0:30:03 | 0:10:33 | 0:19:30 | vps | master | rhel | 7.0 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/rhel_7.0.yaml} | 3 | |
Failure Reason: Command failed on vpm162 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 929700 | 2015-06-12 00:18:57 | 2015-06-12 03:21:32 | 2015-06-12 04:17:37 | 0:56:05 | 0:09:21 | 0:46:44 | vps | master | ubuntu | 12.04 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/ubuntu_12.04.yaml} | 3 | |
Failure Reason: Command failed on vpm050 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 929701 | 2015-06-12 00:18:57 | 2015-06-12 03:22:47 | 2015-06-12 04:48:55 | 1:26:08 | 1:09:21 | 0:16:47 | vps | master | ubuntu | 12.04 | upgrade:firefly-x/stress-split/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-mon/mona.yaml 5-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 6-next-mon/monb.yaml 7-workload/{radosbench.yaml rbd_api.yaml} 8-next-mon/monc.yaml 9-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} distros/ubuntu_12.04.yaml} | 3 | |
Failure Reason: Command failed on vpm183 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op read 100 --op write 50 --op write_excl 50 --op delete 50 --pool unique_pool_3'
fail | 929702 | 2015-06-12 00:18:58 | 2015-06-12 03:23:54 | 2015-06-12 03:59:57 | 0:36:03 | 0:17:39 | 0:18:24 | vps | master | rhel | 6.5 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_6.5.yaml} | 3 | |
Failure Reason: Command failed on vpm191 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
fail | 929703 | 2015-06-12 00:18:58 | 2015-06-12 03:25:29 | 2015-06-12 04:13:33 | 0:48:04 | 0:15:16 | 0:32:48 | vps | master | rhel | 6.5 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_6.5.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
fail | 929704 | 2015-06-12 00:18:59 | 2015-06-12 03:26:27 | 2015-06-12 04:42:33 | 1:16:06 | 0:08:14 | 1:07:52 | vps | master | ubuntu | 14.04 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/ubuntu_14.04.yaml} | 3 | |
Failure Reason: Command failed on vpm162 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 929705 | 2015-06-12 00:18:59 | 2015-06-12 03:26:51 | 2015-06-12 04:16:56 | 0:50:05 | 0:11:48 | 0:38:17 | vps | master | centos | 6.5 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/centos_6.5.yaml} | 3 | |
Failure Reason: Command failed on vpm162 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 929706 | 2015-06-12 00:19:00 | 2015-06-12 03:28:16 | 2015-06-12 04:44:23 | 1:16:07 | 0:12:33 | 1:03:34 | vps | master | rhel | 7.0 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_7.0.yaml} | 3 | |
Failure Reason: Command failed on vpm149 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
fail | 929707 | 2015-06-12 00:19:00 | 2015-06-12 03:29:03 | 2015-06-12 04:13:07 | 0:44:04 | 0:07:45 | 0:36:19 | vps | master | rhel | 7.0 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/rhel_7.0.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
fail | 929708 | 2015-06-12 00:19:01 | 2015-06-12 03:31:49 | 2015-06-12 04:39:55 | 1:08:06 | 0:06:21 | 1:01:45 | vps | master | debian | 7.0 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/debian_7.0.yaml} | 3 | |
Failure Reason: Command failed on vpm061 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 929709 | 2015-06-12 00:19:01 | 2015-06-12 03:32:12 | 2015-06-12 04:18:16 | 0:46:04 | 0:13:55 | 0:32:09 | vps | master | rhel | 6.4 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/rhel_6.4.yaml} | 3 | |
Failure Reason: Command failed on vpm129 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 929710 | 2015-06-12 00:19:02 | 2015-06-12 03:33:01 | 2015-06-12 05:13:10 | 1:40:09 | 1:16:30 | 0:23:39 | vps | master | ubuntu | 14.04 | upgrade:firefly-x/stress-split/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-mon/mona.yaml 5-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 6-next-mon/monb.yaml 7-workload/{radosbench.yaml rbd_api.yaml} 8-next-mon/monc.yaml 9-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} distros/ubuntu_14.04.yaml} | 3 | |
Failure Reason: Command failed on vpm012 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op read 100 --op write 50 --op write_excl 50 --op delete 50 --pool unique_pool_3'
fail | 929711 | 2015-06-12 00:19:02 | 2015-06-12 03:34:27 | 2015-06-12 04:32:32 | 0:58:05 | 0:08:35 | 0:49:30 | vps | master | ubuntu | 12.04 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/ubuntu_12.04.yaml} | 3 | |
Failure Reason: Command failed on vpm097 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
fail | 929712 | 2015-06-12 00:19:03 | 2015-06-12 03:34:37 | 2015-06-12 04:26:41 | 0:52:04 | 0:05:08 | 0:46:56 | vps | master | ubuntu | 12.04 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-all.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/ubuntu_12.04.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
fail | 929713 | 2015-06-12 00:19:03 | 2015-06-12 03:36:42 | 2015-06-12 04:54:48 | 1:18:06 | 0:15:27 | 1:02:39 | vps | master | rhel | 6.5 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/rhel_6.5.yaml} | 3 | |
Failure Reason: Command failed on vpm177 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 929714 | 2015-06-12 00:19:04 | 2015-06-12 03:37:20 | 2015-06-12 04:07:23 | 0:30:03 | 0:12:04 | 0:17:59 | vps | master | rhel | 7.0 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/rhel_7.0.yaml} | 3 | |
Failure Reason: Command failed on vpm097 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 929715 | 2015-06-12 00:19:04 | 2015-06-12 03:37:34 | 2015-06-12 04:27:38 | 0:50:04 | 0:07:01 | 0:43:03 | vps | master | ubuntu | 14.04 | upgrade:firefly-x/parallel/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/{ec-rados-parallel.yaml rados_api.yaml rados_loadgenbig.yaml test_rbd_api.yaml test_rbd_python.yaml} 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/ubuntu_14.04.yaml} | 3 | |
Failure Reason: Command failed on vpm191 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
fail | 929716 | 2015-06-12 00:19:05 | 2015-06-12 03:38:39 | 2015-06-12 04:10:41 | 0:32:02 | 0:06:04 | 0:25:58 | vps | master | ubuntu | 14.04 | upgrade:firefly-x/parallel-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 3-upgrade-sequence/upgrade-mon-osd-mds.yaml 4-final-workload/{rados-snaps-few-objects.yaml rados_loadgenmix.yaml rados_mon_thrash.yaml rbd_cls.yaml rbd_import_export.yaml rgw_swift.yaml} distros/ubuntu_14.04.yaml} | 3 | |
Failure Reason: need more than 0 values to unpack
fail | 929717 | 2015-06-12 00:19:05 | 2015-06-12 03:42:01 | 2015-06-12 04:38:06 | 0:56:05 | 0:06:42 | 0:49:23 | vps | master | ubuntu | 12.04 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-rados-default.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-no-lrc.yaml distros/ubuntu_12.04.yaml} | 3 | |
Failure Reason: Command failed on vpm079 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'
fail | 929718 | 2015-06-12 00:19:05 | 2015-06-12 03:42:03 | 2015-06-12 04:40:08 | 0:58:05 | 0:07:03 | 0:51:02 | vps | master | ubuntu | 14.04 | upgrade:firefly-x/stress-split-erasure-code/{0-cluster/start.yaml 1-firefly-install/firefly.yaml 2-workload/ec-cache-create.yaml 3-partial-upgrade/firsthalf.yaml 4-thrash/default.yaml 5-mon/mona.yaml 6-workload/ec-cache-agent.yaml 7-next-mon/monb.yaml 8-next-mon/monc.yaml 9-workload/ec-rados-plugin=jerasure-k=3-m=1.yaml distros/ubuntu_14.04.yaml} | 3 | |
Failure Reason: Command failed on vpm199 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --max-ops 4000 --objects 500 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op read 100 --op write 50 --op copy_from 50 --op write_excl 50 --op delete 50 --pool ecbase'