User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Pass | Fail | Dead |
---|---|---|---|---|---|---|---|---|---|---|---|
pdonnell | 2019-08-08 23:23:52 | 2019-08-10 11:33:39 | 2019-08-11 06:52:35 | 19:18:56 | multimds | wip-pdonnell-testing-20190808.205354 | smithi | 8f7a35d | 405 | 566 | 9 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
pass | 4200225 | 2019-08-08 23:24:16 | 2019-08-10 11:33:39 | 2019-08-10 11:59:38 | 0:25:59 | 0:12:44 | 0:13:15 | smithi | master | | | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 |
pass | 4200226 | 2019-08-08 23:24:17 | 2019-08-10 11:34:42 | 2019-08-10 12:20:42 | 0:46:00 | 0:17:13 | 0:28:47 | smithi | master | | | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 |
pass | 4200227 | 2019-08-08 23:24:18 | 2019-08-10 11:34:42 | 2019-08-10 12:52:42 | 1:18:00 | 0:49:23 | 0:28:37 | smithi | master | | | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 |
fail | 4200228 | 2019-08-08 23:24:18 | 2019-08-10 11:37:46 | 2019-08-10 12:03:45 | 0:25:59 | 0:11:43 | 0:14:16 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason: Command failed on smithi069 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200229 | 2019-08-08 23:24:19 | 2019-08-10 11:39:11 | 2019-08-10 11:57:10 | 0:17:59 | 0:10:54 | 0:07:05 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason: Command failed on smithi164 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200230 | 2019-08-08 23:24:20 | 2019-08-10 11:39:47 | 2019-08-10 12:17:46 | 0:37:59 | 0:11:54 | 0:26:05 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason: Command failed on smithi092 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200231 | 2019-08-08 23:24:21 | 2019-08-10 11:39:52 | 2019-08-10 13:27:53 | 1:48:01 | 1:27:50 | 0:20:11 | smithi | master | | | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 |
fail | 4200232 | 2019-08-08 23:24:22 | 2019-08-10 11:41:35 | 2019-08-10 12:01:34 | 0:19:59 | 0:07:08 | 0:12:51 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason: Command failed on smithi135 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200233 | 2019-08-08 23:24:23 | 2019-08-10 11:43:39 | 2019-08-10 12:01:38 | 0:17:59 | 0:10:29 | 0:07:30 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason: Command failed on smithi109 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200234 | 2019-08-08 23:24:23 | 2019-08-10 11:45:36 | 2019-08-10 12:03:35 | 0:17:59 | 0:10:33 | 0:07:26 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi107 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200235 | 2019-08-08 23:24:24 | 2019-08-10 11:47:00 | 2019-08-10 12:51:00 | 1:04:00 | 0:52:26 | 0:11:34 | smithi | master | | | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 |
pass | 4200236 | 2019-08-08 23:24:25 | 2019-08-10 11:47:00 | 2019-08-10 12:27:00 | 0:40:00 | 0:26:31 | 0:13:29 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
fail | 4200237 | 2019-08-08 23:24:26 | 2019-08-10 11:48:29 | 2019-08-10 12:18:28 | 0:29:59 | 0:10:45 | 0:19:14 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi124 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200238 | 2019-08-08 23:24:27 | 2019-08-10 11:50:22 | 2019-08-10 12:14:21 | 0:23:59 | 0:12:00 | 0:11:59 | smithi | master | | | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 |
fail | 4200239 | 2019-08-08 23:24:28 | 2019-08-10 11:50:57 | 2019-08-10 12:12:56 | 0:21:59 | 0:10:38 | 0:11:21 | smithi | master | centos | | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 |
Failure Reason: Command failed on smithi131 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200240 | 2019-08-08 23:24:28 | 2019-08-10 11:51:47 | 2019-08-10 12:47:47 | 0:56:00 | 0:44:20 | 0:11:40 | smithi | master | | | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 |
fail | 4200241 | 2019-08-08 23:24:29 | 2019-08-10 11:52:08 | 2019-08-10 12:42:08 | 0:50:00 | 0:07:03 | 0:42:57 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason: Command failed on smithi131 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200242 | 2019-08-08 23:24:30 | 2019-08-10 11:52:41 | 2019-08-10 12:34:40 | 0:41:59 | 0:11:33 | 0:30:26 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason: Command failed on smithi112 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200243 | 2019-08-08 23:24:31 | 2019-08-10 11:53:31 | 2019-08-10 12:27:31 | 0:34:00 | 0:11:46 | 0:22:14 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi155 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200244 | 2019-08-08 23:24:32 | 2019-08-10 11:53:44 | 2019-08-10 12:33:43 | 0:39:59 | 0:22:39 | 0:17:20 | smithi | master | | | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 |
Failure Reason: "2019-08-10T12:25:29.617695+0000 mon.b (mon.0) 789 : cluster [WRN] Health check failed: 7 daemons have recently crashed (RECENT_CRASH)" in cluster log
fail | 4200245 | 2019-08-08 23:24:33 | 2019-08-10 11:54:20 | 2019-08-10 12:26:19 | 0:31:59 | 0:10:43 | 0:21:16 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason: Command failed on smithi107 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200246 | 2019-08-08 23:24:33 | 2019-08-10 11:57:12 | 2019-08-10 12:17:11 | 0:19:59 | 0:11:00 | 0:08:59 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason: Command failed on smithi097 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200247 | 2019-08-08 23:24:34 | 2019-08-10 11:57:12 | 2019-08-10 12:29:12 | 0:32:00 | 0:11:22 | 0:20:38 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason: Command failed on smithi136 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200248 | 2019-08-08 23:24:35 | 2019-08-10 11:57:36 | 2019-08-10 12:25:35 | 0:27:59 | 0:15:31 | 0:12:28 | smithi | master | | | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 |
fail | 4200249 | 2019-08-08 23:24:36 | 2019-08-10 11:59:54 | 2019-08-10 12:23:53 | 0:23:59 | 0:10:58 | 0:13:01 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason: Command failed on smithi052 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200250 | 2019-08-08 23:24:37 | 2019-08-10 11:59:58 | 2019-08-10 12:27:57 | 0:27:59 | 0:17:01 | 0:10:58 | smithi | master | | | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 |
pass | 4200251 | 2019-08-08 23:24:37 | 2019-08-10 12:01:50 | 2019-08-10 13:35:50 | 1:34:00 | 0:49:50 | 0:44:10 | smithi | master | | | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 |
fail | 4200252 | 2019-08-08 23:24:38 | 2019-08-10 12:01:50 | 2019-08-10 12:35:49 | 0:33:59 | 0:11:14 | 0:22:45 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason: Command failed on smithi109 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200253 | 2019-08-08 23:24:39 | 2019-08-10 12:02:09 | 2019-08-10 13:16:10 | 1:14:01 | 0:39:56 | 0:34:05 | smithi | master | | | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 |
pass | 4200254 | 2019-08-08 23:24:40 | 2019-08-10 12:03:35 | 2019-08-10 12:37:35 | 0:34:00 | 0:17:12 | 0:16:48 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
pass | 4200255 | 2019-08-08 23:24:41 | 2019-08-10 12:03:36 | 2019-08-10 12:41:36 | 0:38:00 | 0:14:03 | 0:23:57 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
fail | 4200256 | 2019-08-08 23:24:41 | 2019-08-10 12:03:46 | 2019-08-10 12:31:46 | 0:28:00 | 0:11:46 | 0:16:14 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason: Command failed on smithi158 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200257 | 2019-08-08 23:24:42 | 2019-08-10 12:06:20 | 2019-08-10 12:32:19 | 0:25:59 | 0:11:51 | 0:14:08 | smithi | master | | | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 |
fail | 4200258 | 2019-08-08 23:24:43 | 2019-08-10 12:08:36 | 2019-08-10 12:28:35 | 0:19:59 | 0:06:48 | 0:13:11 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason: Command failed on smithi205 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200259 | 2019-08-08 23:24:44 | 2019-08-10 12:09:27 | 2019-08-10 12:49:27 | 0:40:00 | 0:12:09 | 0:27:51 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi192 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200260 | 2019-08-08 23:24:45 | 2019-08-10 12:12:48 | 2019-08-10 12:54:48 | 0:42:00 | 0:11:15 | 0:30:45 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason: Command failed on smithi022 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200261 | 2019-08-08 23:24:46 | 2019-08-10 12:12:48 | 2019-08-10 14:28:49 | 2:16:01 | 1:54:55 | 0:21:06 | smithi | master | | | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 |
pass | 4200262 | 2019-08-08 23:24:46 | 2019-08-10 12:12:57 | 2019-08-10 12:36:57 | 0:24:00 | 0:12:22 | 0:11:38 | smithi | master | | | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 |
fail | 4200263 | 2019-08-08 23:24:47 | 2019-08-10 12:14:36 | 2019-08-10 12:34:35 | 0:19:59 | 0:06:50 | 0:13:09 | smithi | master | centos | | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 |
Failure Reason: Command failed on smithi133 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200264 | 2019-08-08 23:24:48 | 2019-08-10 12:17:13 | 2019-08-10 12:51:12 | 0:33:59 | 0:13:04 | 0:20:55 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi136 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200265 | 2019-08-08 23:24:49 | 2019-08-10 12:17:48 | 2019-08-10 12:37:47 | 0:19:59 | 0:11:08 | 0:08:51 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi005 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200266 | 2019-08-08 23:24:50 | 2019-08-10 12:18:42 | 2019-08-10 12:54:41 | 0:35:59 | 0:21:19 | 0:14:40 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
pass | 4200267 | 2019-08-08 23:24:51 | 2019-08-10 12:18:42 | 2019-08-10 13:14:42 | 0:56:00 | 0:13:44 | 0:42:16 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
pass | 4200268 | 2019-08-08 23:24:52 | 2019-08-10 12:20:57 | 2019-08-10 13:28:57 | 1:08:00 | 0:50:12 | 0:17:48 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
fail | 4200269 | 2019-08-08 23:24:52 | 2019-08-10 12:24:07 | 2019-08-10 12:52:07 | 0:28:00 | 0:12:28 | 0:15:32 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi066 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200270 | 2019-08-08 23:24:53 | 2019-08-10 12:24:50 | 2019-08-10 14:12:51 | 1:48:01 | 0:16:01 | 1:32:00 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4200271 | 2019-08-08 23:24:54 | 2019-08-10 12:25:50 | 2019-08-10 13:09:50 | 0:44:00 | 0:10:58 | 0:33:02 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi058 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200272 | 2019-08-08 23:24:55 | 2019-08-10 12:26:21 | 2019-08-10 13:10:20 | 0:43:59 | 0:27:04 | 0:16:55 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
fail | 4200273 | 2019-08-08 23:24:56 | 2019-08-10 12:27:15 | 2019-08-10 12:45:14 | 0:17:59 | 0:11:13 | 0:06:46 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi159 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200274 | 2019-08-08 23:24:57 | 2019-08-10 12:27:32 | 2019-08-10 12:51:31 | 0:23:59 | 0:13:34 | 0:10:25 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4200275 | 2019-08-08 23:24:57 | 2019-08-10 12:27:59 | 2019-08-10 13:01:58 | 0:33:59 | 0:16:11 | 0:17:48 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200276 | 2019-08-08 23:24:58 | 2019-08-10 12:28:50 | 2019-08-10 13:40:50 | 1:12:00 | 0:49:25 | 0:22:35 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200277 | 2019-08-08 23:24:59 | 2019-08-10 12:29:13 | 2019-08-10 13:11:13 | 0:42:00 | 0:22:03 | 0:19:57 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
"2019-08-10T13:03:31.599356+0000 mon.a (mon.0) 820 : cluster [WRN] Health check failed: 3 daemons have recently crashed (RECENT_CRASH)" in cluster log
fail | 4200278 | 2019-08-08 23:25:00 | 2019-08-10 12:32:02 | 2019-08-10 12:52:01 | 0:19:59 | 0:12:18 | 0:07:41 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi074 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200279 | 2019-08-08 23:25:01 | 2019-08-10 12:32:20 | 2019-08-10 13:08:20 | 0:36:00 | 0:13:41 | 0:22:19 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
pass | 4200280 | 2019-08-08 23:25:01 | 2019-08-10 12:33:58 | 2019-08-10 14:13:58 | 1:40:00 | 0:49:28 | 0:50:32 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
pass | 4200281 | 2019-08-08 23:25:02 | 2019-08-10 12:34:10 | 2019-08-10 13:04:09 | 0:29:59 | 0:13:50 | 0:16:09 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
fail | 4200282 | 2019-08-08 23:25:03 | 2019-08-10 12:34:37 | 2019-08-10 13:22:36 | 0:47:59 | 0:10:25 | 0:37:34 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi184 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200283 | 2019-08-08 23:25:04 | 2019-08-10 12:34:42 | 2019-08-10 12:54:41 | 0:19:59 | 0:11:45 | 0:08:14 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi161 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200284 | 2019-08-08 23:25:05 | 2019-08-10 12:36:04 | 2019-08-10 14:34:05 | 1:58:01 | 0:50:07 | 1:07:54 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
fail | 4200285 | 2019-08-08 23:25:06 | 2019-08-10 12:36:07 | 2019-08-10 13:28:06 | 0:51:59 | 0:06:55 | 0:45:04 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi134 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200286 | 2019-08-08 23:25:07 | 2019-08-10 12:36:58 | 2019-08-10 13:34:58 | 0:58:00 | 0:11:32 | 0:46:28 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi087 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200287 | 2019-08-08 23:25:07 | 2019-08-10 12:37:37 | 2019-08-10 13:15:36 | 0:37:59 | 0:12:54 | 0:25:05 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200288 | 2019-08-08 23:25:08 | 2019-08-10 12:37:48 | 2019-08-10 15:01:50 | 2:24:02 | 0:08:51 | 2:15:11 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi205 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200289 | 2019-08-08 23:25:09 | 2019-08-10 12:41:43 | 2019-08-10 13:17:43 | 0:36:00 | 0:24:47 | 0:11:13 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
fail | 4200290 | 2019-08-08 23:25:10 | 2019-08-10 12:41:44 | 2019-08-10 13:11:43 | 0:29:59 | 0:11:50 | 0:18:09 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi134 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200291 | 2019-08-08 23:25:11 | 2019-08-10 12:42:09 | 2019-08-10 13:26:08 | 0:43:59 | 0:10:10 | 0:33:49 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi112 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200292 | 2019-08-08 23:25:12 | 2019-08-10 12:44:04 | 2019-08-10 13:06:04 | 0:22:00 | 0:06:49 | 0:15:11 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi057 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200293 | 2019-08-08 23:25:12 | 2019-08-10 12:45:29 | 2019-08-10 13:41:29 | 0:56:00 | 0:24:05 | 0:31:55 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-10T13:30:58.087371+0000 mon.b (mon.0) 951 : cluster [WRN] Health check failed: 4 daemons have recently crashed (RECENT_CRASH)" in cluster log
fail | 4200294 | 2019-08-08 23:25:13 | 2019-08-10 12:48:00 | 2019-08-10 13:58:00 | 1:10:00 | 0:06:53 | 1:03:07 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi037 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200295 | 2019-08-08 23:25:14 | 2019-08-10 12:48:00 | 2019-08-10 14:20:00 | 1:32:00 | 0:11:08 | 1:20:52 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi132 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200296 | 2019-08-08 23:25:15 | 2019-08-10 12:49:42 | 2019-08-10 13:07:41 | 0:17:59 | 0:10:38 | 0:07:21 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi143 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200297 | 2019-08-08 23:25:16 | 2019-08-10 12:51:15 | 2019-08-10 13:19:15 | 0:28:00 | 0:15:14 | 0:12:46 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
pass | 4200298 | 2019-08-08 23:25:17 | 2019-08-10 12:51:16 | 2019-08-10 14:07:16 | 1:16:00 | 0:31:12 | 0:44:48 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
pass | 4200299 | 2019-08-08 23:25:18 | 2019-08-10 12:51:33 | 2019-08-10 13:19:32 | 0:27:59 | 0:18:01 | 0:09:58 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200300 | 2019-08-08 23:25:19 | 2019-08-10 12:52:03 | 2019-08-10 13:56:03 | 1:04:00 | 0:51:58 | 0:12:02 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200301 | 2019-08-08 23:25:20 | 2019-08-10 12:52:08 | 2019-08-10 13:10:07 | 0:17:59 | 0:10:54 | 0:07:05 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi121 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200302 | 2019-08-08 23:25:20 | 2019-08-10 12:52:58 | 2019-08-10 14:06:58 | 1:14:00 | 0:46:28 | 0:27:32 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
fail | 4200303 | 2019-08-08 23:25:21 | 2019-08-10 12:54:55 | 2019-08-10 13:24:55 | 0:30:00 | 0:07:00 | 0:23:00 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi145 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200304 | 2019-08-08 23:25:22 | 2019-08-10 12:54:55 | 2019-08-10 13:28:55 | 0:34:00 | 0:07:06 | 0:26:54 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason: Command failed on smithi041 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
dead | 4200305 | 2019-08-08 23:25:23 | 2019-08-10 12:54:55 | 2019-08-10 14:06:55 | 1:12:00 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
Failure Reason: SSH connection to smithi078 was lost: 'rpm -q kernel --last | head -n 1'
pass | 4200306 | 2019-08-08 23:25:24 | 2019-08-10 12:55:07 | 2019-08-10 13:25:07 | 0:30:00 | 0:11:05 | 0:18:55 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200307 | 2019-08-08 23:25:25 | 2019-08-10 12:56:53 | 2019-08-10 13:22:52 | 0:25:59 | 0:11:21 | 0:14:38 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason: Command failed on smithi109 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200308 | 2019-08-08 23:25:25 | 2019-08-10 12:58:37 | 2019-08-10 13:20:37 | 0:22:00 | 0:10:29 | 0:11:31 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi200 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200309 | 2019-08-08 23:25:26 | 2019-08-10 13:02:13 | 2019-08-10 13:38:13 | 0:36:00 | 0:12:18 | 0:23:42 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason: Command failed on smithi161 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200310 | 2019-08-08 23:25:27 | 2019-08-10 13:04:25 | 2019-08-10 15:24:26 | 2:20:01 | 2:00:11 | 0:19:50 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4200311 | 2019-08-08 23:25:28 | 2019-08-10 13:06:05 | 2019-08-10 17:16:08 | 4:10:03 | 0:12:39 | 3:57:24 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200312 | 2019-08-08 23:25:29 | 2019-08-10 13:07:57 | 2019-08-10 13:51:56 | 0:43:59 | 0:06:49 | 0:37:10 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason: Command failed on smithi180 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200313 | 2019-08-08 23:25:30 | 2019-08-10 13:08:21 | 2019-08-10 13:34:21 | 0:26:00 | 0:12:19 | 0:13:41 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason: Command failed on smithi068 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200314 | 2019-08-08 23:25:31 | 2019-08-10 13:10:05 | 2019-08-10 13:38:04 | 0:27:59 | 0:12:19 | 0:15:40 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason: Command failed on smithi158 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
dead | 4200315 | 2019-08-08 23:25:32 | 2019-08-10 13:10:09 | 2019-08-10 14:02:08 | 0:51:59 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | — | |||||
Failure Reason: 'ssh_keyscan smithi153.front.sepia.ceph.com' reached maximum tries (5) after waiting for 5 seconds
pass | 4200316 | 2019-08-08 23:25:33 | 2019-08-10 13:10:22 | 2019-08-10 14:02:21 | 0:51:59 | 0:14:50 | 0:37:09 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
pass | 4200317 | 2019-08-08 23:25:34 | 2019-08-10 13:11:28 | 2019-08-10 15:11:29 | 2:00:01 | 0:56:48 | 1:03:13 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
fail | 4200318 | 2019-08-08 23:25:35 | 2019-08-10 13:11:44 | 2019-08-10 13:31:43 | 0:19:59 | 0:11:07 | 0:08:52 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason: Command failed on smithi044 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200319 | 2019-08-08 23:25:36 | 2019-08-10 13:12:49 | 2019-08-10 14:10:49 | 0:58:00 | 0:17:39 | 0:40:21 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4200320 | 2019-08-08 23:25:37 | 2019-08-10 13:14:57 | 2019-08-10 13:42:56 | 0:27:59 | 0:11:24 | 0:16:35 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason: Command failed on smithi058 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200321 | 2019-08-08 23:25:37 | 2019-08-10 13:14:57 | 2019-08-10 13:42:56 | 0:27:59 | 0:06:50 | 0:21:09 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason: Command failed on smithi110 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200322 | 2019-08-08 23:25:38 | 2019-08-10 13:15:37 | 2019-08-10 13:53:37 | 0:38:00 | 0:10:09 | 0:27:51 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason: Command failed on smithi073 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200323 | 2019-08-08 23:25:39 | 2019-08-10 13:16:24 | 2019-08-10 13:42:23 | 0:25:59 | 0:12:47 | 0:13:12 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4200324 | 2019-08-08 23:25:40 | 2019-08-10 13:17:46 | 2019-08-10 13:51:45 | 0:33:59 | 0:15:53 | 0:18:06 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200325 | 2019-08-08 23:25:41 | 2019-08-10 13:17:46 | 2019-08-10 14:49:46 | 1:32:00 | 0:52:12 | 0:39:48 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200326 | 2019-08-08 23:25:42 | 2019-08-10 13:19:29 | 2019-08-10 13:59:29 | 0:40:00 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason: Command failed on smithi078 with status 1: 'sudo package-cleanup -y --oldkernels'
fail | 4200327 | 2019-08-08 23:25:43 | 2019-08-10 13:19:33 | 2019-08-10 13:41:32 | 0:21:59 | 0:11:52 | 0:10:07 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason: Command failed on smithi124 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200328 | 2019-08-08 23:25:44 | 2019-08-10 13:20:55 | 2019-08-10 13:38:54 | 0:17:59 | 0:06:49 | 0:11:10 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason: Command failed on smithi200 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200329 | 2019-08-08 23:25:44 | 2019-08-10 13:22:48 | 2019-08-10 15:10:49 | 1:48:01 | 1:23:30 | 0:24:31 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
fail | 4200330 | 2019-08-08 23:25:45 | 2019-08-10 13:22:54 | 2019-08-10 13:48:53 | 0:25:59 | 0:06:58 | 0:19:01 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason: Command failed on smithi044 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200331 | 2019-08-08 23:25:46 | 2019-08-10 13:25:12 | 2019-08-10 13:45:11 | 0:19:59 | 0:11:30 | 0:08:29 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason: Command failed on smithi063 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200332 | 2019-08-08 23:25:47 | 2019-08-10 13:25:12 | 2019-08-10 13:47:11 | 0:21:59 | 0:11:18 | 0:10:41 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi183 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
dead | 4200333 | 2019-08-08 23:25:48 | 2019-08-10 13:26:10 | 2019-08-10 14:02:10 | 0:36:00 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||||
Failure Reason: SSH connection to smithi078 was lost: 'uname -r'
fail | 4200334 | 2019-08-08 23:25:49 | 2019-08-10 13:28:07 | 2019-08-10 15:12:08 | 1:44:01 | 0:07:09 | 1:36:52 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason: Command failed on smithi094 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200335 | 2019-08-08 23:25:50 | 2019-08-10 13:28:08 | 2019-08-10 14:02:08 | 0:34:00 | 0:10:16 | 0:23:44 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi183 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200336 | 2019-08-08 23:25:51 | 2019-08-10 13:28:56 | 2019-08-10 14:00:56 | 0:32:00 | 0:12:47 | 0:19:13 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200337 | 2019-08-08 23:25:51 | 2019-08-10 13:28:58 | 2019-08-10 14:12:58 | 0:44:00 | 0:07:01 | 0:36:59 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason: Command failed on smithi089 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200338 | 2019-08-08 23:25:52 | 2019-08-10 13:31:58 | 2019-08-10 14:17:58 | 0:46:00 | 0:26:09 | 0:19:51 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
fail | 4200339 | 2019-08-08 23:25:53 | 2019-08-10 13:34:37 | 2019-08-10 14:02:36 | 0:27:59 | 0:11:08 | 0:16:51 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason: Command failed on smithi159 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200340 | 2019-08-08 23:25:54 | 2019-08-10 13:34:59 | 2019-08-10 14:06:59 | 0:32:00 | 0:10:20 | 0:21:40 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason: Command failed on smithi177 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200341 | 2019-08-08 23:25:55 | 2019-08-10 13:36:00 | 2019-08-10 14:24:00 | 0:48:00 | 0:14:30 | 0:33:30 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 4200342 | 2019-08-08 23:25:56 | 2019-08-10 13:38:21 | 2019-08-10 14:40:21 | 1:02:00 | 0:22:48 | 0:39:12 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason: "2019-08-10T14:30:14.562330+0000 mon.b (mon.0) 750 : cluster [WRN] Health check failed: 1 daemons have recently crashed (RECENT_CRASH)" in cluster log
fail | 4200343 | 2019-08-08 23:25:57 | 2019-08-10 13:38:21 | 2019-08-10 14:28:20 | 0:49:59 | 0:11:17 | 0:38:42 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi112 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
dead | 4200344 | 2019-08-08 23:25:58 | 2019-08-10 13:38:55 | 2019-08-10 13:54:54 | 0:15:59 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | — | |||
Failure Reason:
reached maximum tries (60) after waiting for 900 seconds |
fail | 4200345 | 2019-08-08 23:25:59 | 2019-08-10 13:41:04 | 2019-08-10 14:11:04 | 0:30:00 | 0:10:14 | 0:19:46 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi166 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200346 | 2019-08-08 23:25:59 | 2019-08-10 13:41:12 | 2019-08-10 14:19:11 | 0:37:59 | 0:16:52 | 0:21:07 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
fail | 4200347 | 2019-08-08 23:26:00 | 2019-08-10 13:41:29 | 2019-08-10 14:47:29 | 1:06:00 | 0:15:08 | 0:50:52 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi019 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200348 | 2019-08-08 23:26:01 | 2019-08-10 13:41:30 | 2019-08-10 14:25:30 | 0:44:00 | 0:23:38 | 0:20:22 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200349 | 2019-08-08 23:26:02 | 2019-08-10 13:41:34 | 2019-08-10 14:07:33 | 0:25:59 | 0:15:51 | 0:10:08 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200350 | 2019-08-08 23:26:03 | 2019-08-10 13:42:39 | 2019-08-10 15:24:39 | 1:42:00 | 0:10:57 | 1:31:03 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi089 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200351 | 2019-08-08 23:26:04 | 2019-08-10 13:42:57 | 2019-08-10 14:26:57 | 0:44:00 | 0:33:06 | 0:10:54 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
pass | 4200352 | 2019-08-08 23:26:05 | 2019-08-10 13:42:57 | 2019-08-10 14:08:57 | 0:26:00 | 0:16:32 | 0:09:28 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 4200353 | 2019-08-08 23:26:06 | 2019-08-10 13:44:59 | 2019-08-10 14:08:58 | 0:23:59 | 0:07:04 | 0:16:55 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi073 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200354 | 2019-08-08 23:26:06 | 2019-08-10 13:45:13 | 2019-08-10 14:19:12 | 0:33:59 | 0:11:10 | 0:22:49 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi183 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200355 | 2019-08-08 23:26:07 | 2019-08-10 13:47:27 | 2019-08-10 14:33:26 | 0:45:59 | 0:11:53 | 0:34:06 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200356 | 2019-08-08 23:26:08 | 2019-08-10 13:49:09 | 2019-08-10 14:13:08 | 0:23:59 | 0:06:53 | 0:17:06 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi141 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200357 | 2019-08-08 23:26:09 | 2019-08-10 13:51:31 | 2019-08-10 14:17:30 | 0:25:59 | 0:11:00 | 0:14:59 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi163 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200358 | 2019-08-08 23:26:10 | 2019-08-10 13:51:46 | 2019-08-10 14:27:46 | 0:36:00 | 0:10:33 | 0:25:27 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi171 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200359 | 2019-08-08 23:26:10 | 2019-08-10 13:51:58 | 2019-08-10 16:13:59 | 2:22:01 | 1:53:08 | 0:28:53 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4200360 | 2019-08-08 23:26:11 | 2019-08-10 13:53:30 | 2019-08-10 16:49:32 | 2:56:02 | 0:11:36 | 2:44:26 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200361 | 2019-08-08 23:26:12 | 2019-08-10 13:53:38 | 2019-08-10 14:25:37 | 0:31:59 | 0:06:52 | 0:25:07 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi073 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200362 | 2019-08-08 23:26:13 | 2019-08-10 13:55:09 | 2019-08-10 14:35:08 | 0:39:59 | 0:06:50 | 0:33:09 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi087 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200363 | 2019-08-08 23:26:14 | 2019-08-10 13:55:39 | 2019-08-10 14:07:38 | 0:11:59 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
Failure Reason:
Command failed on smithi078 with status 1: 'sudo package-cleanup -y --oldkernels' |
pass | 4200364 | 2019-08-08 23:26:15 | 2019-08-10 13:56:04 | 2019-08-10 14:42:04 | 0:46:00 | 0:21:35 | 0:24:25 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
fail | 4200365 | 2019-08-08 23:26:16 | 2019-08-10 13:56:47 | 2019-08-10 14:46:47 | 0:50:00 | 0:06:53 | 0:43:07 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi146 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200366 | 2019-08-08 23:26:16 | 2019-08-10 13:58:13 | 2019-08-10 14:26:13 | 0:28:00 | 0:07:01 | 0:20:59 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi005 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200367 | 2019-08-08 23:26:17 | 2019-08-10 13:59:43 | 2019-08-10 14:23:42 | 0:23:59 | 0:11:08 | 0:12:51 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi153 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200368 | 2019-08-08 23:26:18 | 2019-08-10 14:01:10 | 2019-08-10 16:19:12 | 2:18:02 | 0:17:26 | 2:00:36 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4200369 | 2019-08-08 23:26:19 | 2019-08-10 14:02:10 | 2019-08-10 14:36:09 | 0:33:59 | 0:10:33 | 0:23:26 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi107 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200370 | 2019-08-08 23:26:20 | 2019-08-10 14:02:10 | 2019-08-10 15:08:10 | 1:06:00 | 0:12:43 | 0:53:17 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi120 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200371 | 2019-08-08 23:26:20 | 2019-08-10 14:02:25 | 2019-08-10 14:52:25 | 0:50:00 | 0:14:45 | 0:35:15 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi166 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200372 | 2019-08-08 23:26:21 | 2019-08-10 14:02:25 | 2019-08-10 15:06:25 | 1:04:00 | 0:12:43 | 0:51:17 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4200373 | 2019-08-08 23:26:22 | 2019-08-10 14:02:37 | 2019-08-10 15:02:37 | 1:00:00 | 0:23:45 | 0:36:15 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200374 | 2019-08-08 23:26:23 | 2019-08-10 14:07:11 | 2019-08-10 14:53:10 | 0:45:59 | 0:15:21 | 0:30:38 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200375 | 2019-08-08 23:26:24 | 2019-08-10 14:07:11 | 2019-08-10 14:57:10 | 0:49:59 | 0:12:52 | 0:37:07 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi176 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200376 | 2019-08-08 23:26:25 | 2019-08-10 14:07:11 | 2019-08-10 15:31:11 | 1:24:00 | 0:10:38 | 1:13:22 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi152 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200377 | 2019-08-08 23:26:25 | 2019-08-10 14:07:17 | 2019-08-10 14:41:17 | 0:34:00 | 0:13:02 | 0:20:58 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi086 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200378 | 2019-08-08 23:26:26 | 2019-08-10 14:07:35 | 2019-08-10 15:13:35 | 1:06:00 | 0:51:25 | 0:14:35 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
fail | 4200379 | 2019-08-08 23:26:27 | 2019-08-10 14:07:39 | 2019-08-10 14:31:39 | 0:24:00 | 0:07:07 | 0:16:53 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi102 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200380 | 2019-08-08 23:26:28 | 2019-08-10 14:09:12 | 2019-08-10 14:43:11 | 0:33:59 | 0:12:44 | 0:21:15 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi159 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200381 | 2019-08-08 23:26:29 | 2019-08-10 14:09:12 | 2019-08-10 14:55:11 | 0:45:59 | 0:13:51 | 0:32:08 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi068 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200382 | 2019-08-08 23:26:30 | 2019-08-10 14:10:44 | 2019-08-10 15:12:44 | 1:02:00 | 0:50:23 | 0:11:37 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
fail | 4200383 | 2019-08-08 23:26:31 | 2019-08-10 14:10:51 | 2019-08-10 15:50:51 | 1:40:00 | 1:24:05 | 0:15:55 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
"2019-08-10T14:41:05.092930+0000 mon.b (mon.0) 158 : cluster [WRN] Health check failed: Reduced data availability: 2 pgs peering (PG_AVAILABILITY)" in cluster log |
fail | 4200384 | 2019-08-08 23:26:31 | 2019-08-10 14:11:05 | 2019-08-10 14:45:05 | 0:34:00 | 0:13:38 | 0:20:22 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi180 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200385 | 2019-08-08 23:26:32 | 2019-08-10 14:13:06 | 2019-08-10 15:17:06 | 1:04:00 | 0:11:42 | 0:52:18 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200386 | 2019-08-08 23:26:33 | 2019-08-10 14:13:06 | 2019-08-10 15:09:06 | 0:56:00 | 0:06:54 | 0:49:06 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi204 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200387 | 2019-08-08 23:26:34 | 2019-08-10 14:13:09 | 2019-08-10 15:39:10 | 1:26:01 | 0:33:12 | 0:52:49 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
fail | 4200388 | 2019-08-08 23:26:35 | 2019-08-10 14:14:00 | 2019-08-10 14:33:59 | 0:19:59 | 0:06:51 | 0:13:08 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi183 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200389 | 2019-08-08 23:26:36 | 2019-08-10 14:15:27 | 2019-08-10 15:11:27 | 0:56:00 | 0:10:47 | 0:45:13 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi107 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200390 | 2019-08-08 23:26:37 | 2019-08-10 14:17:45 | 2019-08-10 14:39:44 | 0:21:59 | 0:06:04 | 0:15:55 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 2 | |
Failure Reason:
{'smithi005.front.sepia.ceph.com': {'msg': 'Failing rest of playbook due to missing NVMe card', 'changed': False, '_ansible_no_log': False}} |
fail | 4200391 | 2019-08-08 23:26:38 | 2019-08-10 14:17:59 | 2019-08-10 15:56:00 | 1:38:01 | 0:28:15 | 1:09:46 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-10T15:41:48.433766+0000 mon.a (mon.0) 978 : cluster [WRN] Health check failed: 4 daemons have recently crashed (RECENT_CRASH)" in cluster log |
fail | 4200392 | 2019-08-08 23:26:39 | 2019-08-10 14:19:27 | 2019-08-10 15:15:26 | 0:55:59 | 0:11:48 | 0:44:11 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi183 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200393 | 2019-08-08 23:26:39 | 2019-08-10 14:19:27 | 2019-08-10 14:51:26 | 0:31:59 | 0:14:36 | 0:17:23 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi102 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200394 | 2019-08-08 23:26:40 | 2019-08-10 14:20:02 | 2019-08-10 14:48:01 | 0:27:59 | 0:14:16 | 0:13:43 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi106 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200395 | 2019-08-08 23:26:41 | 2019-08-10 14:23:57 | 2019-08-10 14:53:56 | 0:29:59 | 0:15:28 | 0:14:31 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
fail | 4200396 | 2019-08-08 23:26:42 | 2019-08-10 14:24:01 | 2019-08-10 14:42:00 | 0:17:59 | 0:06:52 | 0:11:07 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi192 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200397 | 2019-08-08 23:26:43 | 2019-08-10 14:25:44 | 2019-08-10 15:13:44 | 0:48:00 | 0:21:48 | 0:26:12 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200398 | 2019-08-08 23:26:43 | 2019-08-10 14:25:44 | 2019-08-10 15:07:44 | 0:42:00 | 0:16:05 | 0:25:55 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200399 | 2019-08-08 23:26:44 | 2019-08-10 14:26:14 | 2019-08-10 14:56:14 | 0:30:00 | 0:12:50 | 0:17:10 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi195 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200400 | 2019-08-08 23:26:45 | 2019-08-10 14:27:12 | 2019-08-10 15:27:12 | 1:00:00 | 0:36:36 | 0:23:24 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
fail | 4200401 | 2019-08-08 23:26:46 | 2019-08-10 14:27:48 | 2019-08-10 15:05:47 | 0:37:59 | 0:12:44 | 0:25:15 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi125 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200402 | 2019-08-08 23:26:47 | 2019-08-10 14:28:02 | 2019-08-10 15:04:02 | 0:36:00 | 0:12:18 | 0:23:42 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi099 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200403 | 2019-08-08 23:26:47 | 2019-08-10 14:28:36 | 2019-08-10 14:54:36 | 0:26:00 | 0:13:31 | 0:12:29 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi047 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200404 | 2019-08-08 23:26:48 | 2019-08-10 14:28:51 | 2019-08-10 15:16:50 | 0:47:59 | 0:12:02 | 0:35:57 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200405 | 2019-08-08 23:26:49 | 2019-08-10 14:29:32 | 2019-08-10 15:15:32 | 0:46:00 | 0:11:37 | 0:34:23 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi017 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200406 | 2019-08-08 23:26:50 | 2019-08-10 14:31:54 | 2019-08-10 15:17:54 | 0:46:00 | 0:10:36 | 0:35:24 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi143 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200407 | 2019-08-08 23:26:51 | 2019-08-10 14:33:42 | 2019-08-10 14:57:41 | 0:23:59 | 0:11:43 | 0:12:16 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi153 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200408 | 2019-08-08 23:26:52 | 2019-08-10 14:34:01 | 2019-08-10 17:16:02 | 2:42:01 | 2:17:40 | 0:24:21 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4200409 | 2019-08-08 23:26:52 | 2019-08-10 14:34:07 | 2019-08-10 15:44:07 | 1:10:00 | 0:11:40 | 0:58:20 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200410 | 2019-08-08 23:26:53 | 2019-08-10 14:35:23 | 2019-08-10 15:21:23 | 0:46:00 | 0:06:53 | 0:39:07 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi099 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200411 | 2019-08-08 23:26:54 | 2019-08-10 14:36:11 | 2019-08-10 15:26:10 | 0:49:59 | 0:32:38 | 0:17:21 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
fail | 4200412 | 2019-08-08 23:26:55 | 2019-08-10 14:39:46 | 2019-08-10 15:05:45 | 0:25:59 | 0:11:48 | 0:14:11 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi115 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200413 | 2019-08-08 23:26:56 | 2019-08-10 14:40:23 | 2019-08-10 15:36:23 | 0:56:00 | 0:21:07 | 0:34:53 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
fail | 4200414 | 2019-08-08 23:26:57 | 2019-08-10 14:41:32 | 2019-08-10 15:35:32 | 0:54:00 | 0:11:27 | 0:42:33 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi103 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200415 | 2019-08-08 23:26:57 | 2019-08-10 14:42:02 | 2019-08-10 15:56:02 | 1:14:00 | 0:53:49 | 0:20:11 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
fail | 4200416 | 2019-08-08 23:26:58 | 2019-08-10 14:42:05 | 2019-08-10 15:08:05 | 0:26:00 | 0:11:42 | 0:14:18 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi121 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200417 | 2019-08-08 23:26:59 | 2019-08-10 14:43:27 | 2019-08-10 15:17:26 | 0:33:59 | 0:16:58 | 0:17:01 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4200418 | 2019-08-08 23:27:00 | 2019-08-10 14:43:27 | 2019-08-10 15:09:26 | 0:25:59 | 0:11:47 | 0:14:12 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi166 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200419 | 2019-08-08 23:27:01 | 2019-08-10 14:45:20 | 2019-08-10 15:59:20 | 1:14:00 | 0:11:18 | 1:02:42 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi025 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200420 | 2019-08-08 23:27:01 | 2019-08-10 14:46:48 | 2019-08-10 15:30:48 | 0:44:00 | 0:10:47 | 0:33:13 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi137 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200421 | 2019-08-08 23:27:02 | 2019-08-10 14:47:30 | 2019-08-10 15:19:29 | 0:31:59 | 0:12:32 | 0:19:27 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4200422 | 2019-08-08 23:27:03 | 2019-08-10 14:48:15 | 2019-08-10 17:22:17 | 2:34:02 | 0:21:47 | 2:12:15 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200423 | 2019-08-08 23:27:04 | 2019-08-10 14:50:01 | 2019-08-10 15:22:01 | 0:32:00 | 0:17:21 | 0:14:39 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200424 | 2019-08-08 23:27:05 | 2019-08-10 14:51:42 | 2019-08-10 15:25:41 | 0:33:59 | 0:11:55 | 0:22:04 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi104 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200425 | 2019-08-08 23:27:06 | 2019-08-10 14:52:26 | 2019-08-10 15:26:26 | 0:34:00 | 0:10:57 | 0:23:03 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi204 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200426 | 2019-08-08 23:27:07 | 2019-08-10 14:53:26 | 2019-08-10 15:37:26 | 0:44:00 | 0:12:14 | 0:31:46 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
pass | 4200427 | 2019-08-08 23:27:08 | 2019-08-10 14:53:50 | 2019-08-10 16:49:51 | 1:56:01 | 1:31:20 | 0:24:41 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
fail | 4200428 | 2019-08-08 23:27:08 | 2019-08-10 14:53:58 | 2019-08-10 15:29:57 | 0:35:59 | 0:06:50 | 0:29:09 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi192 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200429 | 2019-08-08 23:27:09 | 2019-08-10 14:54:51 | 2019-08-10 15:40:51 | 0:46:00 | 0:10:34 | 0:35:26 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi120 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200430 | 2019-08-08 23:27:10 | 2019-08-10 14:55:13 | 2019-08-10 15:27:17 | 0:32:04 | 0:10:57 | 0:21:07 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi135 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200431 | 2019-08-08 23:27:11 | 2019-08-10 14:55:18 | 2019-08-10 15:59:23 | 1:04:05 | 0:53:04 | 0:11:01 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
fail | 4200432 | 2019-08-08 23:27:12 | 2019-08-10 14:55:43 | 2019-08-10 15:23:42 | 0:27:59 | 0:06:53 | 0:21:06 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi180 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200433 | 2019-08-08 23:27:12 | 2019-08-10 14:56:29 | 2019-08-10 15:40:28 | 0:43:59 | 0:10:44 | 0:33:15 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi104 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200434 | 2019-08-08 23:27:13 | 2019-08-10 14:57:12 | 2019-08-10 15:53:12 | 0:56:00 | 0:11:41 | 0:44:19 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200435 | 2019-08-08 23:27:14 | 2019-08-10 14:57:55 | 2019-08-10 15:37:55 | 0:40:00 | 0:06:52 | 0:33:08 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi176 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200436 | 2019-08-08 23:27:15 | 2019-08-10 15:00:19 | 2019-08-10 15:52:19 | 0:52:00 | 0:35:39 | 0:16:21 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
fail | 4200437 | 2019-08-08 23:27:16 | 2019-08-10 15:02:05 | 2019-08-10 16:22:11 | 1:20:06 | 0:16:33 | 1:03:33 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi183 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200438 | 2019-08-08 23:27:16 | 2019-08-10 15:02:39 | 2019-08-10 15:20:38 | 0:17:59 | 0:10:43 | 0:07:16 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi009 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200439 | 2019-08-08 23:27:17 | 2019-08-10 15:04:04 | 2019-08-10 15:30:03 | 0:25:59 | 0:07:07 | 0:18:52 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi094 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200440 | 2019-08-08 23:27:18 | 2019-08-10 15:06:07 | 2019-08-10 15:42:06 | 0:35:59 | 0:22:24 | 0:13:35 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-10T15:32:15.040594+0000 mon.a (mon.0) 1046 : cluster [WRN] Health check failed: 4 daemons have recently crashed (RECENT_CRASH)" in cluster log |
pass | 4200441 | 2019-08-08 23:27:19 | 2019-08-10 15:06:07 | 2019-08-10 17:34:08 | 2:28:01 | 0:40:24 | 1:47:37 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
fail | 4200442 | 2019-08-08 23:27:20 | 2019-08-10 15:06:27 | 2019-08-10 16:16:27 | 1:10:00 | 0:14:33 | 0:55:27 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi173 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200443 | 2019-08-08 23:27:20 | 2019-08-10 15:07:49 | 2019-08-10 17:01:50 | 1:54:01 | 0:12:22 | 1:41:39 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi077 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200444 | 2019-08-08 23:27:21 | 2019-08-10 15:07:50 | 2019-08-10 15:37:49 | 0:29:59 | 0:15:53 | 0:14:06 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
fail | 4200445 | 2019-08-08 23:27:22 | 2019-08-10 15:08:06 | 2019-08-10 15:34:05 | 0:25:59 | 0:07:01 | 0:18:58 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi074 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200446 | 2019-08-08 23:27:23 | 2019-08-10 15:08:11 | 2019-08-10 15:44:11 | 0:36:00 | 0:22:31 | 0:13:29 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200447 | 2019-08-08 23:27:24 | 2019-08-10 15:09:22 | 2019-08-10 15:51:22 | 0:42:00 | 0:16:33 | 0:25:27 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200448 | 2019-08-08 23:27:25 | 2019-08-10 15:09:28 | 2019-08-10 16:09:28 | 1:00:00 | 0:12:04 | 0:47:56 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi109 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200449 | 2019-08-08 23:27:25 | 2019-08-10 15:10:52 | 2019-08-10 16:18:52 | 1:08:00 | 0:38:38 | 0:29:22 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
pass | 4200450 | 2019-08-08 23:27:26 | 2019-08-10 15:11:28 | 2019-08-10 16:45:29 | 1:34:01 | 0:18:25 | 1:15:36 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
pass | 4200451 | 2019-08-08 23:27:27 | 2019-08-10 15:11:30 | 2019-08-10 15:35:30 | 0:24:00 | 0:12:30 | 0:11:30 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
fail | 4200452 | 2019-08-08 23:27:28 | 2019-08-10 15:12:23 | 2019-08-10 17:22:24 | 2:10:01 | 0:13:19 | 1:56:42 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi068 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200453 | 2019-08-08 23:27:29 | 2019-08-10 15:12:46 | 2019-08-10 15:40:45 | 0:27:59 | 0:11:35 | 0:16:24 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
pass | 4200454 | 2019-08-08 23:27:30 | 2019-08-10 15:13:50 | 2019-08-10 15:35:49 | 0:21:59 | 0:13:16 | 0:08:43 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
fail | 4200455 | 2019-08-08 23:27:30 | 2019-08-10 15:13:50 | 2019-08-10 15:57:49 | 0:43:59 | 0:14:15 | 0:29:44 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi125 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200456 | 2019-08-08 23:27:31 | 2019-08-10 15:13:50 | 2019-08-10 15:33:49 | 0:19:59 | 0:10:43 | 0:09:16 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi139 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200457 | 2019-08-08 23:27:32 | 2019-08-10 15:15:18 | 2019-08-10 17:49:19 | 2:34:01 | 2:02:11 | 0:31:50 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4200458 | 2019-08-08 23:27:33 | 2019-08-10 15:15:28 | 2019-08-10 15:57:27 | 0:41:59 | 0:11:36 | 0:30:23 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200459 | 2019-08-08 23:27:34 | 2019-08-10 15:15:34 | 2019-08-10 15:53:33 | 0:37:59 | 0:07:03 | 0:30:56 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi114 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200460 | 2019-08-08 23:27:34 | 2019-08-10 15:17:05 | 2019-08-10 16:07:05 | 0:50:00 | 0:37:53 | 0:12:07 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
fail | 4200461 | 2019-08-08 23:27:35 | 2019-08-10 15:17:08 | 2019-08-10 16:13:08 | 0:56:00 | 0:12:52 | 0:43:08 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi169 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200462 | 2019-08-08 23:27:36 | 2019-08-10 15:17:28 | 2019-08-10 15:59:28 | 0:42:00 | 0:22:31 | 0:19:29 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
pass | 4200463 | 2019-08-08 23:27:37 | 2019-08-10 15:17:56 | 2019-08-10 15:57:55 | 0:39:59 | 0:18:32 | 0:21:27 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 4200464 | 2019-08-08 23:27:38 | 2019-08-10 15:19:31 | 2019-08-10 16:31:31 | 1:12:00 | 0:18:06 | 0:53:54 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi196 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200465 | 2019-08-08 23:27:39 | 2019-08-10 15:20:53 | 2019-08-10 15:40:52 | 0:19:59 | 0:10:23 | 0:09:36 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi134 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200466 | 2019-08-08 23:27:39 | 2019-08-10 15:21:25 | 2019-08-10 16:11:24 | 0:49:59 | 0:16:05 | 0:33:54 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4200467 | 2019-08-08 23:27:40 | 2019-08-10 15:22:19 | 2019-08-10 16:16:18 | 0:53:59 | 0:14:34 | 0:39:25 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi144 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200468 | 2019-08-08 23:27:41 | 2019-08-10 15:23:58 | 2019-08-10 16:01:58 | 0:38:00 | 0:27:05 | 0:10:55 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
fail | 4200469 | 2019-08-08 23:27:42 | 2019-08-10 15:24:28 | 2019-08-10 15:56:27 | 0:31:59 | 0:14:21 | 0:17:38 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi194 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200470 | 2019-08-08 23:27:43 | 2019-08-10 15:24:41 | 2019-08-10 15:56:45 | 0:32:04 | 0:12:57 | 0:19:07 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4200471 | 2019-08-08 23:27:44 | 2019-08-10 15:25:57 | 2019-08-10 16:01:56 | 0:35:59 | 0:16:57 | 0:19:02 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200472 | 2019-08-08 23:27:45 | 2019-08-10 15:26:12 | 2019-08-10 16:42:12 | 1:16:00 | 0:51:50 | 0:24:10 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200473 | 2019-08-08 23:27:45 | 2019-08-10 15:26:27 | 2019-08-10 15:44:26 | 0:17:59 | 0:07:04 | 0:10:55 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi135 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200474 | 2019-08-08 23:27:46 | 2019-08-10 15:27:27 | 2019-08-10 15:57:26 | 0:29:59 | 0:10:31 | 0:19:28 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi134 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200475 | 2019-08-08 23:27:47 | 2019-08-10 15:27:27 | 2019-08-10 15:57:27 | 0:30:00 | 0:15:17 | 0:14:43 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi152 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200476 | 2019-08-08 23:27:48 | 2019-08-10 15:30:13 | 2019-08-10 16:42:13 | 1:12:00 | 0:49:07 | 0:22:53 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
fail | 4200477 | 2019-08-08 23:27:49 | 2019-08-10 15:30:13 | 2019-08-10 15:54:12 | 0:23:59 | 0:06:56 | 0:17:03 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi169 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200478 | 2019-08-08 23:27:50 | 2019-08-10 15:30:49 | 2019-08-10 16:12:49 | 0:42:00 | 0:13:10 | 0:28:50 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi090 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200479 | 2019-08-08 23:27:50 | 2019-08-10 15:31:13 | 2019-08-10 15:53:12 | 0:21:59 | 0:14:15 | 0:07:44 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi074 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200480 | 2019-08-08 23:27:51 | 2019-08-10 15:34:05 | 2019-08-10 16:56:05 | 1:22:00 | 0:51:19 | 0:30:41 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
fail | 4200481 | 2019-08-08 23:27:52 | 2019-08-10 15:34:07 | 2019-08-10 16:04:06 | 0:29:59 | 0:06:52 | 0:23:07 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi196 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200482 | 2019-08-08 23:27:53 | 2019-08-10 15:35:45 | 2019-08-10 16:41:45 | 1:06:00 | 0:13:44 | 0:52:16 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi176 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200483 | 2019-08-08 23:27:54 | 2019-08-10 15:35:45 | 2019-08-10 16:19:45 | 0:44:00 | 0:13:01 | 0:30:59 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200484 | 2019-08-08 23:27:54 | 2019-08-10 15:35:51 | 2019-08-10 16:27:50 | 0:51:59 | 0:06:59 | 0:45:00 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi167 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200485 | 2019-08-08 23:27:55 | 2019-08-10 15:36:25 | 2019-08-10 16:14:24 | 0:37:59 | 0:23:13 | 0:14:46 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
fail | 4200486 | 2019-08-08 23:27:56 | 2019-08-10 15:37:42 | 2019-08-10 16:01:41 | 0:23:59 | 0:11:04 | 0:12:55 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi158 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200487 | 2019-08-08 23:27:57 | 2019-08-10 15:37:50 | 2019-08-10 16:09:50 | 0:32:00 | 0:11:45 | 0:20:15 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi132 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200488 | 2019-08-08 23:27:58 | 2019-08-10 15:37:56 | 2019-08-10 16:09:55 | 0:31:59 | 0:07:10 | 0:24:49 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi175 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200489 | 2019-08-08 23:27:58 | 2019-08-10 15:39:25 | 2019-08-10 16:17:25 | 0:38:00 | 0:26:43 | 0:11:17 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-10T16:04:13.199468+0000 mon.a (mon.0) 998 : cluster [WRN] Health check failed: 4 daemons have recently crashed (RECENT_CRASH)" in cluster log |
pass | 4200491 | 2019-08-08 23:27:59 | 2019-08-10 15:40:44 | 2019-08-10 16:32:43 | 0:51:59 | 0:41:06 | 0:10:53 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
fail | 4200493 | 2019-08-08 23:28:00 | 2019-08-10 15:40:47 | 2019-08-10 16:30:46 | 0:49:59 | 0:16:57 | 0:33:02 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi073 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200494 | 2019-08-08 23:28:01 | 2019-08-10 15:40:52 | 2019-08-10 16:32:52 | 0:52:00 | 0:17:23 | 0:34:37 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi018 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200496 | 2019-08-08 23:28:02 | 2019-08-10 15:40:53 | 2019-08-10 16:36:53 | 0:56:00 | 0:14:28 | 0:41:32 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
fail | 4200498 | 2019-08-08 23:28:03 | 2019-08-10 15:42:22 | 2019-08-10 16:08:21 | 0:25:59 | 0:06:50 | 0:19:09 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi086 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200500 | 2019-08-08 23:28:03 | 2019-08-10 15:44:22 | 2019-08-10 16:42:21 | 0:57:59 | 0:18:10 | 0:39:49 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200502 | 2019-08-08 23:28:04 | 2019-08-10 15:44:22 | 2019-08-10 17:16:22 | 1:32:00 | 0:50:38 | 0:41:22 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200503 | 2019-08-08 23:28:05 | 2019-08-10 15:44:28 | 2019-08-10 16:18:32 | 0:34:04 | 0:15:18 | 0:18:46 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi134 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200505 | 2019-08-08 23:28:06 | 2019-08-10 15:48:15 | 2019-08-10 18:50:17 | 3:02:02 | 0:39:56 | 2:22:06 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
pass | 4200507 | 2019-08-08 23:28:07 | 2019-08-10 15:51:07 | 2019-08-10 16:23:06 | 0:31:59 | 0:15:43 | 0:16:16 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 4200509 | 2019-08-08 23:28:07 | 2019-08-10 15:51:23 | 2019-08-10 16:09:22 | 0:17:59 | 0:06:59 | 0:11:00 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi114 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200511 | 2019-08-08 23:28:08 | 2019-08-10 15:52:36 | 2019-08-10 16:20:34 | 0:27:58 | 0:15:34 | 0:12:24 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi103 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200512 | 2019-08-08 23:28:09 | 2019-08-10 15:53:13 | 2019-08-10 17:27:13 | 1:34:00 | 0:11:11 | 1:22:49 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
pass | 4200514 | 2019-08-08 23:28:10 | 2019-08-10 15:53:13 | 2019-08-10 17:05:13 | 1:12:00 | 0:14:05 | 0:57:55 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
fail | 4200516 | 2019-08-08 23:28:11 | 2019-08-10 15:53:35 | 2019-08-10 16:37:35 | 0:44:00 | 0:17:26 | 0:26:34 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi110 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200518 | 2019-08-08 23:28:11 | 2019-08-10 15:54:14 | 2019-08-10 16:22:13 | 0:27:59 | 0:15:46 | 0:12:13 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi084 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200520 | 2019-08-08 23:28:12 | 2019-08-10 15:56:01 | 2019-08-10 18:12:02 | 2:16:01 | 2:03:57 | 0:12:04 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4200521 | 2019-08-08 23:28:13 | 2019-08-10 15:56:04 | 2019-08-10 17:22:04 | 1:26:00 | 0:12:32 | 1:13:28 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200523 | 2019-08-08 23:28:14 | 2019-08-10 15:56:29 | 2019-08-10 16:20:28 | 0:23:59 | 0:06:55 | 0:17:04 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi198 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200525 | 2019-08-08 23:28:15 | 2019-08-10 15:56:59 | 2019-08-10 16:50:59 | 0:54:00 | 0:07:04 | 0:46:56 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi018 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200527 | 2019-08-08 23:28:16 | 2019-08-10 15:57:24 | 2019-08-10 16:33:23 | 0:35:59 | 0:17:19 | 0:18:40 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi074 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200529 | 2019-08-08 23:28:16 | 2019-08-10 15:57:28 | 2019-08-10 16:31:27 | 0:33:59 | 0:22:06 | 0:11:53 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
fail | 4200530 | 2019-08-08 23:28:17 | 2019-08-10 15:57:28 | 2019-08-10 16:27:27 | 0:29:59 | 0:17:31 | 0:12:28 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi053 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200532 | 2019-08-08 23:28:18 | 2019-08-10 15:57:29 | 2019-08-10 17:25:29 | 1:28:00 | 0:53:20 | 0:34:40 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
fail | 4200534 | 2019-08-08 23:28:19 | 2019-08-10 15:57:51 | 2019-08-10 16:23:50 | 0:25:59 | 0:16:18 | 0:09:41 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi141 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200536 | 2019-08-08 23:28:20 | 2019-08-10 15:57:57 | 2019-08-10 17:03:56 | 1:05:59 | 0:17:20 | 0:48:39 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4200538 | 2019-08-08 23:28:21 | 2019-08-10 15:59:36 | 2019-08-10 17:29:36 | 1:30:00 | 0:10:55 | 1:19:05 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi175 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200539 | 2019-08-08 23:28:21 | 2019-08-10 15:59:36 | 2019-08-10 16:45:35 | 0:45:59 | 0:12:18 | 0:33:41 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi141 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200541 | 2019-08-08 23:28:22 | 2019-08-10 15:59:36 | 2019-08-10 18:03:37 | 2:04:01 | 0:18:08 | 1:45:53 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi026 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200543 | 2019-08-08 23:28:23 | 2019-08-10 16:01:57 | 2019-08-10 17:55:57 | 1:54:00 | 0:12:47 | 1:41:13 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4200545 | 2019-08-08 23:28:24 | 2019-08-10 16:01:58 | 2019-08-10 16:43:57 | 0:41:59 | 0:16:49 | 0:25:10 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200547 | 2019-08-08 23:28:25 | 2019-08-10 16:01:59 | 2019-08-10 17:01:59 | 1:00:00 | 0:45:57 | 0:14:03 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200548 | 2019-08-08 23:28:25 | 2019-08-10 16:04:23 | 2019-08-10 16:26:22 | 0:21:59 | 0:06:47 | 0:15:12 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi192 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200550 | 2019-08-08 23:28:26 | 2019-08-10 16:07:20 | 2019-08-10 16:45:20 | 0:38:00 | 0:10:35 | 0:27:25 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi073 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200552 | 2019-08-08 23:28:27 | 2019-08-10 16:08:38 | 2019-08-10 16:54:37 | 0:45:59 | 0:12:47 | 0:33:12 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
pass | 4200554 | 2019-08-08 23:28:28 | 2019-08-10 16:09:28 | 2019-08-10 18:57:30 | 2:48:02 | 1:13:20 | 1:34:42 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
fail | 4200556 | 2019-08-08 23:28:28 | 2019-08-10 16:09:31 | 2019-08-10 16:39:30 | 0:29:59 | 0:16:51 | 0:13:08 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi026 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200557 | 2019-08-08 23:28:29 | 2019-08-10 16:10:06 | 2019-08-10 16:36:06 | 0:26:00 | 0:17:39 | 0:08:21 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi195 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200559 | 2019-08-08 23:28:30 | 2019-08-10 16:10:06 | 2019-08-10 16:36:06 | 0:26:00 | 0:17:41 | 0:08:19 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi049 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200561 | 2019-08-08 23:28:31 | 2019-08-10 16:11:27 | 2019-08-10 18:49:29 | 2:38:02 | 0:50:01 | 1:48:01 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
fail | 4200563 | 2019-08-08 23:28:32 | 2019-08-10 16:13:06 | 2019-08-10 16:31:06 | 0:18:00 | 0:06:55 | 0:11:05 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi078 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200564 | 2019-08-08 23:28:32 | 2019-08-10 16:13:10 | 2019-08-10 16:39:09 | 0:25:59 | 0:16:28 | 0:09:31 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi203 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200566 | 2019-08-08 23:28:33 | 2019-08-10 16:14:02 | 2019-08-10 16:46:01 | 0:31:59 | 0:13:11 | 0:18:48 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200568 | 2019-08-08 23:28:34 | 2019-08-10 16:14:42 | 2019-08-10 16:54:41 | 0:39:59 | 0:06:51 | 0:33:08 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi145 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200570 | 2019-08-08 23:28:35 | 2019-08-10 16:16:30 | 2019-08-10 16:56:30 | 0:40:00 | 0:26:55 | 0:13:05 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
fail | 4200572 | 2019-08-08 23:28:36 | 2019-08-10 16:16:30 | 2019-08-10 16:42:30 | 0:26:00 | 0:14:05 | 0:11:55 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi198 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200573 | 2019-08-08 23:28:36 | 2019-08-10 16:16:30 | 2019-08-10 16:40:30 | 0:24:00 | 0:15:03 | 0:08:57 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi134 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200575 | 2019-08-08 23:28:37 | 2019-08-10 16:17:27 | 2019-08-10 17:23:28 | 1:06:01 | 0:07:00 | 0:59:01 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi091 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200577 | 2019-08-08 23:28:38 | 2019-08-10 16:18:51 | 2019-08-10 16:58:51 | 0:40:00 | 0:22:24 | 0:17:36 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-10T16:50:48.665045+0000 mon.b (mon.0) 804 : cluster [WRN] Health check failed: 3 daemons have recently crashed (RECENT_CRASH)" in cluster log |
pass | 4200579 | 2019-08-08 23:28:39 | 2019-08-10 16:18:55 | 2019-08-10 17:14:55 | 0:56:00 | 0:45:04 | 0:10:56 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
fail | 4200581 | 2019-08-08 23:28:40 | 2019-08-10 16:19:14 | 2019-08-10 16:41:14 | 0:22:00 | 0:14:33 | 0:07:27 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi116 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200583 | 2019-08-08 23:28:40 | 2019-08-10 16:19:47 | 2019-08-10 17:13:47 | 0:54:00 | 0:12:42 | 0:41:18 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi050 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200584 | 2019-08-08 23:28:41 | 2019-08-10 16:20:45 | 2019-08-10 16:54:45 | 0:34:00 | 0:16:52 | 0:17:08 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
fail | 4200586 | 2019-08-08 23:28:42 | 2019-08-10 16:20:45 | 2019-08-10 17:06:45 | 0:46:00 | 0:12:37 | 0:33:23 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi197 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200588 | 2019-08-08 23:28:43 | 2019-08-10 16:22:28 | 2019-08-10 18:12:30 | 1:50:02 | 0:17:52 | 1:32:10 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200590 | 2019-08-08 23:28:44 | 2019-08-10 16:22:29 | 2019-08-10 17:32:29 | 1:10:00 | 0:51:13 | 0:18:47 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200592 | 2019-08-08 23:28:45 | 2019-08-10 16:23:09 | 2019-08-10 17:15:08 | 0:51:59 | 0:13:28 | 0:38:31 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi132 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200593 | 2019-08-08 23:28:45 | 2019-08-10 16:24:09 | 2019-08-10 17:36:10 | 1:12:01 | 0:51:05 | 0:20:56 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
pass | 4200595 | 2019-08-08 23:28:46 | 2019-08-10 16:26:35 | 2019-08-10 16:54:35 | 0:28:00 | 0:16:23 | 0:11:37 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 4200597 | 2019-08-08 23:28:47 | 2019-08-10 16:27:30 | 2019-08-10 16:47:29 | 0:19:59 | 0:06:50 | 0:13:09 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi078 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200599 | 2019-08-08 23:28:48 | 2019-08-10 16:28:10 | 2019-08-10 16:54:09 | 0:25:59 | 0:10:35 | 0:15:24 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi050 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200601 | 2019-08-08 23:28:49 | 2019-08-10 16:31:04 | 2019-08-10 16:59:04 | 0:28:00 | 0:11:28 | 0:16:32 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200602 | 2019-08-08 23:28:49 | 2019-08-10 16:31:07 | 2019-08-10 17:07:07 | 0:36:00 | 0:06:53 | 0:29:07 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi036 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200604 | 2019-08-08 23:28:50 | 2019-08-10 16:31:30 | 2019-08-10 16:49:29 | 0:17:59 | 0:10:23 | 0:07:36 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi093 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200606 | 2019-08-08 23:28:51 | 2019-08-10 16:31:34 | 2019-08-10 16:59:33 | 0:27:59 | 0:11:58 | 0:16:01 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi176 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200608 | 2019-08-08 23:28:52 | 2019-08-10 16:33:02 | 2019-08-10 18:49:03 | 2:16:01 | 1:57:45 | 0:18:16 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4200610 | 2019-08-08 23:28:53 | 2019-08-10 16:33:02 | 2019-08-10 17:09:01 | 0:35:59 | 0:11:58 | 0:24:01 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200611 | 2019-08-08 23:28:54 | 2019-08-10 16:33:26 | 2019-08-10 16:53:25 | 0:19:59 | 0:06:56 | 0:13:03 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi087 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200613 | 2019-08-08 23:28:54 | 2019-08-10 16:34:39 | 2019-08-10 17:02:38 | 0:27:59 | 0:12:53 | 0:15:06 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi107 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200615 | 2019-08-08 23:28:55 | 2019-08-10 16:36:22 | 2019-08-10 17:00:21 | 0:23:59 | 0:12:30 | 0:11:29 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi037 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200617 | 2019-08-08 23:28:56 | 2019-08-10 16:36:22 | 2019-08-10 17:14:22 | 0:38:00 | 0:21:57 | 0:16:03 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
fail | 4200619 | 2019-08-08 23:28:57 | 2019-08-10 16:36:56 | 2019-08-10 16:52:55 | 0:15:59 | 0:05:57 | 0:10:02 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
[Ansible output truncated: successful yum removal transcripts for librbd1, librados2, mod_fastcgi, and iozone (all rc=0), after which the teuthology failure-log callback crashed while serializing the task result to YAML:]
Traceback (most recent call last):
  File "/home/teuthworker/src/git.ceph.com_git_ceph-cm-ansible_master/callback_plugins/failure_log.py", line 44, in log_failure
    log.error(yaml.safe_dump(failure))
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 309, in safe_dump
    return dump_all([data], stream, Dumper=SafeDumper, **kwds)
  [intermediate yaml representer frames elided: represent_dict/represent_mapping/represent_sequence recursion through representer.py]
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 251, in represent_undefined
    raise RepresenterError("cannot represent an object", data)
RepresenterError: ('cannot represent an object', u'ceph') |
pass | 4200620 | 2019-08-08 23:28:58 | 2019-08-10 16:37:55 | 2019-08-10 18:15:56 | 1:38:01 | 0:50:25 | 0:47:36 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
fail | 4200622 | 2019-08-08 23:28:58 | 2019-08-10 16:39:25 | 2019-08-10 16:59:24 | 0:19:59 | 0:11:40 | 0:08:19 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi134 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200624 | 2019-08-08 23:28:59 | 2019-08-10 16:39:31 | 2019-08-10 17:21:31 | 0:42:00 | 0:16:59 | 0:25:01 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4200626 | 2019-08-08 23:29:00 | 2019-08-10 16:40:45 | 2019-08-10 17:04:44 | 0:23:59 | 0:12:07 | 0:11:52 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi058 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200628 | 2019-08-08 23:29:01 | 2019-08-10 16:41:15 | 2019-08-10 16:59:14 | 0:17:59 | 0:06:56 | 0:11:03 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi090 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200630 | 2019-08-08 23:29:02 | 2019-08-10 16:42:03 | 2019-08-10 17:12:02 | 0:29:59 | 0:12:14 | 0:17:45 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi059 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200631 | 2019-08-08 23:29:03 | 2019-08-10 16:42:15 | 2019-08-10 17:12:14 | 0:29:59 | 0:13:05 | 0:16:54 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4200633 | 2019-08-08 23:29:03 | 2019-08-10 16:42:16 | 2019-08-10 17:42:16 | 1:00:00 | 0:16:44 | 0:43:16 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200635 | 2019-08-08 23:29:04 | 2019-08-10 16:42:24 | 2019-08-10 17:48:24 | 1:06:00 | 0:54:06 | 0:11:54 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200637 | 2019-08-08 23:29:05 | 2019-08-10 16:42:31 | 2019-08-10 18:02:31 | 1:20:00 | 0:18:35 | 1:01:25 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi044 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200639 | 2019-08-08 23:29:06 | 2019-08-10 16:44:10 | 2019-08-10 17:08:08 | 0:23:58 | 0:11:47 | 0:12:11 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi063 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200640 | 2019-08-08 23:29:07 | 2019-08-10 16:45:40 | 2019-08-10 17:47:40 | 1:02:00 | 0:13:00 | 0:49:00 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi153 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200642 | 2019-08-08 23:29:08 | 2019-08-10 16:45:41 | 2019-08-10 18:09:41 | 1:24:00 | 0:46:11 | 0:37:49 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
fail | 4200644 | 2019-08-08 23:29:09 | 2019-08-10 16:45:41 | 2019-08-10 18:39:41 | 1:54:00 | 0:18:27 | 1:35:33 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi083 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200646 | 2019-08-08 23:29:10 | 2019-08-10 16:46:03 | 2019-08-10 17:22:02 | 0:35:59 | 0:13:58 | 0:22:01 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi086 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200647 | 2019-08-08 23:29:10 | 2019-08-10 16:47:49 | 2019-08-10 17:19:48 | 0:31:59 | 0:13:59 | 0:18:00 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi037 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200649 | 2019-08-08 23:29:11 | 2019-08-10 16:49:31 | 2019-08-10 18:31:32 | 1:42:01 | 0:50:50 | 0:51:11 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
fail | 4200651 | 2019-08-08 23:29:12 | 2019-08-10 16:49:36 | 2019-08-10 17:17:35 | 0:27:59 | 0:14:34 | 0:13:25 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi195 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200653 | 2019-08-08 23:29:13 | 2019-08-10 16:49:53 | 2019-08-10 17:19:53 | 0:30:00 | 0:14:06 | 0:15:54 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi125 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200654 | 2019-08-08 23:29:14 | 2019-08-10 16:51:17 | 2019-08-10 17:17:16 | 0:25:59 | 0:12:58 | 0:13:01 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200656 | 2019-08-08 23:29:15 | 2019-08-10 16:52:57 | 2019-08-10 17:12:56 | 0:19:59 | 0:06:59 | 0:13:00 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi029 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200658 | 2019-08-08 23:29:16 | 2019-08-10 16:53:27 | 2019-08-10 17:33:27 | 0:40:00 | 0:24:17 | 0:15:43 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
fail | 4200660 | 2019-08-08 23:29:16 | 2019-08-10 16:54:24 | 2019-08-10 17:38:24 | 0:44:00 | 0:07:04 | 0:36:56 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi068 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200662 | 2019-08-08 23:29:17 | 2019-08-10 16:54:36 | 2019-08-10 17:18:35 | 0:23:59 | 0:14:02 | 0:09:57 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi049 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200663 | 2019-08-08 23:29:18 | 2019-08-10 16:54:41 | 2019-08-10 18:06:41 | 1:12:00 | 0:17:55 | 0:54:05 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi068 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200665 | 2019-08-08 23:29:19 | 2019-08-10 16:54:43 | 2019-08-10 17:32:42 | 0:37:59 | 0:26:45 | 0:11:14 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-10T17:20:06.084878+0000 mon.a (mon.0) 742 : cluster [WRN] Health check failed: 3 daemons have recently crashed (RECENT_CRASH)" in cluster log |
fail | 4200667 | 2019-08-08 23:29:20 | 2019-08-10 16:54:46 | 2019-08-10 17:16:45 | 0:21:59 | 0:06:50 | 0:15:09 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi106 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200669 | 2019-08-08 23:29:21 | 2019-08-10 16:56:23 | 2019-08-10 17:22:23 | 0:26:00 | 0:13:23 | 0:12:37 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi107 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200671 | 2019-08-08 23:29:22 | 2019-08-10 16:56:31 | 2019-08-10 17:22:31 | 0:26:00 | 0:12:11 | 0:13:49 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi144 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200672 | 2019-08-08 23:29:22 | 2019-08-10 16:59:05 | 2019-08-10 17:25:05 | 0:26:00 | 0:15:01 | 0:10:59 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
fail | 4200674 | 2019-08-08 23:29:23 | 2019-08-10 16:59:05 | 2019-08-10 17:57:05 | 0:58:00 | 0:17:01 | 0:40:59 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi175 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200676 | 2019-08-08 23:29:24 | 2019-08-10 16:59:16 | 2019-08-10 18:07:16 | 1:08:00 | 0:24:18 | 0:43:42 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200678 | 2019-08-08 23:29:25 | 2019-08-10 16:59:26 | 2019-08-10 17:59:26 | 1:00:00 | 0:15:19 | 0:44:41 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200680 | 2019-08-08 23:29:26 | 2019-08-10 16:59:35 | 2019-08-10 17:41:34 | 0:41:59 | 0:13:30 | 0:28:29 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi133 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200681 | 2019-08-08 23:29:27 | 2019-08-10 17:00:36 | 2019-08-10 18:00:36 | 1:00:00 | 0:31:29 | 0:28:31 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
pass | 4200683 | 2019-08-08 23:29:27 | 2019-08-10 17:02:08 | 2019-08-10 17:30:07 | 0:27:59 | 0:15:29 | 0:12:30 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
pass | 4200685 | 2019-08-08 23:29:28 | 2019-08-10 17:02:08 | 2019-08-10 17:34:08 | 0:32:00 | 0:13:25 | 0:18:35 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
fail | 4200687 | 2019-08-08 23:29:29 | 2019-08-10 17:02:32 | 2019-08-10 17:42:32 | 0:40:00 | 0:12:37 | 0:27:23 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi144 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200689 | 2019-08-08 23:29:30 | 2019-08-10 17:02:39 | 2019-08-10 17:30:39 | 0:28:00 | 0:11:33 | 0:16:27 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200691 | 2019-08-08 23:29:31 | 2019-08-10 17:04:01 | 2019-08-10 17:22:00 | 0:17:59 | 0:06:49 | 0:11:10 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi141 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200692 | 2019-08-08 23:29:32 | 2019-08-10 17:04:46 | 2019-08-10 17:40:46 | 0:36:00 | 0:13:35 | 0:22:25 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi176 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200694 | 2019-08-08 23:29:33 | 2019-08-10 17:05:31 | 2019-08-10 17:33:31 | 0:28:00 | 0:12:11 | 0:15:49 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi066 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200696 | 2019-08-08 23:29:33 | 2019-08-10 17:07:01 | 2019-08-10 19:23:02 | 2:16:01 | 2:00:12 | 0:15:49 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4200698 | 2019-08-08 23:29:34 | 2019-08-10 17:07:08 | 2019-08-10 17:29:07 | 0:21:59 | 0:11:57 | 0:10:02 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200699 | 2019-08-08 23:29:35 | 2019-08-10 17:07:51 | 2019-08-10 17:31:50 | 0:23:59 | 0:06:52 | 0:17:07 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi103 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200701 | 2019-08-08 23:29:36 | 2019-08-10 17:08:24 | 2019-08-10 17:34:23 | 0:25:59 | 0:13:36 | 0:12:23 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi203 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200703 | 2019-08-08 23:29:37 | 2019-08-10 17:09:03 | 2019-08-10 17:57:03 | 0:48:00 | 0:16:02 | 0:31:58 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi124 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200705 | 2019-08-08 23:29:37 | 2019-08-10 17:12:18 | 2019-08-10 17:46:17 | 0:33:59 | 0:20:40 | 0:13:19 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
pass | 4200707 | 2019-08-08 23:29:38 | 2019-08-10 17:12:18 | 2019-08-10 17:52:17 | 0:39:59 | 0:21:16 | 0:18:43 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 4200709 | 2019-08-08 23:29:39 | 2019-08-10 17:12:57 | 2019-08-10 17:40:57 | 0:28:00 | 0:14:04 | 0:13:56 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi110 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200710 | 2019-08-08 23:29:40 | 2019-08-10 17:13:56 | 2019-08-10 17:39:55 | 0:25:59 | 0:13:39 | 0:12:20 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi198 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200712 | 2019-08-08 23:29:41 | 2019-08-10 17:14:23 | 2019-08-10 17:48:23 | 0:34:00 | 0:17:30 | 0:16:30 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4200714 | 2019-08-08 23:29:42 | 2019-08-10 17:15:11 | 2019-08-10 17:35:10 | 0:19:59 | 0:12:47 | 0:07:12 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi180 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200716 | 2019-08-08 23:29:42 | 2019-08-10 17:15:11 | 2019-08-10 17:33:10 | 0:17:59 | 0:07:07 | 0:10:52 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi032 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200718 | 2019-08-08 23:29:43 | 2019-08-10 17:16:09 | 2019-08-10 17:44:08 | 0:27:59 | 0:11:58 | 0:16:01 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi077 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200719 | 2019-08-08 23:29:44 | 2019-08-10 17:16:24 | 2019-08-10 17:54:24 | 0:38:00 | 0:12:39 | 0:25:21 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4200721 | 2019-08-08 23:29:45 | 2019-08-10 17:16:25 | 2019-08-10 17:56:24 | 0:39:59 | 0:23:48 | 0:16:11 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200723 | 2019-08-08 23:29:46 | 2019-08-10 17:16:47 | 2019-08-10 17:46:46 | 0:29:59 | 0:16:32 | 0:13:27 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200725 | 2019-08-08 23:29:46 | 2019-08-10 17:17:18 | 2019-08-10 18:01:17 | 0:43:59 | 0:23:31 | 0:20:28 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
"2019-08-10T17:51:23.899539+0000 mon.b (mon.0) 1294 : cluster [WRN] Health check failed: 1 daemons have recently crashed (RECENT_CRASH)" in cluster log |
fail | 4200726 | 2019-08-08 23:29:47 | 2019-08-10 17:17:51 | 2019-08-10 17:39:51 | 0:22:00 | 0:13:37 | 0:08:23 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi079 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200728 | 2019-08-08 23:29:48 | 2019-08-10 17:18:37 | 2019-08-10 17:44:36 | 0:25:59 | 0:12:40 | 0:13:19 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
pass | 4200730 | 2019-08-08 23:29:49 | 2019-08-10 17:25:50 | 2019-08-10 19:09:50 | 1:44:00 | 1:19:20 | 0:24:40 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
fail | 4200732 | 2019-08-08 23:29:50 | 2019-08-10 17:27:30 | 2019-08-10 18:07:29 | 0:39:59 | 0:16:04 | 0:23:55 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi110 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200734 | 2019-08-08 23:29:50 | 2019-08-10 17:29:10 | 2019-08-10 17:53:09 | 0:23:59 | 0:15:01 | 0:08:58 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi135 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200735 | 2019-08-08 23:29:51 | 2019-08-10 17:29:40 | 2019-08-10 18:01:40 | 0:32:00 | 0:17:45 | 0:14:15 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi076 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200737 | 2019-08-08 23:29:52 | 2019-08-10 17:30:09 | 2019-08-10 18:32:09 | 1:02:00 | 0:49:52 | 0:12:08 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
fail | 4200739 | 2019-08-08 23:29:53 | 2019-08-10 17:30:57 | 2019-08-10 18:26:56 | 0:55:59 | 0:15:21 | 0:40:38 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi152 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200741 | 2019-08-08 23:29:54 | 2019-08-10 17:31:52 | 2019-08-10 17:53:51 | 0:21:59 | 0:15:22 | 0:06:37 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi022 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200742 | 2019-08-08 23:29:55 | 2019-08-10 17:32:46 | 2019-08-10 18:24:45 | 0:51:59 | 0:11:39 | 0:40:20 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200744 | 2019-08-08 23:29:55 | 2019-08-10 17:32:46 | 2019-08-10 18:12:45 | 0:39:59 | 0:06:44 | 0:33:15 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi197 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200746 | 2019-08-08 23:29:56 | 2019-08-10 17:33:12 | 2019-08-10 18:29:11 | 0:55:59 | 0:35:28 | 0:20:31 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
fail | 4200748 | 2019-08-08 23:29:57 | 2019-08-10 17:33:28 | 2019-08-10 18:27:28 | 0:54:00 | 0:06:48 | 0:47:12 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi130 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200750 | 2019-08-08 23:29:58 | 2019-08-10 17:33:32 | 2019-08-10 18:03:32 | 0:30:00 | 0:17:35 | 0:12:25 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi204 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200752 | 2019-08-08 23:29:59 | 2019-08-10 17:34:10 | 2019-08-10 18:08:10 | 0:34:00 | 0:20:13 | 0:13:47 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 4200753 | 2019-08-08 23:30:00 | 2019-08-10 17:34:12 | 2019-08-10 18:24:12 | 0:50:00 | 0:23:14 | 0:26:46 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-10T18:15:08.494856+0000 mon.b (mon.0) 934 : cluster [WRN] Health check failed: 3 daemons have recently crashed (RECENT_CRASH)" in cluster log |
fail | 4200755 | 2019-08-08 23:30:01 | 2019-08-10 17:34:25 | 2019-08-10 17:52:24 | 0:17:59 | 0:06:55 | 0:11:04 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi203 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200757 | 2019-08-08 23:30:02 | 2019-08-10 17:35:24 | 2019-08-10 17:59:24 | 0:24:00 | 0:17:33 | 0:06:27 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi041 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200759 | 2019-08-08 23:30:02 | 2019-08-10 17:36:11 | 2019-08-10 18:08:10 | 0:31:59 | 0:14:58 | 0:17:01 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi107 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200761 | 2019-08-08 23:30:03 | 2019-08-10 17:36:11 | 2019-08-10 18:14:11 | 0:38:00 | 0:15:55 | 0:22:05 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
fail | 4200762 | 2019-08-08 23:30:04 | 2019-08-10 17:38:25 | 2019-08-10 18:06:24 | 0:27:59 | 0:17:17 | 0:10:42 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi187 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200764 | 2019-08-08 23:30:05 | 2019-08-10 17:38:25 | 2019-08-10 18:16:25 | 0:38:00 | 0:23:06 | 0:14:54 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200766 | 2019-08-08 23:30:06 | 2019-08-10 17:40:06 | 2019-08-10 18:30:06 | 0:50:00 | 0:15:51 | 0:34:09 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200768 | 2019-08-08 23:30:07 | 2019-08-10 17:40:07 | 2019-08-10 18:38:06 | 0:57:59 | 0:18:10 | 0:39:49 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi158 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200770 | 2019-08-08 23:30:08 | 2019-08-10 17:40:47 | 2019-08-10 19:08:47 | 1:28:00 | 0:40:53 | 0:47:07 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
fail | 4200771 | 2019-08-08 23:30:08 | 2019-08-10 17:40:58 | 2019-08-10 18:44:58 | 1:04:00 | 0:15:58 | 0:48:02 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi200 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200773 | 2019-08-08 23:30:09 | 2019-08-10 17:41:50 | 2019-08-10 18:07:49 | 0:25:59 | 0:12:22 | 0:13:37 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
fail | 4200775 | 2019-08-08 23:30:10 | 2019-08-10 17:42:17 | 2019-08-10 18:10:17 | 0:28:00 | 0:13:39 | 0:14:21 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason: Command failed on smithi170 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200777 | 2019-08-08 23:30:11 | 2019-08-10 17:42:34 | 2019-08-10 18:08:33 | 0:25:59 | 0:12:02 | 0:13:57 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200779 | 2019-08-08 23:30:12 | 2019-08-10 17:44:27 | 2019-08-10 18:16:27 | 0:32:00 | 0:12:42 | 0:19:18 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason: Command failed on smithi073 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200780 | 2019-08-08 23:30:13 | 2019-08-10 17:44:38 | 2019-08-10 18:18:37 | 0:33:59 | 0:12:23 | 0:21:36 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi112 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200782 | 2019-08-08 23:30:14 | 2019-08-10 17:46:19 | 2019-08-10 19:06:20 | 1:20:01 | 0:12:57 | 1:07:04 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason: Command failed on smithi171 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200784 | 2019-08-08 23:30:14 | 2019-08-10 17:47:01 | 2019-08-10 20:11:02 | 2:24:01 | 2:06:58 | 0:17:03 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4200786 | 2019-08-08 23:30:15 | 2019-08-10 17:47:42 | 2019-08-10 18:21:41 | 0:33:59 | 0:12:58 | 0:21:01 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200788 | 2019-08-08 23:30:16 | 2019-08-10 17:48:37 | 2019-08-10 18:30:37 | 0:42:00 | 0:06:53 | 0:35:07 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason: Command failed on smithi091 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200790 | 2019-08-08 23:30:17 | 2019-08-10 17:48:37 | 2019-08-10 18:18:37 | 0:30:00 | 0:13:11 | 0:16:49 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason: Command failed on smithi192 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200791 | 2019-08-08 23:30:18 | 2019-08-10 17:49:27 | 2019-08-10 18:11:26 | 0:21:59 | 0:12:04 | 0:09:55 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason: Command failed on smithi159 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200793 | 2019-08-08 23:30:19 | 2019-08-10 17:52:32 | 2019-08-10 18:30:32 | 0:38:00 | 0:25:12 | 0:12:48 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
fail | 4200795 | 2019-08-08 23:30:20 | 2019-08-10 17:52:32 | 2019-08-10 18:14:31 | 0:21:59 | 0:06:59 | 0:15:00 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi093 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200797 | 2019-08-08 23:30:20 | 2019-08-10 17:53:10 | 2019-08-10 18:19:10 | 0:26:00 | 0:13:42 | 0:12:18 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason: Command failed on smithi175 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200799 | 2019-08-08 23:30:21 | 2019-08-10 17:54:06 | 2019-08-10 18:50:06 | 0:56:00 | 0:13:16 | 0:42:44 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason: Command failed on smithi057 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200800 | 2019-08-08 23:30:22 | 2019-08-10 17:54:25 | 2019-08-10 18:28:24 | 0:33:59 | 0:16:11 | 0:17:48 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4200802 | 2019-08-08 23:30:23 | 2019-08-10 17:56:15 | 2019-08-10 18:28:14 | 0:31:59 | 0:15:10 | 0:16:49 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason: Command failed on smithi187 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200804 | 2019-08-08 23:30:24 | 2019-08-10 17:56:26 | 2019-08-10 18:54:26 | 0:58:00 | 0:19:36 | 0:38:24 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
fail | 4200806 | 2019-08-08 23:30:25 | 2019-08-10 17:57:04 | 2019-08-10 18:19:03 | 0:21:59 | 0:12:30 | 0:09:29 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason: Command failed on smithi049 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200808 | 2019-08-08 23:30:26 | 2019-08-10 17:57:07 | 2019-08-10 18:31:06 | 0:33:59 | 0:12:51 | 0:21:08 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4200809 | 2019-08-08 23:30:26 | 2019-08-10 17:59:38 | 2019-08-10 18:41:38 | 0:42:00 | 0:23:44 | 0:18:16 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200811 | 2019-08-08 23:30:27 | 2019-08-10 17:59:38 | 2019-08-10 18:41:38 | 0:42:00 | 0:16:35 | 0:25:25 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200813 | 2019-08-08 23:30:28 | 2019-08-10 18:00:51 | 2019-08-10 18:30:51 | 0:30:00 | 0:16:27 | 0:13:33 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason: Command failed on smithi063 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200815 | 2019-08-08 23:30:29 | 2019-08-10 18:01:19 | 2019-08-10 18:39:19 | 0:38:00 | 0:17:53 | 0:20:07 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason: Command failed on smithi073 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200817 | 2019-08-08 23:30:30 | 2019-08-10 18:01:41 | 2019-08-10 18:27:41 | 0:26:00 | 0:15:53 | 0:10:07 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason: Command failed on smithi026 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200818 | 2019-08-08 23:30:31 | 2019-08-10 18:02:47 | 2019-08-10 19:32:47 | 1:30:00 | 0:51:35 | 0:38:25 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
pass | 4200820 | 2019-08-08 23:30:32 | 2019-08-10 18:03:33 | 2019-08-10 18:33:32 | 0:29:59 | 0:14:40 | 0:15:19 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
fail | 4200822 | 2019-08-08 23:30:33 | 2019-08-10 18:03:41 | 2019-08-10 18:51:41 | 0:48:00 | 0:13:03 | 0:34:57 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason: Command failed on smithi084 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200824 | 2019-08-08 23:30:34 | 2019-08-10 18:06:26 | 2019-08-10 18:30:25 | 0:23:59 | 0:15:36 | 0:08:23 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi182 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200826 | 2019-08-08 23:30:35 | 2019-08-10 18:06:43 | 2019-08-10 19:10:43 | 1:04:00 | 0:51:49 | 0:12:11 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
fail | 4200827 | 2019-08-08 23:30:35 | 2019-08-10 18:07:17 | 2019-08-10 18:27:16 | 0:19:59 | 0:07:01 | 0:12:58 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason: Command failed on smithi169 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200829 | 2019-08-08 23:30:37 | 2019-08-10 18:07:44 | 2019-08-10 18:33:44 | 0:26:00 | 0:16:57 | 0:09:03 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi120 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200831 | 2019-08-08 23:30:37 | 2019-08-10 18:07:51 | 2019-08-10 18:49:50 | 0:41:59 | 0:11:31 | 0:30:28 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200833 | 2019-08-08 23:30:38 | 2019-08-10 18:08:11 | 2019-08-10 18:44:11 | 0:36:00 | 0:06:51 | 0:29:09 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason: Command failed on smithi170 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200835 | 2019-08-08 23:30:39 | 2019-08-10 18:08:12 | 2019-08-10 19:42:12 | 1:34:00 | 0:31:32 | 1:02:28 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
pass | 4200836 | 2019-08-08 23:30:40 | 2019-08-10 18:08:35 | 2019-08-10 18:32:33 | 0:23:58 | 0:12:19 | 0:11:39 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
fail | 4200838 | 2019-08-08 23:30:41 | 2019-08-10 18:09:43 | 2019-08-10 18:41:42 | 0:31:59 | 0:17:23 | 0:14:36 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason: Command failed on smithi192 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200840 | 2019-08-08 23:30:42 | 2019-08-10 18:10:18 | 2019-08-10 18:40:18 | 0:30:00 | 0:18:22 | 0:11:38 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi197 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200842 | 2019-08-08 23:30:43 | 2019-08-10 18:11:28 | 2019-08-10 18:59:27 | 0:47:59 | 0:25:14 | 0:22:45 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-10T18:48:37.603857+0000 mon.b (mon.0) 923 : cluster [WRN] Health check failed: 7 daemons have recently crashed (RECENT_CRASH)" in cluster log
pass | 4200844 | 2019-08-08 23:30:44 | 2019-08-10 18:12:23 | 2019-08-10 19:10:23 | 0:58:00 | 0:34:39 | 0:23:21 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
fail | 4200845 | 2019-08-08 23:30:45 | 2019-08-10 18:12:31 | 2019-08-10 18:42:31 | 0:30:00 | 0:17:24 | 0:12:36 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi131 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200847 | 2019-08-08 23:30:45 | 2019-08-10 18:12:47 | 2019-08-10 18:40:46 | 0:27:59 | 0:17:53 | 0:10:06 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi003 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200849 | 2019-08-08 23:30:46 | 2019-08-10 18:14:26 | 2019-08-10 18:48:25 | 0:33:59 | 0:14:25 | 0:19:34 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
fail | 4200851 | 2019-08-08 23:30:47 | 2019-08-10 18:14:33 | 2019-08-10 18:58:32 | 0:43:59 | 0:06:54 | 0:37:05 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi104 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200853 | 2019-08-08 23:30:48 | 2019-08-10 18:16:11 | 2019-08-10 19:26:11 | 1:10:00 | 0:23:39 | 0:46:21 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200854 | 2019-08-08 23:30:49 | 2019-08-10 18:16:26 | 2019-08-10 18:48:25 | 0:31:59 | 0:17:14 | 0:14:45 | smithi | master | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200856 | 2019-08-08 23:30:50 | 2019-08-10 18:16:28 | 2019-08-10 18:46:27 | 0:29:59 | 0:13:32 | 0:16:27 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi076 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200858 | 2019-08-08 23:30:50 | 2019-08-10 18:18:51 | 2019-08-10 19:38:52 | 1:20:01 | 0:38:32 | 0:41:29 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
fail | 4200860 | 2019-08-08 23:30:51 | 2019-08-10 18:18:52 | 2019-08-10 18:50:51 | 0:31:59 | 0:06:54 | 0:25:05 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi017 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200861 | 2019-08-08 23:30:52 | 2019-08-10 18:19:05 | 2019-08-10 18:49:04 | 0:29:59 | 0:13:48 | 0:16:11 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
fail | 4200863 | 2019-08-08 23:30:53 | 2019-08-10 18:19:11 | 2019-08-10 19:09:11 | 0:50:00 | 0:13:15 | 0:36:45 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi086 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200865 | 2019-08-08 23:30:54 | 2019-08-10 18:21:57 | 2019-08-10 18:49:56 | 0:27:59 | 0:11:40 | 0:16:19 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200867 | 2019-08-08 23:30:55 | 2019-08-10 18:24:28 | 2019-08-10 18:52:27 | 0:27:59 | 0:13:16 | 0:14:43 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi117 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200869 | 2019-08-08 23:30:55 | 2019-08-10 18:24:47 | 2019-08-10 18:48:46 | 0:23:59 | 0:13:40 | 0:10:19 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi121 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200870 | 2019-08-08 23:30:56 | 2019-08-10 18:27:11 | 2019-08-10 18:51:10 | 0:23:59 | 0:13:01 | 0:10:58 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi103 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200872 | 2019-08-08 23:30:57 | 2019-08-10 18:27:18 | 2019-08-10 20:47:19 | 2:20:01 | 2:07:22 | 0:12:39 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4200874 | 2019-08-08 23:30:58 | 2019-08-10 18:27:29 | 2019-08-10 18:53:29 | 0:26:00 | 0:12:08 | 0:13:52 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200876 | 2019-08-08 23:30:59 | 2019-08-10 18:27:42 | 2019-08-10 19:07:42 | 0:40:00 | 0:06:52 | 0:33:08 | smithi | master | centos | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi037 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200878 | 2019-08-08 23:31:00 | 2019-08-10 18:28:31 | 2019-08-10 19:12:30 | 0:43:59 | 0:31:50 | 0:12:09 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
fail | 4200879 | 2019-08-08 23:31:01 | 2019-08-10 18:28:31 | 2019-08-10 18:50:30 | 0:21:59 | 0:13:09 | 0:08:50 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi063 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200881 | 2019-08-08 23:31:01 | 2019-08-10 18:29:13 | 2019-08-10 19:39:13 | 1:10:00 | 0:20:24 | 0:49:36 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
fail | 4200883 | 2019-08-08 23:31:02 | 2019-08-10 18:30:22 | 2019-08-10 19:16:21 | 0:45:59 | 0:27:18 | 0:18:41 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
"2019-08-10T19:00:01.129918+0000 mon.b (mon.0) 236 : cluster [WRN] Health check failed: 1 MDSs report slow metadata IOs (MDS_SLOW_METADATA_IO)" in cluster log
fail | 4200885 | 2019-08-08 23:31:03 | 2019-08-10 18:30:27 | 2019-08-10 19:04:26 | 0:33:59 | 0:06:47 | 0:27:12 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi074 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200887 | 2019-08-08 23:31:04 | 2019-08-10 18:30:33 | 2019-08-10 19:00:32 | 0:29:59 | 0:11:40 | 0:18:19 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi106 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200888 | 2019-08-08 23:31:05 | 2019-08-10 18:30:38 | 2019-08-10 19:16:38 | 0:46:00 | 0:17:18 | 0:28:42 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4200890 | 2019-08-08 23:31:05 | 2019-08-10 18:30:52 | 2019-08-10 19:08:51 | 0:37:59 | 0:13:10 | 0:24:49 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi080 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200892 | 2019-08-08 23:31:06 | 2019-08-10 18:31:07 | 2019-08-10 19:01:07 | 0:30:00 | 0:12:55 | 0:17:05 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi125 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200894 | 2019-08-08 23:31:07 | 2019-08-10 18:31:46 | 2019-08-10 18:55:46 | 0:24:00 | 0:11:16 | 0:12:44 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi159 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200896 | 2019-08-08 23:31:08 | 2019-08-10 18:32:10 | 2019-08-10 19:04:10 | 0:32:00 | 0:13:15 | 0:18:45 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4200897 | 2019-08-08 23:31:09 | 2019-08-10 18:32:35 | 2019-08-10 19:10:34 | 0:37:59 | 0:16:08 | 0:21:51 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200899 | 2019-08-08 23:31:10 | 2019-08-10 18:33:48 | 2019-08-10 19:39:48 | 1:06:00 | 0:50:50 | 0:15:10 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200901 | 2019-08-08 23:31:10 | 2019-08-10 18:33:48 | 2019-08-10 19:15:47 | 0:41:59 | 0:11:21 | 0:30:38 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi143 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200903 | 2019-08-08 23:31:11 | 2019-08-10 18:38:23 | 2019-08-10 19:10:22 | 0:31:59 | 0:13:07 | 0:18:52 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi044 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4200905 | 2019-08-08 23:31:12 | 2019-08-10 18:39:20 | 2019-08-10 19:03:19 | 0:23:59 | 0:12:57 | 0:11:02 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
pass | 4200906 | 2019-08-08 23:31:13 | 2019-08-10 18:39:58 | 2019-08-10 19:45:58 | 1:06:00 | 0:54:56 | 0:11:04 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
fail | 4200908 | 2019-08-08 23:31:14 | 2019-08-10 18:40:19 | 2019-08-10 19:24:19 | 0:44:00 | 0:11:55 | 0:32:05 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi125 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200910 | 2019-08-08 23:31:14 | 2019-08-10 18:40:48 | 2019-08-10 19:08:47 | 0:27:59 | 0:13:12 | 0:14:47 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi017 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4200912 | 2019-08-08 23:31:15 | 2019-08-10 18:41:40 | 2019-08-10 18:59:39 | 0:17:59 | 0:11:51 | 0:06:08 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi203 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200914 | 2019-08-08 23:31:16 | 2019-08-10 18:41:40 | 2019-08-10 20:59:41 | 2:18:01 | 0:50:25 | 1:27:36 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
pass | 4200915 | 2019-08-08 23:31:17 | 2019-08-10 18:41:43 | 2019-08-10 19:31:43 | 0:50:00 | 0:30:40 | 0:19:20 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
fail | 4200917 | 2019-08-08 23:31:18 | 2019-08-10 18:42:32 | 2019-08-10 19:08:32 | 0:26:00 | 0:13:17 | 0:12:43 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi121 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200919 | 2019-08-08 23:31:19 | 2019-08-10 18:44:28 | 2019-08-10 19:10:26 | 0:25:58 | 0:12:07 | 0:13:51 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200921 | 2019-08-08 23:31:19 | 2019-08-10 18:45:00 | 2019-08-10 19:24:59 | 0:39:59 | 0:07:08 | 0:32:51 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi181 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200923 | 2019-08-08 23:31:20 | 2019-08-10 18:46:43 | 2019-08-10 19:22:43 | 0:36:00 | 0:23:48 | 0:12:12 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
fail | 4200924 | 2019-08-08 23:31:21 | 2019-08-10 18:48:40 | 2019-08-10 20:02:40 | 1:14:00 | 0:07:05 | 1:06:55 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi043 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200926 | 2019-08-08 23:31:22 | 2019-08-10 18:48:41 | 2019-08-10 20:08:41 | 1:20:00 | 0:14:45 | 1:05:15 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi053 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200928 | 2019-08-08 23:31:23 | 2019-08-10 18:48:47 | 2019-08-10 19:12:47 | 0:24:00 | 0:14:37 | 0:09:23 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 4200930 | 2019-08-08 23:31:24 | 2019-08-10 18:49:05 | 2019-08-10 19:31:04 | 0:41:59 | 0:26:50 | 0:15:09 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-10T19:17:21.471336+0000 mon.b (mon.0) 859 : cluster [WRN] Health check failed: 4 daemons have recently crashed (RECENT_CRASH)" in cluster log |
fail | 4200932 | 2019-08-08 23:31:24 | 2019-08-10 18:49:06 | 2019-08-10 19:07:05 | 0:17:59 | 0:07:02 | 0:10:57 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi181 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200933 | 2019-08-08 23:31:25 | 2019-08-10 18:49:31 | 2019-08-10 20:07:31 | 1:18:00 | 0:13:47 | 1:04:13 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi153 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200935 | 2019-08-08 23:31:26 | 2019-08-10 18:50:07 | 2019-08-10 19:30:06 | 0:39:59 | 0:12:32 | 0:27:27 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi179 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200937 | 2019-08-08 23:31:27 | 2019-08-10 18:50:07 | 2019-08-10 19:52:06 | 1:01:59 | 0:15:42 | 0:46:17 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
dead | 4200939 | 2019-08-08 23:31:28 | 2019-08-10 18:50:08 | 2019-08-11 06:52:35 | 12:02:27 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
pass | 4200940 | 2019-08-08 23:31:29 | 2019-08-10 18:50:24 | 2019-08-10 19:32:24 | 0:42:00 | 0:16:21 | 0:25:39 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200942 | 2019-08-08 23:31:30 | 2019-08-10 18:50:32 | 2019-08-10 20:08:32 | 1:18:00 | 0:50:01 | 0:27:59 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200944 | 2019-08-08 23:31:30 | 2019-08-10 18:50:52 | 2019-08-10 19:12:52 | 0:22:00 | 0:11:19 | 0:10:41 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi107 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200946 | 2019-08-08 23:31:31 | 2019-08-10 18:51:26 | 2019-08-10 20:15:27 | 1:24:01 | 0:49:01 | 0:35:00 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
pass | 4200948 | 2019-08-08 23:31:32 | 2019-08-10 18:51:43 | 2019-08-10 19:23:42 | 0:31:59 | 0:15:07 | 0:16:52 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 4200950 | 2019-08-08 23:31:33 | 2019-08-10 18:52:42 | 2019-08-10 19:14:41 | 0:21:59 | 0:06:59 | 0:15:00 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi104 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200951 | 2019-08-08 23:31:34 | 2019-08-10 18:53:30 | 2019-08-10 19:13:29 | 0:19:59 | 0:11:02 | 0:08:57 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi009 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200953 | 2019-08-08 23:31:34 | 2019-08-10 18:54:41 | 2019-08-10 19:20:40 | 0:25:59 | 0:11:36 | 0:14:23 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200955 | 2019-08-08 23:31:35 | 2019-08-10 18:56:01 | 2019-08-10 19:36:00 | 0:39:59 | 0:13:10 | 0:26:49 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi094 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200957 | 2019-08-08 23:31:36 | 2019-08-10 18:57:45 | 2019-08-10 19:19:44 | 0:21:59 | 0:10:27 | 0:11:32 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi087 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200958 | 2019-08-08 23:31:37 | 2019-08-10 18:58:34 | 2019-08-10 19:26:33 | 0:27:59 | 0:12:04 | 0:15:55 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi086 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200960 | 2019-08-08 23:31:38 | 2019-08-10 18:59:42 | 2019-08-10 21:11:44 | 2:12:02 | 1:58:03 | 0:13:59 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4200962 | 2019-08-08 23:31:39 | 2019-08-10 18:59:43 | 2019-08-10 19:25:42 | 0:25:59 | 0:12:41 | 0:13:18 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4200964 | 2019-08-08 23:31:39 | 2019-08-10 19:00:34 | 2019-08-10 19:24:33 | 0:23:59 | 0:06:49 | 0:17:10 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi121 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200966 | 2019-08-08 23:31:40 | 2019-08-10 19:01:16 | 2019-08-10 19:37:16 | 0:36:00 | 0:13:07 | 0:22:53 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi072 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200967 | 2019-08-08 23:31:41 | 2019-08-10 19:03:36 | 2019-08-10 19:27:35 | 0:23:59 | 0:12:15 | 0:11:44 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi196 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200969 | 2019-08-08 23:31:42 | 2019-08-10 19:04:11 | 2019-08-10 19:40:11 | 0:36:00 | 0:21:17 | 0:14:43 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
fail | 4200971 | 2019-08-08 23:31:43 | 2019-08-10 19:04:28 | 2019-08-10 19:40:27 | 0:35:59 | 0:06:50 | 0:29:09 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi144 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200973 | 2019-08-08 23:31:43 | 2019-08-10 19:06:37 | 2019-08-10 19:32:36 | 0:25:59 | 0:13:08 | 0:12:51 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi091 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200974 | 2019-08-08 23:31:44 | 2019-08-10 19:07:06 | 2019-08-10 19:27:05 | 0:19:59 | 0:11:59 | 0:08:00 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi080 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200976 | 2019-08-08 23:31:45 | 2019-08-10 19:07:54 | 2019-08-10 19:37:54 | 0:30:00 | 0:16:14 | 0:13:46 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4200978 | 2019-08-08 23:31:46 | 2019-08-10 19:08:33 | 2019-08-10 20:26:33 | 1:18:00 | 0:13:29 | 1:04:31 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi078 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200980 | 2019-08-08 23:31:47 | 2019-08-10 19:08:49 | 2019-08-10 19:40:48 | 0:31:59 | 0:06:51 | 0:25:08 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi110 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200982 | 2019-08-08 23:31:48 | 2019-08-10 19:08:49 | 2019-08-10 19:46:48 | 0:37:59 | 0:11:47 | 0:26:12 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi012 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200984 | 2019-08-08 23:31:48 | 2019-08-10 19:08:53 | 2019-08-10 20:12:53 | 1:04:00 | 0:12:49 | 0:51:11 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4200985 | 2019-08-08 23:31:49 | 2019-08-10 19:09:26 | 2019-08-10 19:37:25 | 0:27:59 | 0:15:23 | 0:12:36 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4200987 | 2019-08-08 23:31:50 | 2019-08-10 19:09:52 | 2019-08-10 20:15:52 | 1:06:00 | 0:49:37 | 0:16:23 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4200989 | 2019-08-08 23:31:51 | 2019-08-10 19:10:24 | 2019-08-10 20:04:24 | 0:54:00 | 0:17:38 | 0:36:22 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi175 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200991 | 2019-08-08 23:31:52 | 2019-08-10 19:10:24 | 2019-08-10 20:14:24 | 1:04:00 | 0:11:05 | 0:52:55 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi077 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200992 | 2019-08-08 23:31:52 | 2019-08-10 19:10:41 | 2019-08-10 20:04:41 | 0:54:00 | 0:17:51 | 0:36:09 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi104 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200993 | 2019-08-08 23:31:53 | 2019-08-10 19:10:41 | 2019-08-10 20:42:42 | 1:32:01 | 1:01:18 | 0:30:43 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
fail | 4200994 | 2019-08-08 23:31:54 | 2019-08-10 19:10:44 | 2019-08-10 20:28:44 | 1:18:00 | 0:10:52 | 1:07:08 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi026 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200995 | 2019-08-08 23:31:55 | 2019-08-10 19:12:45 | 2019-08-10 19:40:44 | 0:27:59 | 0:12:09 | 0:15:50 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi115 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4200996 | 2019-08-08 23:31:56 | 2019-08-10 19:12:48 | 2019-08-10 19:34:47 | 0:21:59 | 0:12:29 | 0:09:30 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi159 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4200997 | 2019-08-08 23:31:56 | 2019-08-10 19:12:53 | 2019-08-10 20:12:53 | 1:00:00 | 0:49:45 | 0:10:15 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
pass | 4200998 | 2019-08-08 23:31:57 | 2019-08-10 19:13:31 | 2019-08-10 19:59:31 | 0:46:00 | 0:25:03 | 0:20:57 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
fail | 4200999 | 2019-08-08 23:31:58 | 2019-08-10 19:14:57 | 2019-08-10 19:44:56 | 0:29:59 | 0:11:53 | 0:18:06 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi125 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201000 | 2019-08-08 23:31:59 | 2019-08-10 19:15:49 | 2019-08-10 19:41:48 | 0:25:59 | 0:12:23 | 0:13:36 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201001 | 2019-08-08 23:32:00 | 2019-08-10 19:16:36 | 2019-08-10 19:36:36 | 0:20:00 | 0:06:48 | 0:13:12 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi041 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201002 | 2019-08-08 23:32:01 | 2019-08-10 19:16:40 | 2019-08-10 20:20:39 | 1:03:59 | 0:34:53 | 0:29:06 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
fail | 4201003 | 2019-08-08 23:32:01 | 2019-08-10 19:19:59 | 2019-08-10 19:43:58 | 0:23:59 | 0:13:10 | 0:10:49 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi169 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201004 | 2019-08-08 23:32:02 | 2019-08-10 19:20:42 | 2019-08-10 19:40:41 | 0:19:59 | 0:12:05 | 0:07:54 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi073 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201005 | 2019-08-08 23:32:03 | 2019-08-10 19:22:59 | 2019-08-10 19:46:58 | 0:23:59 | 0:12:56 | 0:11:03 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi121 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201006 | 2019-08-08 23:32:04 | 2019-08-10 19:23:03 | 2019-08-10 20:05:03 | 0:42:00 | 0:22:59 | 0:19:01 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-10T19:56:14.310626+0000 mon.b (mon.0) 1002 : cluster [WRN] Health check failed: 4 daemons have recently crashed (RECENT_CRASH)" in cluster log |
fail | 4201007 | 2019-08-08 23:32:05 | 2019-08-10 19:23:44 | 2019-08-10 20:07:43 | 0:43:59 | 0:10:35 | 0:33:24 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi114 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201008 | 2019-08-08 23:32:06 | 2019-08-10 19:24:33 | 2019-08-10 19:56:33 | 0:32:00 | 0:14:11 | 0:17:49 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi075 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201009 | 2019-08-08 23:32:06 | 2019-08-10 19:24:35 | 2019-08-10 20:24:35 | 1:00:00 | 0:14:14 | 0:45:46 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi107 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201010 | 2019-08-08 23:32:07 | 2019-08-10 19:25:01 | 2019-08-10 19:53:00 | 0:27:59 | 0:17:10 | 0:10:49 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
pass | 4201011 | 2019-08-08 23:32:08 | 2019-08-10 19:25:57 | 2019-08-10 19:59:56 | 0:33:59 | 0:23:25 | 0:10:34 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
pass | 4201012 | 2019-08-08 23:32:09 | 2019-08-10 19:26:13 | 2019-08-10 20:00:12 | 0:33:59 | 0:16:24 | 0:17:35 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4201013 | 2019-08-08 23:32:10 | 2019-08-10 19:26:35 | 2019-08-10 20:40:35 | 1:14:00 | 0:51:00 | 0:23:00 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4201014 | 2019-08-08 23:32:10 | 2019-08-10 19:27:21 | 2019-08-10 19:47:20 | 0:19:59 | 0:11:18 | 0:08:41 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi102 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201015 | 2019-08-08 23:32:11 | 2019-08-10 19:27:36 | 2019-08-10 20:37:36 | 1:10:00 | 0:43:42 | 0:26:18 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
fail | 4201016 | 2019-08-08 23:32:12 | 2019-08-10 19:30:21 | 2019-08-10 19:48:20 | 0:17:59 | 0:07:02 | 0:10:57 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi057 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201017 | 2019-08-08 23:32:13 | 2019-08-10 19:31:05 | 2019-08-10 20:31:05 | 1:00:00 | 0:12:54 | 0:47:06 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
fail | 4201018 | 2019-08-08 23:32:14 | 2019-08-10 19:31:58 | 2019-08-10 20:03:57 | 0:31:59 | 0:16:56 | 0:15:03 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi181 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201019 | 2019-08-08 23:32:14 | 2019-08-10 19:32:25 | 2019-08-10 20:02:24 | 0:29:59 | 0:11:50 | 0:18:09 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
pass | 4201020 | 2019-08-08 23:32:15 | 2019-08-10 19:32:37 | 2019-08-10 19:58:36 | 0:25:59 | 0:13:36 | 0:12:23 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
fail | 4201021 | 2019-08-08 23:32:16 | 2019-08-10 19:32:49 | 2019-08-10 20:08:48 | 0:35:59 | 0:14:31 | 0:21:28 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi102 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201022 | 2019-08-08 23:32:17 | 2019-08-10 19:34:58 | 2019-08-10 20:02:57 | 0:27:59 | 0:16:28 | 0:11:31 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi099 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201023 | 2019-08-08 23:32:18 | 2019-08-10 19:35:02 | 2019-08-10 21:47:04 | 2:12:02 | 2:00:16 | 0:11:46 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4201024 | 2019-08-08 23:32:19 | 2019-08-10 19:36:17 | 2019-08-10 20:26:16 | 0:49:59 | 0:11:45 | 0:38:14 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201025 | 2019-08-08 23:32:19 | 2019-08-10 19:36:38 | 2019-08-10 20:20:37 | 0:43:59 | 0:10:46 | 0:33:13 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi200 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201026 | 2019-08-08 23:32:20 | 2019-08-10 19:37:31 | 2019-08-10 19:57:30 | 0:19:59 | 0:07:00 | 0:12:59 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi017 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201027 | 2019-08-08 23:32:21 | 2019-08-10 19:37:31 | 2019-08-10 20:01:30 | 0:23:59 | 0:16:22 | 0:07:37 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi180 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201028 | 2019-08-08 23:32:22 | 2019-08-10 19:37:55 | 2019-08-10 20:13:55 | 0:36:00 | 0:24:57 | 0:11:03 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
pass | 4201029 | 2019-08-08 23:32:23 | 2019-08-10 19:39:06 | 2019-08-10 20:17:06 | 0:38:00 | 0:14:17 | 0:23:43 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
pass | 4201030 | 2019-08-08 23:32:23 | 2019-08-10 19:39:14 | 2019-08-10 20:39:14 | 1:00:00 | 0:50:25 | 0:09:35 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
fail | 4201031 | 2019-08-08 23:32:24 | 2019-08-10 19:39:49 | 2019-08-10 20:49:49 | 1:10:00 | 0:10:55 | 0:59:05 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi134 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201032 | 2019-08-08 23:32:25 | 2019-08-10 19:40:25 | 2019-08-10 20:08:24 | 0:27:59 | 0:17:00 | 0:10:59 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4201033 | 2019-08-08 23:32:26 | 2019-08-10 19:40:28 | 2019-08-10 20:16:28 | 0:36:00 | 0:11:27 | 0:24:33 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi145 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201034 | 2019-08-08 23:32:27 | 2019-08-10 19:40:42 | 2019-08-10 20:08:41 | 0:27:59 | 0:14:50 | 0:13:09 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi033 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201035 | 2019-08-08 23:32:28 | 2019-08-10 19:40:46 | 2019-08-10 20:04:45 | 0:23:59 | 0:16:41 | 0:07:18 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi130 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201036 | 2019-08-08 23:32:29 | 2019-08-10 19:40:49 | 2019-08-10 20:36:49 | 0:56:00 | 0:12:59 | 0:43:01 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4201037 | 2019-08-08 23:32:30 | 2019-08-10 19:42:04 | 2019-08-10 20:32:04 | 0:50:00 | 0:15:35 | 0:34:25 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4201038 | 2019-08-08 23:32:30 | 2019-08-10 19:42:14 | 2019-08-10 21:02:14 | 1:20:00 | 0:52:41 | 0:27:19 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4201039 | 2019-08-08 23:32:31 | 2019-08-10 19:44:14 | 2019-08-10 20:24:14 | 0:40:00 | 0:14:56 | 0:25:04 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi041 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201040 | 2019-08-08 23:32:32 | 2019-08-10 19:44:58 | 2019-08-10 20:28:57 | 0:43:59 | 0:12:48 | 0:31:11 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi183 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201041 | 2019-08-08 23:32:33 | 2019-08-10 19:46:13 | 2019-08-10 20:44:13 | 0:58:00 | 0:11:23 | 0:46:37 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi037 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201042 | 2019-08-08 23:32:34 | 2019-08-10 19:46:50 | 2019-08-10 21:06:50 | 1:20:00 | 0:53:14 | 0:26:46 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
fail | 4201043 | 2019-08-08 23:32:35 | 2019-08-10 19:47:00 | 2019-08-10 20:22:59 | 0:35:59 | 0:10:40 | 0:25:19 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi195 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201044 | 2019-08-08 23:32:36 | 2019-08-10 19:47:35 | 2019-08-10 20:13:35 | 0:26:00 | 0:11:17 | 0:14:43 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi017 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201045 | 2019-08-08 23:32:37 | 2019-08-10 19:48:21 | 2019-08-10 20:22:21 | 0:34:00 | 0:13:41 | 0:20:19 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi121 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201046 | 2019-08-08 23:32:37 | 2019-08-10 19:52:21 | 2019-08-10 21:22:22 | 1:30:01 | 0:50:00 | 0:40:01 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
fail | 4201047 | 2019-08-08 23:32:38 | 2019-08-10 19:53:02 | 2019-08-10 20:45:01 | 0:51:59 | 0:06:56 | 0:45:03 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi104 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201048 | 2019-08-08 23:32:39 | 2019-08-10 19:56:48 | 2019-08-10 20:24:47 | 0:27:59 | 0:14:22 | 0:13:37 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi089 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201049 | 2019-08-08 23:32:40 | 2019-08-10 19:57:32 | 2019-08-10 20:37:31 | 0:39:59 | 0:11:23 | 0:28:36 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201050 | 2019-08-08 23:32:41 | 2019-08-10 19:58:52 | 2019-08-10 20:30:51 | 0:31:59 | 0:06:47 | 0:25:12 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi047 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201051 | 2019-08-08 23:32:42 | 2019-08-10 19:59:32 | 2019-08-10 20:47:32 | 0:48:00 | 0:24:49 | 0:23:11 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
fail | 4201052 | 2019-08-08 23:32:43 | 2019-08-10 20:00:12 | 2019-08-10 20:24:11 | 0:23:59 | 0:06:50 | 0:17:09 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi058 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201053 | 2019-08-08 23:32:44 | 2019-08-10 20:00:14 | 2019-08-10 20:32:13 | 0:31:59 | 0:11:20 | 0:20:39 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi137 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201054 | 2019-08-08 23:32:45 | 2019-08-10 20:01:45 | 2019-08-10 20:41:45 | 0:40:00 | 0:07:00 | 0:33:00 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi087 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201055 | 2019-08-08 23:32:46 | 2019-08-10 20:02:26 | 2019-08-10 20:42:25 | 0:39:59 | 0:27:09 | 0:12:50 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-10T20:27:57.531610+0000 mon.a (mon.0) 717 : cluster [WRN] Health check failed: 2 daemons have recently crashed (RECENT_CRASH)" in cluster log |
pass | 4201056 | 2019-08-08 23:32:47 | 2019-08-10 20:02:42 | 2019-08-10 21:36:43 | 1:34:01 | 0:35:24 | 0:58:37 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
fail | 4201057 | 2019-08-08 23:32:48 | 2019-08-10 20:03:11 | 2019-08-10 20:23:11 | 0:20:00 | 0:14:03 | 0:05:57 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi181 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201058 | 2019-08-08 23:32:49 | 2019-08-10 20:03:31 | 2019-08-10 20:25:30 | 0:21:59 | 0:14:21 | 0:07:38 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi192 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201059 | 2019-08-08 23:32:50 | 2019-08-10 20:03:59 | 2019-08-10 20:33:58 | 0:29:59 | 0:15:38 | 0:14:21 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
fail | 4201060 | 2019-08-08 23:32:50 | 2019-08-10 20:04:39 | 2019-08-10 20:32:38 | 0:27:59 | 0:12:37 | 0:15:22 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi112 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201061 | 2019-08-08 23:32:51 | 2019-08-10 20:04:42 | 2019-08-10 21:00:42 | 0:56:00 | 0:19:49 | 0:36:11 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4201062 | 2019-08-08 23:32:52 | 2019-08-10 20:04:47 | 2019-08-10 20:34:46 | 0:29:59 | 0:16:19 | 0:13:40 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4201063 | 2019-08-08 23:32:53 | 2019-08-10 20:05:05 | 2019-08-10 20:33:04 | 0:27:59 | 0:11:10 | 0:16:49 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi059 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201064 | 2019-08-08 23:32:54 | 2019-08-10 20:07:46 | 2019-08-10 20:57:46 | 0:50:00 | 0:32:55 | 0:17:05 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
fail | 4201065 | 2019-08-08 23:32:55 | 2019-08-10 20:07:46 | 2019-08-10 21:31:46 | 1:24:00 | 0:14:44 | 1:09:16 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi102 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201066 | 2019-08-08 23:32:56 | 2019-08-10 20:08:26 | 2019-08-10 20:44:25 | 0:35:59 | 0:12:59 | 0:23:00 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
fail | 4201067 | 2019-08-08 23:32:56 | 2019-08-10 20:08:34 | 2019-08-10 20:54:33 | 0:45:59 | 0:11:19 | 0:34:40 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi194 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201068 | 2019-08-08 23:32:57 | 2019-08-10 20:08:42 | 2019-08-10 20:34:41 | 0:25:59 | 0:11:15 | 0:14:44 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
pass | 4201069 | 2019-08-08 23:32:58 | 2019-08-10 20:08:43 | 2019-08-10 20:42:42 | 0:33:59 | 0:13:51 | 0:20:08 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
fail | 4201070 | 2019-08-08 23:32:59 | 2019-08-10 20:09:05 | 2019-08-10 21:11:04 | 1:01:59 | 0:11:07 | 0:50:52 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi132 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201071 | 2019-08-08 23:33:00 | 2019-08-10 20:11:19 | 2019-08-10 20:31:18 | 0:19:59 | 0:12:14 | 0:07:45 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi146 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201072 | 2019-08-08 23:33:01 | 2019-08-10 20:12:55 | 2019-08-10 22:46:56 | 2:34:01 | 1:58:00 | 0:36:01 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4201073 | 2019-08-08 23:33:01 | 2019-08-10 20:12:55 | 2019-08-10 20:46:54 | 0:33:59 | 0:11:48 | 0:22:11 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201074 | 2019-08-08 23:33:02 | 2019-08-10 20:13:36 | 2019-08-10 20:39:36 | 0:26:00 | 0:07:04 | 0:18:56 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi200 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201075 | 2019-08-08 23:33:03 | 2019-08-10 20:14:10 | 2019-08-10 21:18:10 | 1:04:00 | 0:38:37 | 0:25:23 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
fail | 4201076 | 2019-08-08 23:33:04 | 2019-08-10 20:14:26 | 2019-08-10 21:22:26 | 1:08:00 | 0:14:58 | 0:53:02 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi166 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201077 | 2019-08-08 23:33:05 | 2019-08-10 20:15:42 | 2019-08-10 20:55:41 | 0:39:59 | 0:21:23 | 0:18:36 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
pass | 4201078 | 2019-08-08 23:33:05 | 2019-08-10 20:15:54 | 2019-08-10 20:53:53 | 0:37:59 | 0:19:27 | 0:18:32 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 4201079 | 2019-08-08 23:33:06 | 2019-08-10 20:16:30 | 2019-08-10 20:38:29 | 0:21:59 | 0:06:53 | 0:15:06 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi159 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201080 | 2019-08-08 23:33:07 | 2019-08-10 20:17:10 | 2019-08-10 20:59:10 | 0:42:00 | 0:11:37 | 0:30:23 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi170 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201081 | 2019-08-08 23:33:08 | 2019-08-10 20:20:52 | 2019-08-10 21:00:51 | 0:39:59 | 0:18:18 | 0:21:41 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4201082 | 2019-08-08 23:33:09 | 2019-08-10 20:20:52 | 2019-08-10 21:16:52 | 0:56:00 | 0:12:43 | 0:43:17 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi203 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201083 | 2019-08-08 23:33:10 | 2019-08-10 20:22:36 | 2019-08-10 21:06:35 | 0:43:59 | 0:18:55 | 0:25:04 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
fail | 4201084 | 2019-08-08 23:33:10 | 2019-08-10 20:23:00 | 2019-08-10 20:42:59 | 0:19:59 | 0:10:28 | 0:09:31 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi029 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201085 | 2019-08-08 23:33:11 | 2019-08-10 20:23:12 | 2019-08-10 20:55:11 | 0:31:59 | 0:12:45 | 0:19:14 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4201086 | 2019-08-08 23:33:12 | 2019-08-10 20:24:27 | 2019-08-10 21:30:27 | 1:06:00 | 0:20:04 | 0:45:56 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4201087 | 2019-08-08 23:33:13 | 2019-08-10 20:24:27 | 2019-08-10 21:08:26 | 0:43:59 | 0:16:26 | 0:27:33 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4201088 | 2019-08-08 23:33:14 | 2019-08-10 20:24:36 | 2019-08-10 20:46:35 | 0:21:59 | 0:07:05 | 0:14:54 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi167 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201089 | 2019-08-08 23:33:14 | 2019-08-10 20:24:48 | 2019-08-10 20:50:48 | 0:26:00 | 0:11:09 | 0:14:51 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi009 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201090 | 2019-08-08 23:33:15 | 2019-08-10 20:25:46 | 2019-08-10 21:03:45 | 0:37:59 | 0:10:47 | 0:27:12 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi110 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201091 | 2019-08-08 23:33:16 | 2019-08-10 20:26:18 | 2019-08-10 21:42:18 | 1:16:00 | 1:04:10 | 0:11:50 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
pass | 4201092 | 2019-08-08 23:33:17 | 2019-08-10 20:26:35 | 2019-08-10 20:52:34 | 0:25:59 | 0:14:12 | 0:11:47 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
fail | 4201093 | 2019-08-08 23:33:18 | 2019-08-10 20:28:59 | 2019-08-10 21:02:58 | 0:33:59 | 0:11:37 | 0:22:22 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi200 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201094 | 2019-08-08 23:33:19 | 2019-08-10 20:28:59 | 2019-08-10 21:02:58 | 0:33:59 | 0:11:43 | 0:22:16 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi104 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201095 | 2019-08-08 23:33:19 | 2019-08-10 20:30:52 | 2019-08-10 21:32:52 | 1:02:00 | 0:50:13 | 0:11:47 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
fail | 4201096 | 2019-08-08 23:33:20 | 2019-08-10 20:31:07 | 2019-08-10 21:27:07 | 0:56:00 | 0:16:51 | 0:39:09 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi023 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201097 | 2019-08-08 23:33:21 | 2019-08-10 20:31:20 | 2019-08-10 20:51:19 | 0:19:59 | 0:11:07 | 0:08:52 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi102 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201098 | 2019-08-08 23:33:22 | 2019-08-10 20:32:19 | 2019-08-10 21:04:18 | 0:31:59 | 0:10:51 | 0:21:08 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201099 | 2019-08-08 23:33:23 | 2019-08-10 20:32:19 | 2019-08-10 20:52:18 | 0:19:59 | 0:06:45 | 0:13:14 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi103 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201100 | 2019-08-08 23:33:24 | 2019-08-10 20:32:40 | 2019-08-10 21:28:40 | 0:56:00 | 0:34:24 | 0:21:36 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
pass | 4201101 | 2019-08-08 23:33:25 | 2019-08-10 20:33:05 | 2019-08-10 20:55:05 | 0:22:00 | 0:12:11 | 0:09:49 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
fail | 4201102 | 2019-08-08 23:33:26 | 2019-08-10 20:34:16 | 2019-08-10 20:58:15 | 0:23:59 | 0:11:20 | 0:12:39 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi205 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201103 | 2019-08-08 23:33:26 | 2019-08-10 20:34:43 | 2019-08-10 21:22:43 | 0:48:00 | 0:15:30 | 0:32:30 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi107 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201104 | 2019-08-08 23:33:27 | 2019-08-10 20:34:47 | 2019-08-10 21:14:47 | 0:40:00 | 0:22:59 | 0:17:01 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-10T21:04:16.839996+0000 mon.b (mon.0) 957 : cluster [WRN] Health check failed: 4 daemons have recently crashed (RECENT_CRASH)" in cluster log |
pass | 4201105 | 2019-08-08 23:33:28 | 2019-08-10 20:37:09 | 2019-08-10 21:57:09 | 1:20:00 | 0:39:14 | 0:40:46 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
fail | 4201106 | 2019-08-08 23:33:29 | 2019-08-10 20:37:33 | 2019-08-10 20:59:32 | 0:21:59 | 0:11:36 | 0:10:23 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi113 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201107 | 2019-08-08 23:33:30 | 2019-08-10 20:37:38 | 2019-08-10 21:07:37 | 0:29:59 | 0:10:53 | 0:19:06 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi134 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201108 | 2019-08-08 23:33:31 | 2019-08-10 20:38:44 | 2019-08-10 21:08:43 | 0:29:59 | 0:15:07 | 0:14:52 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
fail | 4201109 | 2019-08-08 23:33:32 | 2019-08-10 20:39:16 | 2019-08-10 21:05:15 | 0:25:59 | 0:10:41 | 0:15:18 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi089 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201110 | 2019-08-08 23:33:32 | 2019-08-10 20:39:37 | 2019-08-10 21:19:37 | 0:40:00 | 0:20:02 | 0:19:58 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4201111 | 2019-08-08 23:33:33 | 2019-08-10 20:40:50 | 2019-08-10 21:46:50 | 1:06:00 | 0:17:59 | 0:48:01 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4201112 | 2019-08-08 23:33:34 | 2019-08-10 20:41:46 | 2019-08-10 21:09:46 | 0:28:00 | 0:11:17 | 0:16:43 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi022 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201113 | 2019-08-08 23:33:35 | 2019-08-10 20:42:41 | 2019-08-10 21:58:41 | 1:16:00 | 0:40:50 | 0:35:10 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
pass | 4201114 | 2019-08-08 23:33:36 | 2019-08-10 20:42:43 | 2019-08-10 21:12:43 | 0:30:00 | 0:17:43 | 0:12:17 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 4201115 | 2019-08-08 23:33:37 | 2019-08-10 20:42:44 | 2019-08-10 21:08:43 | 0:25:59 | 0:11:54 | 0:14:05 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi091 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201116 | 2019-08-08 23:33:38 | 2019-08-10 20:43:01 | 2019-08-10 21:45:01 | 1:02:00 | 0:12:13 | 0:49:47 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi169 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201117 | 2019-08-08 23:33:38 | 2019-08-10 20:44:28 | 2019-08-10 21:10:27 | 0:25:59 | 0:11:31 | 0:14:28 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201118 | 2019-08-08 23:33:39 | 2019-08-10 20:44:28 | 2019-08-10 21:14:27 | 0:29:59 | 0:12:40 | 0:17:19 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi192 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201119 | 2019-08-08 23:33:40 | 2019-08-10 20:45:03 | 2019-08-10 21:03:02 | 0:17:59 | 0:11:22 | 0:06:37 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi093 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201120 | 2019-08-08 23:33:41 | 2019-08-10 20:46:50 | 2019-08-10 21:26:50 | 0:40:00 | 0:15:59 | 0:24:01 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi186 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201121 | 2019-08-08 23:33:42 | 2019-08-10 20:46:56 | 2019-08-10 23:36:58 | 2:50:02 | 2:12:36 | 0:37:26 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4201122 | 2019-08-08 23:33:43 | 2019-08-10 20:47:21 | 2019-08-10 21:15:20 | 0:27:59 | 0:11:27 | 0:16:32 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201123 | 2019-08-08 23:33:44 | 2019-08-10 20:47:33 | 2019-08-10 21:07:33 | 0:20:00 | 0:06:54 | 0:13:06 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi103 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201124 | 2019-08-08 23:33:44 | 2019-08-09 05:45:52 | 2019-08-09 08:39:54 | 2:54:02 | 1:51:36 | 1:02:26 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
fail | 4201125 | 2019-08-08 23:33:45 | 2019-08-10 20:49:51 | 2019-08-10 21:29:50 | 0:39:59 | 0:15:26 | 0:24:33 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi091 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201126 | 2019-08-08 23:33:46 | 2019-08-10 20:51:03 | 2019-08-10 21:29:03 | 0:38:00 | 0:22:24 | 0:15:36 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
fail | 4201127 | 2019-08-08 23:33:47 | 2019-08-10 20:51:21 | 2019-08-10 21:29:20 | 0:37:59 | 0:16:50 | 0:21:09 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi053 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201128 | 2019-08-08 23:33:48 | 2019-08-10 20:52:34 | 2019-08-10 21:54:34 | 1:02:00 | 0:48:21 | 0:13:39 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
fail | 4201129 | 2019-08-08 23:33:49 | 2019-08-10 20:52:36 | 2019-08-10 21:18:35 | 0:25:59 | 0:12:44 | 0:13:15 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi145 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201130 | 2019-08-08 23:33:49 | 2019-08-10 20:54:09 | 2019-08-10 21:24:08 | 0:29:59 | 0:16:33 | 0:13:26 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4201131 | 2019-08-08 23:33:50 | 2019-08-10 20:54:35 | 2019-08-10 21:22:34 | 0:27:59 | 0:14:39 | 0:13:20 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi041 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201132 | 2019-08-08 23:33:51 | 2019-08-10 20:55:06 | 2019-08-10 21:33:06 | 0:38:00 | 0:19:49 | 0:18:11 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
fail | 4201133 | 2019-08-08 23:33:52 | 2019-08-10 20:55:27 | 2019-08-10 21:47:27 | 0:52:00 | 0:12:21 | 0:39:39 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi182 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201134 | 2019-08-08 23:33:53 | 2019-08-10 20:55:43 | 2019-08-10 21:35:42 | 0:39:59 | 0:13:32 | 0:26:27 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4201135 | 2019-08-08 23:33:53 | 2019-08-10 20:58:01 | 2019-08-10 21:32:00 | 0:33:59 | 0:19:42 | 0:14:17 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4201136 | 2019-08-08 23:33:54 | 2019-08-10 20:58:16 | 2019-08-10 21:42:16 | 0:44:00 | 0:17:35 | 0:26:25 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4201137 | 2019-08-08 23:33:55 | 2019-08-10 20:59:26 | 2019-08-10 21:29:25 | 0:29:59 | 0:17:00 | 0:12:59 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi109 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201138 | 2019-08-08 23:33:56 | 2019-08-10 20:59:33 | 2019-08-10 21:31:33 | 0:32:00 | 0:14:38 | 0:17:22 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi176 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201139 | 2019-08-08 23:33:57 | 2019-08-10 20:59:43 | 2019-08-10 21:39:42 | 0:39:59 | 0:12:17 | 0:27:42 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
pass | 4201140 | 2019-08-08 23:33:58 | 2019-08-10 21:00:57 | 2019-08-10 22:24:58 | 1:24:01 | 0:51:06 | 0:32:55 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
fail | 4201141 | 2019-08-08 23:33:59 | 2019-08-10 21:00:58 | 2019-08-10 21:46:57 | 0:45:59 | 0:06:54 | 0:39:05 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi053 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201142 | 2019-08-08 23:33:59 | 2019-08-10 21:02:30 | 2019-08-10 21:32:29 | 0:29:59 | 0:12:28 | 0:17:31 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi101 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201143 | 2019-08-08 23:34:00 | 2019-08-10 21:03:00 | 2019-08-10 21:34:59 | 0:31:59 | 0:11:18 | 0:20:41 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi203 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201144 | 2019-08-08 23:34:01 | 2019-08-10 21:03:00 | 2019-08-10 22:13:00 | 1:10:00 | 0:50:35 | 0:19:25 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
fail | 4201145 | 2019-08-08 23:34:02 | 2019-08-10 21:03:03 | 2019-08-10 21:39:03 | 0:36:00 | 0:01:34 | 0:34:26 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
(teuthology's failure-logging callback crashed while serializing the Ansible result; traceback continues from a repeating chain of yaml representer frames)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 68, in represent_data
    node = self.yaml_representers[None](self, data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 251, in represent_undefined
    raise RepresenterError("cannot represent an object", data)
RepresenterError: ('cannot represent an object', u'apt-get')

Failure object was:
{'smithi155.front.sepia.ceph.com': {'_ansible_no_log': False, u'invocation': {u'module_args': {u'install_weak_deps': True, u'autoremove': False, u'lock_timeout': 0, u'download_dir': None, u'install_repoquery': True, u'enable_plugin': [], u'update_cache': False, u'conf_file': None, u'exclude': [], u'update_only': False, u'installroot': u'/', u'allow_downgrade': False, u'name': [u'http://satellite.front.sepia.ceph.com/pub/katello-ca-consumer-latest.noarch.rpm'], u'download_only': False, u'bugfix': False, u'list': None, u'disable_gpg_check': False, u'disable_excludes': None, u'use_backend': u'auto', u'state': u'present', u'disablerepo': [], u'releasever': None, u'disable_plugin': [], u'enablerepo': [], u'skip_broken': False, u'security': False, u'validate_certs': False}}, 'changed': False, u'msg': u'yum lockfile is held by another process'}}

Traceback (most recent call last):
  File "/home/teuthworker/src/git.ceph.com_git_ceph-cm-ansible_master/callback_plugins/failure_log.py", line 44, in log_failure
    log.error(yaml.safe_dump(failure))
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 309, in safe_dump
    return dump_all([data], stream, Dumper=SafeDumper, **kwds)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 281, in dump_all
    dumper.represent(data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 29, in represent
    node = self.represent_data(data)
  (represent_data / represent_mapping / represent_dict frames repeat; traceback truncated)
"/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 227, in represent_dict return self.represent_mapping(u'tag:yaml.org,2002:map', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 125, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 68, in represent_data node = self.yaml_representers[None](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 251, in represent_undefined raise RepresenterError("cannot represent an object", data)RepresenterError: ('cannot represent an object', u'/')Failure object was: {'smithi168.front.sepia.ceph.com': {'_ansible_no_log': False, u'invocation': {u'module_args': {u'install_weak_deps': True, u'autoremove': False, u'lock_timeout': 0, u'download_dir': None, u'install_repoquery': True, u'update_cache': False, u'conf_file': None, u'exclude': [], u'update_only': False, u'installroot': u'/', u'allow_downgrade': False, u'name': [u'krb5-workstation'], u'download_only': False, u'bugfix': False, u'list': None, u'disable_gpg_check': False, u'disable_excludes': None, u'use_backend': u'auto', u'validate_certs': True, u'state': u'present', u'disablerepo': [], u'releasever': None, u'disable_plugin': [], u'enablerepo': [], u'skip_broken': False, u'security': False, u'enable_plugin': []}}, 'changed': False, u'msg': u'yum lockfile is held by another process'}}Traceback (most recent call last): File 
"/home/teuthworker/src/git.ceph.com_git_ceph-cm-ansible_master/callback_plugins/failure_log.py", line 44, in log_failure log.error(yaml.safe_dump(failure)) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 309, in safe_dump return dump_all([data], stream, Dumper=SafeDumper, **kwds) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 281, in dump_all dumper.represent(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 29, in represent node = self.represent_data(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 227, in represent_dict return self.represent_mapping(u'tag:yaml.org,2002:map', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 125, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 227, in represent_dict return self.represent_mapping(u'tag:yaml.org,2002:map', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 125, in represent_mapping node_value = 
self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 227, in represent_dict return self.represent_mapping(u'tag:yaml.org,2002:map', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 125, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 227, in represent_dict return self.represent_mapping(u'tag:yaml.org,2002:map', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 125, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 68, in represent_data node = self.yaml_representers[None](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 251, in represent_undefined raise RepresenterError("cannot represent an object", data)RepresenterError: ('cannot represent an object', u'/') |
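The `RepresenterError` tracebacks above are secondary failures: the real error ("yum lockfile is held by another process") could not even be logged, because `failure_log.py` calls `yaml.safe_dump()` on the raw Ansible failure object, and `safe_dump` only serializes exact built-in types. Ansible wraps task strings in a `str` subclass, which falls through to `represent_undefined` and raises. A minimal sketch of the mechanism (`TaggedText` below is a hypothetical stand-in for Ansible's wrapper class, not the real one):

```python
import yaml

# Hypothetical stand-in for Ansible's wrapped-string type: any str
# subclass that SafeRepresenter has no exact-type representer for.
class TaggedText(str):
    pass

failure = {'smithi155': {'msg': TaggedText('yum lockfile is held by another process')}}

try:
    # Same call shape as failure_log.py's log.error(yaml.safe_dump(failure))
    yaml.safe_dump(failure)
except yaml.representer.RepresenterError as err:
    # safe_dump matches types exactly; a str subclass is "undefined",
    # just like u'apt-get' and u'/' in the tracebacks above.
    print('RepresenterError:', err)

# Coercing values to plain built-ins first lets the dump succeed:
plain = {host: {k: str(v) for k, v in fields.items()}
         for host, fields in failure.items()}
print(yaml.safe_dump(plain))
```

Coercing to plain types (or using `yaml.dump` with the full, non-safe `Representer`) is the usual way a callback plugin avoids masking the original failure message this way.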
fail | 4201146 | 2019-08-08 23:34:03 | 2019-08-10 21:04:01 | 2019-08-10 21:28:01 | 0:24:00 | 0:16:05 | 0:07:55 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi073 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201147 | 2019-08-08 23:34:04 | 2019-08-10 21:04:19 | 2019-08-10 21:28:19 | 0:24:00 | 0:11:44 | 0:12:16 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201148 | 2019-08-08 23:34:04 | 2019-08-10 21:05:31 | 2019-08-10 21:45:30 | 0:39:59 | 0:06:53 | 0:33:06 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi152 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201149 | 2019-08-08 23:34:05 | 2019-08-10 21:06:10 | 2019-08-10 21:52:09 | 0:45:59 | 0:33:48 | 0:12:11 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
fail | 4201150 | 2019-08-08 23:34:06 | 2019-08-10 21:06:51 | 2019-08-10 21:44:55 | 0:38:04 | 0:06:51 | 0:31:13 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi112 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201151 | 2019-08-08 23:34:07 | 2019-08-10 21:06:52 | 2019-08-10 21:30:51 | 0:23:59 | 0:15:37 | 0:08:22 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi187 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201152 | 2019-08-08 23:34:08 | 2019-08-10 21:07:34 | 2019-08-10 21:29:33 | 0:21:59 | 0:06:49 | 0:15:10 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi167 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201153 | 2019-08-08 23:34:08 | 2019-08-09 05:45:52 | 2019-08-09 07:15:53 | 1:30:01 | 0:25:28 | 1:04:33 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-09T07:04:27.382854+0000 mon.a (mon.0) 779 : cluster [WRN] Health check failed: 4 daemons have recently crashed (RECENT_CRASH)" in cluster log |
pass | 4201154 | 2019-08-08 23:34:09 | 2019-08-10 21:07:39 | 2019-08-10 22:15:39 | 1:08:00 | 0:36:08 | 0:31:52 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
fail | 4201155 | 2019-08-08 23:34:10 | 2019-08-10 21:08:41 | 2019-08-10 21:56:41 | 0:48:00 | 0:11:15 | 0:36:45 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi161 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201156 | 2019-08-08 23:34:11 | 2019-08-10 21:08:44 | 2019-08-10 21:38:44 | 0:30:00 | 0:10:39 | 0:19:21 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi076 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201157 | 2019-08-08 23:34:12 | 2019-08-10 21:08:45 | 2019-08-10 22:34:45 | 1:26:00 | 0:14:34 | 1:11:26 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
fail | 4201158 | 2019-08-08 23:34:13 | 2019-08-10 21:10:01 | 2019-08-10 22:10:01 | 1:00:00 | 0:12:15 | 0:47:45 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi173 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201159 | 2019-08-08 23:34:13 | 2019-08-10 21:10:28 | 2019-08-10 22:10:28 | 1:00:00 | 0:19:51 | 0:40:09 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4201160 | 2019-08-08 23:34:14 | 2019-08-10 21:11:19 | 2019-08-10 21:45:18 | 0:33:59 | 0:17:28 | 0:16:31 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
dead | 4201161 | 2019-08-08 23:34:15 | 2019-08-10 21:11:46 | 2019-08-10 21:45:45 | 0:33:59 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | — | |||
Failure Reason:
reached maximum tries (60) after waiting for 900 seconds |
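The "reached maximum tries (60) after waiting for 900 seconds" message is a bounded-polling timeout: the harness re-checks a condition at a fixed interval and gives up after a set number of attempts (60 tries at 15-second intervals is the 900-second budget quoted here). A minimal sketch of that pattern (`wait_until` is an illustrative helper, not teuthology's actual implementation):

```python
import time

def wait_until(check, tries=60, interval=15):
    """Poll `check` up to `tries` times, sleeping `interval` seconds
    between attempts; raise if it never succeeds."""
    for _ in range(tries):
        if check():
            return True
        time.sleep(interval)
    raise RuntimeError(
        'reached maximum tries (%d) after waiting for %d seconds'
        % (tries, tries * interval))
```

A dead job like the one above means the polled condition (e.g. the node becoming reachable again) never came true within the budget, so the run was abandoned rather than blocking forever.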
pass | 4201162 | 2019-08-08 23:34:16 | 2019-08-10 21:12:59 | 2019-08-10 22:18:58 | 1:05:59 | 0:35:45 | 0:30:14 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
fail | 4201163 | 2019-08-08 23:34:17 | 2019-08-10 21:14:42 | 2019-08-10 21:38:41 | 0:23:59 | 0:06:50 | 0:17:09 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi107 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201164 | 2019-08-08 23:34:18 | 2019-08-10 21:14:46 | 2019-08-10 21:40:45 | 0:25:59 | 0:07:03 | 0:18:56 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi198 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201165 | 2019-08-08 23:34:18 | 2019-08-10 21:14:48 | 2019-08-10 21:44:48 | 0:30:00 | 0:11:12 | 0:18:48 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
nit__.py", line 281, in dump_all dumper.represent(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 29, in represent node = self.represent_data(data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 227, in represent_dict return self.represent_mapping(u'tag:yaml.org,2002:map', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 125, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 227, in represent_dict return self.represent_mapping(u'tag:yaml.org,2002:map', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 125, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 219, in represent_list return self.represent_sequence(u'tag:yaml.org,2002:seq', data) File 
"/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 102, in represent_sequence node_item = self.represent_data(item) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data node = self.yaml_representers[data_types[0]](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 227, in represent_dict return self.represent_mapping(u'tag:yaml.org,2002:map', data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 125, in represent_mapping node_value = self.represent_data(item_value) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 68, in represent_data node = self.yaml_representers[None](self, data) File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 251, in represent_undefined raise RepresenterError("cannot represent an object", data)RepresenterError: ('cannot represent an object', u'nagios-common')Failure object was: {'smithi155.front.sepia.ceph.com': {'msg': u'All items completed', 'changed': True, 'results': [{'ansible_loop_var': u'item', '_ansible_item_label': u'boost-random', 'failed': False, u'changed': True, u'results': [u'Loaded plugins: fastestmirror, langpacks, priorities, product-id, search-\n : disabled-repos, subscription-manager\nResolving Dependencies\n--> Running transaction check\n---> Package boost-random.x86_64 0:1.53.0-27.el7 will be erased\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository 
[Ansible log condensed: yum removed boost-random, leveldb, and xmlstarlet successfully; the python-jinja2, python-ceph, python-flask, and python-requests tasks then failed with 'SSH Error: data could not be sent to remote host "smithi155.front.sepia.ceph.com"'; the remaining yum tasks (boost-random, python-urllib3, python-babel, hdparm, python-markupsafe, python-werkzeug, python-itsdangerous) failed with "The Python 2 bindings for rpm are needed for this module. If you require Python 3 support use the `dnf` Ansible module instead."; while logging these failures, teuthology's failure_log.py callback plugin itself crashed in yaml.safe_dump with RepresenterError: ('cannot represent an object', u'boost-random')] |
||||||||||||||
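The RepresenterError in the failure_log.py traceback above is a known PyYAML behavior: `safe_dump` looks up representers by exact type, so a subclass of `str` (such as Ansible's tagged string types) falls through to `represent_undefined` and raises. A minimal sketch of the failure mode and one possible fix — `TaggedStr` is a hypothetical stand-in, not the actual Ansible class:

```python
import yaml


class TaggedStr(str):
    """Stand-in for a str subclass like Ansible's wrapped strings (hypothetical)."""


# SafeDumper has a representer registered for str, but the lookup is by
# exact type, so TaggedStr falls through to represent_undefined and raises.
try:
    yaml.safe_dump({"item": TaggedStr("boost-random")})
except yaml.representer.RepresenterError as exc:
    print("safe_dump failed:", exc)

# One fix: register the parent type's representer for the subclass.
yaml.SafeDumper.add_representer(TaggedStr, yaml.SafeDumper.represent_str)
print(yaml.safe_dump({"item": TaggedStr("boost-random")}))
```

Another common workaround is to recursively convert the structure to plain `dict`/`list`/`str` before dumping, which is what later teuthology versions effectively do.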
pass | 4201166 | 2019-08-08 23:34:19 | 2019-08-10 21:15:21 | 2019-08-10 22:15:21 | 1:00:00 | 0:11:28 | 0:48:32 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201167 | 2019-08-08 23:34:20 | 2019-08-10 21:17:08 | 2019-08-10 21:45:07 | 0:27:59 | 0:13:00 | 0:14:59 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi158 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
dead | 4201168 | 2019-08-08 23:34:21 | 2019-08-10 21:18:25 | 2019-08-10 21:44:24 | 0:25:59 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | — | |||
Failure Reason:
reached maximum tries (100) after waiting for 600 seconds |
||||||||||||||
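The "reached maximum tries (100) after waiting for 600 seconds" failure above comes from a bounded polling loop in teuthology. An illustrative re-implementation of that pattern (the names `MaxWhileTries` and `wait_until` are assumptions, not teuthology's actual API):

```python
import time


class MaxWhileTries(Exception):
    """Raised when the wait loop exhausts its tries (name assumed)."""


def wait_until(check, tries=100, sleep=6.0):
    # Poll `check` up to `tries` times, sleeping between attempts;
    # 100 tries at 6 s intervals matches the "600 seconds" in the
    # failure reason above.
    for _ in range(tries):
        if check():
            return
        time.sleep(sleep)
    raise MaxWhileTries(
        "reached maximum tries (%d) after waiting for %d seconds"
        % (tries, int(tries * sleep)))
```

In the dead job above, the condition being polled (typically cluster health or machine reimaging) never became true, so the loop gave up and the job was marked dead.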
fail | 4201169 | 2019-08-08 23:34:22 | 2019-08-10 21:18:36 | 2019-08-10 22:40:37 | 1:22:01 | 0:15:25 | 1:06:36 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi130 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
pass | 4201170 | 2019-08-08 23:34:22 | 2019-08-10 21:19:52 | 2019-08-10 23:33:53 | 2:14:01 | 1:57:49 | 0:16:12 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4201171 | 2019-08-08 23:34:23 | 2019-08-10 21:20:13 | 2019-08-10 21:52:12 | 0:31:59 | 0:11:44 | 0:20:15 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201172 | 2019-08-08 23:34:24 | 2019-08-10 21:22:37 | 2019-08-10 22:20:37 | 0:58:00 | 0:06:49 | 0:51:11 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi036 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
pass | 4201173 | 2019-08-08 23:34:25 | 2019-08-10 21:22:37 | 2019-08-10 22:44:37 | 1:22:00 | 0:31:31 | 0:50:29 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
fail | 4201174 | 2019-08-08 23:34:26 | 2019-08-10 21:22:37 | 2019-08-10 21:44:36 | 0:21:59 | 0:12:23 | 0:09:36 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi140 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
pass | 4201175 | 2019-08-08 23:34:26 | 2019-08-10 21:22:44 | 2019-08-10 23:48:46 | 2:26:02 | 0:20:38 | 2:05:24 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
pass | 4201176 | 2019-08-08 23:34:27 | 2019-08-10 21:24:24 | 2019-08-10 22:22:23 | 0:57:59 | 0:19:02 | 0:38:57 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 4201177 | 2019-08-08 23:34:28 | 2019-08-10 21:27:06 | 2019-08-10 21:49:05 | 0:21:59 | 0:06:54 | 0:15:05 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi008 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
fail | 4201178 | 2019-08-08 23:34:29 | 2019-08-10 21:27:08 | 2019-08-10 21:51:07 | 0:23:59 | 0:11:31 | 0:12:28 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi003 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
dead | 4201179 | 2019-08-08 23:34:30 | 2019-08-10 21:28:02 | 2019-08-10 21:46:01 | 0:17:59 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||||
Failure Reason:
SSH connection to smithi168 was lost: 'sudo apt-get update' |
||||||||||||||
fail | 4201180 | 2019-08-08 23:34:30 | 2019-08-10 21:28:34 | 2019-08-10 22:08:33 | 0:39:59 | 0:11:29 | 0:28:30 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi106 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
fail | 4201181 | 2019-08-08 23:34:31 | 2019-08-10 21:28:42 | 2019-08-10 21:46:41 | 0:17:59 | 0:06:53 | 0:11:06 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi173 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
fail | 4201182 | 2019-08-08 23:34:32 | 2019-08-10 21:29:04 | 2019-08-10 23:31:05 | 2:02:01 | 0:12:36 | 1:49:25 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi006 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
pass | 4201183 | 2019-08-08 23:34:33 | 2019-08-10 21:29:22 | 2019-08-10 21:57:21 | 0:27:59 | 0:12:23 | 0:15:36 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4201184 | 2019-08-08 23:34:34 | 2019-08-10 21:29:27 | 2019-08-10 22:11:26 | 0:41:59 | 0:16:41 | 0:25:18 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4201185 | 2019-08-08 23:34:34 | 2019-08-10 21:29:49 | 2019-08-10 22:41:49 | 1:12:00 | 0:52:56 | 0:19:04 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4201186 | 2019-08-08 23:34:35 | 2019-08-10 21:29:51 | 2019-08-10 22:03:50 | 0:33:59 | 0:23:49 | 0:10:10 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
"2019-08-10T21:54:35.640092+0000 mon.b (mon.0) 906 : cluster [WRN] Health check failed: 2 daemons have recently crashed (RECENT_CRASH)" in cluster log |
||||||||||||||
fail | 4201187 | 2019-08-08 23:34:36 | 2019-08-10 21:29:52 | 2019-08-10 21:53:51 | 0:23:59 | 0:11:25 | 0:12:34 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi203 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
fail | 4201188 | 2019-08-08 23:34:37 | 2019-08-10 21:30:28 | 2019-08-10 21:50:27 | 0:19:59 | 0:06:48 | 0:13:11 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi110 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201189 | 2019-08-08 23:34:38 | 2019-08-10 21:30:47 | 2019-08-10 22:46:47 | 1:16:00 | 0:51:31 | 0:24:29 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
fail | 4201190 | 2019-08-08 23:34:38 | 2019-08-10 21:31:06 | 2019-08-10 21:55:05 | 0:23:59 | 0:12:26 | 0:11:33 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi101 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201191 | 2019-08-08 23:34:39 | 2019-08-10 21:31:34 | 2019-08-10 21:59:34 | 0:28:00 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
Failure Reason:
Stale jobs detected, aborting. |
fail | 4201192 | 2019-08-08 23:34:40 | 2019-08-10 21:31:48 | 2019-08-10 22:45:48 | 1:14:00 | 0:11:09 | 1:02:51 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi124 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201193 | 2019-08-08 23:34:41 | 2019-08-10 21:32:02 | 2019-08-10 22:38:02 | 1:06:00 | 0:52:02 | 0:13:58 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
fail | 4201194 | 2019-08-08 23:34:42 | 2019-08-10 21:32:44 | 2019-08-10 23:10:45 | 1:38:01 | 0:06:52 | 1:31:09 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi198 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201195 | 2019-08-08 23:34:43 | 2019-08-10 21:32:54 | 2019-08-10 22:12:54 | 0:40:00 | 0:10:56 | 0:29:04 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi101 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201196 | 2019-08-08 23:34:43 | 2019-08-10 21:33:07 | 2019-08-10 22:09:07 | 0:36:00 | 0:12:40 | 0:23:20 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201197 | 2019-08-08 23:34:44 | 2019-08-10 21:34:26 | 2019-08-10 21:52:25 | 0:17:59 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||||
Failure Reason:
Command failed on smithi155 with status 1: 'sudo grub2-mkconfig -o /boot/grub2/grub.cfg' |
pass | 4201198 | 2019-08-08 23:34:45 | 2019-08-10 21:35:00 | 2019-08-10 22:21:00 | 0:46:00 | 0:24:57 | 0:21:03 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
pass | 4201199 | 2019-08-08 23:34:46 | 2019-08-10 21:35:58 | 2019-08-10 22:33:58 | 0:58:00 | 0:12:05 | 0:45:55 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
fail | 4201200 | 2019-08-08 23:34:47 | 2019-08-10 21:36:44 | 2019-08-10 21:56:43 | 0:19:59 | 0:11:07 | 0:08:52 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi105 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201201 | 2019-08-08 23:34:47 | 2019-08-10 21:38:01 | 2019-08-10 21:58:00 | 0:19:59 | 0:06:50 | 0:13:09 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi033 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201202 | 2019-08-08 23:34:48 | 2019-08-10 21:38:43 | 2019-08-10 22:12:42 | 0:33:59 | 0:22:16 | 0:11:43 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-10T22:03:56.542852+0000 mon.b (mon.0) 765 : cluster [WRN] Health check failed: 4 daemons have recently crashed (RECENT_CRASH)" in cluster log |
fail | 4201203 | 2019-08-08 23:34:49 | 2019-08-10 21:38:45 | 2019-08-10 22:12:45 | 0:34:00 | 0:11:54 | 0:22:06 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi110 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201204 | 2019-08-08 23:34:50 | 2019-08-10 21:39:18 | 2019-08-10 22:03:17 | 0:23:59 | 0:11:47 | 0:12:12 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi067 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201205 | 2019-08-08 23:34:51 | 2019-08-10 21:39:44 | 2019-08-10 22:01:43 | 0:21:59 | 0:11:17 | 0:10:42 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi125 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201206 | 2019-08-08 23:34:52 | 2019-08-10 21:41:01 | 2019-08-10 22:11:01 | 0:30:00 | 0:14:56 | 0:15:04 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
fail | 4201207 | 2019-08-08 23:34:52 | 2019-08-10 21:42:34 | 2019-08-10 22:06:33 | 0:23:59 | 0:12:39 | 0:11:20 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi175 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
dead | 4201208 | 2019-08-08 23:34:53 | 2019-08-10 21:42:34 | 2019-08-10 21:54:33 | 0:11:59 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||||
Failure Reason:
SSH connection to smithi168 was lost: 'rpm -q kernel --last | head -n 1' |
pass | 4201209 | 2019-08-08 23:34:54 | 2019-08-10 21:44:42 | 2019-08-10 22:54:42 | 1:10:00 | 0:53:52 | 0:16:08 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4201210 | 2019-08-08 23:34:55 | 2019-08-10 21:44:42 | 2019-08-10 23:34:43 | 1:50:01 | 0:13:36 | 1:36:25 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi182 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201211 | 2019-08-08 23:34:56 | 2019-08-10 21:44:50 | 2019-08-10 23:16:50 | 1:32:00 | 0:41:32 | 0:50:28 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
fail | 4201212 | 2019-08-08 23:34:56 | 2019-08-10 21:44:53 | 2019-08-10 22:26:52 | 0:41:59 | 0:13:03 | 0:28:56 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi099 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201213 | 2019-08-08 23:34:57 | 2019-08-10 21:44:57 | 2019-08-10 22:04:56 | 0:19:59 | 0:06:50 | 0:13:09 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi117 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201214 | 2019-08-08 23:34:58 | 2019-08-10 21:45:03 | 2019-08-10 22:27:02 | 0:41:59 | 0:12:38 | 0:29:21 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi175 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201215 | 2019-08-08 23:34:59 | 2019-08-10 21:45:08 | 2019-08-10 22:19:08 | 0:34:00 | 0:12:04 | 0:21:56 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201216 | 2019-08-08 23:34:59 | 2019-08-10 21:45:20 | 2019-08-10 22:55:20 | 1:10:00 | 0:13:30 | 0:56:30 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi175 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201217 | 2019-08-08 23:35:00 | 2019-08-10 21:45:32 | 2019-08-10 22:09:31 | 0:23:59 | 0:09:22 | 0:14:37 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 1 | |
Failure Reason:
Ansible playbook failed: 'SSH Error: data could not be sent to remote host "smithi155.front.sepia.ceph.com". Make sure this host can be reached over ssh' (host unreachable; full per-item Ansible output, including user/key provisioning items, truncated) |
u'uid': 1112, u'groups': u'sudo', u'home': u'/home/sbillah', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1112, '_ansible_item_label': {u'ovpn': u'sbillah@syed-machine qVcw+LuFQQxYW7QpzZ3aLA d028c4635289a781f3ebe26a545e084572613b03cc9cde7770018ad0259e4dc9', u'name': u'sbillah', u'key': u'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDEgeHHS5jx9k7QdOEGqZjaxEAPJ6vv/WZXJNpifBpm9Ba1FRA1U3qHV8oX/bBZ08HIBGg8hQOeZ5I7d5HyzR0971W6KVgDF+s6mRN7d+heNi3XmikbJrozLoEiVQNHIsXjUoc655c2y9NR9Lf5FBweSSrbE34jCUqTA3XmZOdbkjY+ngOcDIfNixRG0yZ57p6UqYW0I+Mg68CB7N+Lv4gFvH/968aML7ConABPGs+vnLdNSQbjuibnaoZwzeSgPoaBJEqBCgNkwO8TyaC04okMj2X7/FGxgZNhwF0V5SVpBllWlGqdAigEF0dher88PbzSIFSm/x8PeACSZWkU0QWV Masum@MASUM-PC'}, 'item': {u'ovpn': u'sbillah@syed-machine qVcw+LuFQQxYW7QpzZ3aLA d028c4635289a781f3ebe26a545e084572613b03cc9cde7770018ad0259e4dc9', u'name': u'sbillah', u'key': u'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDEgeHHS5jx9k7QdOEGqZjaxEAPJ6vv/WZXJNpifBpm9Ba1FRA1U3qHV8oX/bBZ08HIBGg8hQOeZ5I7d5HyzR0971W6KVgDF+s6mRN7d+heNi3XmikbJrozLoEiVQNHIsXjUoc655c2y9NR9Lf5FBweSSrbE34jCUqTA3XmZOdbkjY+ngOcDIfNixRG0yZ57p6UqYW0I+Mg68CB7N+Lv4gFvH/968aML7ConABPGs+vnLdNSQbjuibnaoZwzeSgPoaBJEqBCgNkwO8TyaC04okMj2X7/FGxgZNhwF0V5SVpBllWlGqdAigEF0dher88PbzSIFSm/x8PeACSZWkU0QWV Masum@MASUM-PC'}, u'changed': False, u'name': u'sbillah', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': 
None, u'name': u'sbillah', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1113, u'groups': u'sudo', u'home': u'/home/amaredia', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1113, '_ansible_item_label': {u'ovpn': u'ali@freerunner yQjRpRVG7D5KN2HAUjI30g 9d677a34ae98477e6cc8ba1d975d81dcae43a102013b265c63f3ea91e7dacd78', u'name': u'amaredia', u'key': u'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCc1ZbHNXuJn7gxyt7QJ579gEM13MuFdTzsrnedYuRIW2Zlm4rCFr8Oj5SGs5DWBIJnd3W4O2v7PjuVQMxU/zbwnj7mdwBmLFe1cSzOJv2eP1R2uaU5z5C7KNmyPLU++pGKClzb6v5wcEQUq2K35xnuXUU9B935dK+Fm7bK7+HAxj+1vpVeycbPFyPhf6mwbx8dZv4uvZGV2+CGBuyIB/5U2AMJZy9LWim3AR35bip4ftXvSKlAON+RHhnS0toG/6uwp0XlFuGn5H8snaca7L6hGtB4xg1PqA5aMf33Jiv2NVLQo8emHU9J/HeNVS7ksoSZ6InynpLZ6b9uXa9OM9XL ali@parkour'}, 'item': {u'ovpn': u'ali@freerunner yQjRpRVG7D5KN2HAUjI30g 9d677a34ae98477e6cc8ba1d975d81dcae43a102013b265c63f3ea91e7dacd78', u'name': u'amaredia', u'key': u'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCc1ZbHNXuJn7gxyt7QJ579gEM13MuFdTzsrnedYuRIW2Zlm4rCFr8Oj5SGs5DWBIJnd3W4O2v7PjuVQMxU/zbwnj7mdwBmLFe1cSzOJv2eP1R2uaU5z5C7KNmyPLU++pGKClzb6v5wcEQUq2K35xnuXUU9B935dK+Fm7bK7+HAxj+1vpVeycbPFyPhf6mwbx8dZv4uvZGV2+CGBuyIB/5U2AMJZy9LWim3AR35bip4ftXvSKlAON+RHhnS0toG/6uwp0XlFuGn5H8snaca7L6hGtB4xg1PqA5aMf33Jiv2NVLQo8emHU9J/HeNVS7ksoSZ6InynpLZ6b9uXa9OM9XL ali@parkour'}, u'changed': False, u'name': u'amaredia', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', 
u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'amaredia', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1114, u'groups': u'sudo', u'home': u'/home/tserlin', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1114, '_ansible_item_label': {u'ovpn': u'tserlin@annarbor DlKe+OWBPcFAQtWMUAHnwg 6b268bd737ffa5dd38865575ccd444b92cb912c70f5b82dac41f9c50505df4a5', u'name': u'tserlin', u'key': u'ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA2ok6CBUpOVGv2RFws44GtEP5SVxOi4Vie0WSZYLpD55rTfmOtsItN1d1EciVNTUyWuyzMeQGWC4JAd3/2l3gR/5ZwSvd7b/7TCNYxjAMEubRand0GxEoiKhpkJMMmJqcT0KefP8pr31MASWPuERj1+0/IbjJExsvrJaUjqeIfZ+DWR8dC2VYdcH3hsp6AE3mqKX/9693sxe8ROt6qY4WkpZcO4M90unOVa2CnJsYqKaaIC4z3fmKuHZpJZjiJMrg8rtuN4r7bnKWPEVGcahj+i74JWwKR5+2gntLpxw2chIBmf4qFu6HDplddig4V3I/2NLB8soBpgc+m8O7YyYl0w== thomas@easystreet'}, 'item': {u'ovpn': u'tserlin@annarbor DlKe+OWBPcFAQtWMUAHnwg 6b268bd737ffa5dd38865575ccd444b92cb912c70f5b82dac41f9c50505df4a5', u'name': u'tserlin', u'key': u'ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA2ok6CBUpOVGv2RFws44GtEP5SVxOi4Vie0WSZYLpD55rTfmOtsItN1d1EciVNTUyWuyzMeQGWC4JAd3/2l3gR/5ZwSvd7b/7TCNYxjAMEubRand0GxEoiKhpkJMMmJqcT0KefP8pr31MASWPuERj1+0/IbjJExsvrJaUjqeIfZ+DWR8dC2VYdcH3hsp6AE3mqKX/9693sxe8ROt6qY4WkpZcO4M90unOVa2CnJsYqKaaIC4z3fmKuHZpJZjiJMrg8rtuN4r7bnKWPEVGcahj+i74JWwKR5+2gntLpxw2chIBmf4qFu6HDplddig4V3I/2NLB8soBpgc+m8O7YyYl0w== thomas@easystreet'}, u'changed': False, u'name': u'tserlin', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': 
None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'tserlin', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1115, u'groups': u'sudo', u'home': u'/home/dis', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1115, '_ansible_item_label': {u'ovpn': u'dis@zambezi wXYUFvWPBlkCFz+mC4RD6A 11c464dfb2a27986e029f1915732a4f237baba4eade02bb045c8f0d13dfada28', u'name': u'dis'}, 'item': {u'ovpn': u'dis@zambezi wXYUFvWPBlkCFz+mC4RD6A 11c464dfb2a27986e029f1915732a4f237baba4eade02bb045c8f0d13dfada28', u'name': u'dis'}, u'changed': False, u'name': u'dis', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'dis', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1116, u'groups': u'sudo', u'home': u'/home/gregf', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1116, 
'_ansible_item_label': {u'ovpn': u'gregf@kai YhNrPfedhZjbvGjOfmotOA 440cf8595a87cd307790bbf79c3668455a0405945e2b271d873325de222cd72f\ngregf@pudgy VZrk8nWzg7pYLOrZru8dBg c1d1a0e469a134ccf5f5a5525631a6b83efa6970beec3b23809eb0daa3dca47f\ngfarnum@WorkMini2 +bAqcTdU7Ok9bGMcB3A84w 3cff1326561a23cf81dd6495373cb83ed149cee026c6374d72c19b483f4f1f07\ngfarnum@Macbook bxVtolCC9SY3QNlpx3cE1w aff8d28bfb4d693253511d29e8d399196e964fc096594ec705748a5469d44654\ngregf@fedoragreg Jdn8I/sFGcr5Aa/dici6lw 50f88afc35c05ef8454742226f7baf2cd20cb1e2d4d0c9f4a393013877736bfa\n', u'name': u'gregf'}, 'item': {u'ovpn': u'gregf@kai YhNrPfedhZjbvGjOfmotOA 440cf8595a87cd307790bbf79c3668455a0405945e2b271d873325de222cd72f\ngregf@pudgy VZrk8nWzg7pYLOrZru8dBg c1d1a0e469a134ccf5f5a5525631a6b83efa6970beec3b23809eb0daa3dca47f\ngfarnum@WorkMini2 +bAqcTdU7Ok9bGMcB3A84w 3cff1326561a23cf81dd6495373cb83ed149cee026c6374d72c19b483f4f1f07\ngfarnum@Macbook bxVtolCC9SY3QNlpx3cE1w aff8d28bfb4d693253511d29e8d399196e964fc096594ec705748a5469d44654\ngregf@fedoragreg Jdn8I/sFGcr5Aa/dici6lw 50f88afc35c05ef8454742226f7baf2cd20cb1e2d4d0c9f4a393013877736bfa\n', u'name': u'gregf'}, u'changed': False, u'name': u'gregf', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'gregf', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', 
u'uid': 1117, u'groups': u'sudo', u'home': u'/home/joshd', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1117, '_ansible_item_label': {u'ovpn': u'jdurgin@glarthnir ElGaAgbs5VZujuzsQMfCmA 2a156febba037d02d1099bc11d1e697d34300b2c420f2df664b5b0de1248f983\njdurgin@new-angeles jqa015PRJcHSp5WHcwJjUg 42113e1156382fde866d691f30584f6b30c3dfc21317ae89b4267efb177d982c\n', u'name': u'joshd'}, 'item': {u'ovpn': u'jdurgin@glarthnir ElGaAgbs5VZujuzsQMfCmA 2a156febba037d02d1099bc11d1e697d34300b2c420f2df664b5b0de1248f983\njdurgin@new-angeles jqa015PRJcHSp5WHcwJjUg 42113e1156382fde866d691f30584f6b30c3dfc21317ae89b4267efb177d982c\n', u'name': u'joshd'}, u'changed': False, u'name': u'joshd', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'joshd', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1118, u'groups': u'sudo', u'home': u'/home/davidz', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1118, '_ansible_item_label': {u'ovpn': u'dzafman@ubuntu-laptop-vm NY9y9tqLY1beEcXDwMavsQ c869d42fae1890574577a8014d97d1247f1a13cb6337037d2714f1d236fc65d2\ndzafman@ubuntu16 2a0rAy5QmNFSEcATNz2h9A 
b7c11fbb0911fc4ac0216a1a8eac8359a9e8f43d69126db6b45cbeabd358c2b4\ndzafman@ubuntu-1804 PN1pkeGHGloB0K+IZrfB0g f1c01b447b9ec3fc048c32f606a33fb488ff621e11aa305ac979501030202658\n', u'name': u'davidz'}, 'item': {u'ovpn': u'dzafman@ubuntu-laptop-vm NY9y9tqLY1beEcXDwMavsQ c869d42fae1890574577a8014d97d1247f1a13cb6337037d2714f1d236fc65d2\ndzafman@ubuntu16 2a0rAy5QmNFSEcATNz2h9A b7c11fbb0911fc4ac0216a1a8eac8359a9e8f43d69126db6b45cbeabd358c2b4\ndzafman@ubuntu-1804 PN1pkeGHGloB0K+IZrfB0g f1c01b447b9ec3fc048c32f606a33fb488ff621e11aa305ac979501030202658\n', u'name': u'davidz'}, u'changed': False, u'name': u'davidz', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'davidz', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1119, u'groups': u'sudo', u'home': u'/home/gmeno', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1119, '_ansible_item_label': {u'ovpn': u'gmeno@gmeno-virtual-machine FKFu8B2pMqotpmEVAO1few 8229574e499eaf767a408909f5afdf2e2a0bb8f3e61b18d63a651f7102c68dbc', u'name': u'gmeno'}, 'item': {u'ovpn': u'gmeno@gmeno-virtual-machine FKFu8B2pMqotpmEVAO1few 8229574e499eaf767a408909f5afdf2e2a0bb8f3e61b18d63a651f7102c68dbc', u'name': u'gmeno'}, u'changed': False, u'name': u'gmeno', 'failed': False, u'state': 
u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'gmeno', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1120, u'groups': u'sudo', u'home': u'/home/ivancich', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1120, '_ansible_item_label': {u'ovpn': u'ivancich@ann.arbor Kt2vxZ3Ge609mHfjx0W4Cw aaa55a9e2b5276b62a21cd3c401b365c5c2693e39efccb2f9edefafefa1dc8b1', u'name': u'ivancich'}, 'item': {u'ovpn': u'ivancich@ann.arbor Kt2vxZ3Ge609mHfjx0W4Cw aaa55a9e2b5276b62a21cd3c401b365c5c2693e39efccb2f9edefafefa1dc8b1', u'name': u'ivancich'}, u'changed': False, u'name': u'ivancich', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': 
u'ivancich', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1121, u'groups': u'sudo', u'home': u'/home/wusui', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1121, '_ansible_item_label': {u'ovpn': u'wusui@ubuntu /nElLTGqxiOr/hD6eziF5A 2e244e2b62fa42dadf3a3a1dbcc29475410cee6550b0c6b3603c135853937551\nwusui@thinkpad tu2DxDcllIdwb5ewldgT0g 1590a7d9f1377b0094e9ba4277e7bcbe6374791f0b3d3df93026345c058c93f5\n', u'name': u'wusui'}, 'item': {u'ovpn': u'wusui@ubuntu /nElLTGqxiOr/hD6eziF5A 2e244e2b62fa42dadf3a3a1dbcc29475410cee6550b0c6b3603c135853937551\nwusui@thinkpad tu2DxDcllIdwb5ewldgT0g 1590a7d9f1377b0094e9ba4277e7bcbe6374791f0b3d3df93026345c058c93f5\n', u'name': u'wusui'}, u'changed': False, u'name': u'wusui', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'wusui', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1122, u'groups': u'sudo', u'home': u'/home/zyan', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1122, '_ansible_item_label': {u'ovpn': u'zyan@redhat OP16IkReCatMfA4Mf3pkdQ 
b0262be71ef008e2f7d585e34431dc2757c1e22ac1aa844863be533bf873d304', u'name': u'zyan'}, 'item': {u'ovpn': u'zyan@redhat OP16IkReCatMfA4Mf3pkdQ b0262be71ef008e2f7d585e34431dc2757c1e22ac1aa844863be533bf873d304', u'name': u'zyan'}, u'changed': False, u'name': u'zyan', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'zyan', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1123, u'groups': u'sudo', u'home': u'/home/yuriw', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1123, '_ansible_item_label': {u'ovpn': u'yuriweinstein@Yuris-MacBook-Pro.local wKA8mCcsdsk/CE+1d9GiPw caaf5bf5eb31ef269e3d0bc34d45dd761c0bb0907172eea6e754434de4603db7\nyuriw@home 02TZyR3JHJMxEQob80ICNA 85b4aa0f69f6d469dae0bb3dca4baaf222e164927ed7eed2082caae8f4717e48\nyuriweinstein@xenon1 C9eVdLb/i18lMcMG20rGPw eaddd0e831a77de3f35cb19e307bd6f9aeb0cc03eff1ae58490d7db33ced4311\nyuriw@yuriw-RH 5ivdxgFO4eIkbXVhl8xkvw 59212d29b8b42d9fe457c1b2c43d774e1d25807be10dcc1252d4aec63b97a467\n', u'name': u'yuriw'}, 'item': {u'ovpn': u'yuriweinstein@Yuris-MacBook-Pro.local wKA8mCcsdsk/CE+1d9GiPw caaf5bf5eb31ef269e3d0bc34d45dd761c0bb0907172eea6e754434de4603db7\nyuriw@home 02TZyR3JHJMxEQob80ICNA 
85b4aa0f69f6d469dae0bb3dca4baaf222e164927ed7eed2082caae8f4717e48\nyuriweinstein@xenon1 C9eVdLb/i18lMcMG20rGPw eaddd0e831a77de3f35cb19e307bd6f9aeb0cc03eff1ae58490d7db33ced4311\nyuriw@yuriw-RH 5ivdxgFO4eIkbXVhl8xkvw 59212d29b8b42d9fe457c1b2c43d774e1d25807be10dcc1252d4aec63b97a467\n', u'name': u'yuriw'}, u'changed': False, u'name': u'yuriw', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'yuriw', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1124, u'groups': u'sudo', u'home': u'/home/tamil', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1124, '_ansible_item_label': {u'ovpn': u'tmuthamizhan@mac /1CBLC6rxqzJzPspffZV0g 4d1dfff2e097a7fc2a83ea73eccad2f0e453a6338e18c87b4d89bf370733e29c\ntamil@tamil-VirtualBox M22QdmhkSPj9aEcTiuIVfg 8e76e06b14de99318441c75a96e635a92f5bddc54a40b45276191f6829c6b239\n', u'name': u'tamil'}, 'item': {u'ovpn': u'tmuthamizhan@mac /1CBLC6rxqzJzPspffZV0g 4d1dfff2e097a7fc2a83ea73eccad2f0e453a6338e18c87b4d89bf370733e29c\ntamil@tamil-VirtualBox M22QdmhkSPj9aEcTiuIVfg 8e76e06b14de99318441c75a96e635a92f5bddc54a40b45276191f6829c6b239\n', u'name': u'tamil'}, u'changed': False, u'name': u'tamil', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, 
u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'tamil', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1125, u'groups': u'sudo', u'home': u'/home/jowilkin', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1125, '_ansible_item_label': {u'ovpn': u'jowilkin@jowilkin 2r8xOv/eCTcHr+HSsMkPYg 0ac416d2dc139628144dfa0822af8cc0a455f5f5f3e4d1b9713c14115c062218\njohn@osd-host 7zjDTxAYhCmTX+Az4SJaoA 7d924233fdef168e2c5c01258aa349de108629ef2ff90d17c0b96acf22dac7c2\njohn@admin-host 7cpk7iJ1Hg2vk4bPDovKmA 05765178f27af6dc4e43e47f52d773aac3bc1b3f1dd998bdbf479b951bfd2efb\n', u'name': u'jowilkin'}, 'item': {u'ovpn': u'jowilkin@jowilkin 2r8xOv/eCTcHr+HSsMkPYg 0ac416d2dc139628144dfa0822af8cc0a455f5f5f3e4d1b9713c14115c062218\njohn@osd-host 7zjDTxAYhCmTX+Az4SJaoA 7d924233fdef168e2c5c01258aa349de108629ef2ff90d17c0b96acf22dac7c2\njohn@admin-host 7cpk7iJ1Hg2vk4bPDovKmA 05765178f27af6dc4e43e47f52d773aac3bc1b3f1dd998bdbf479b951bfd2efb\n', u'name': u'jowilkin'}, u'changed': False, u'name': u'jowilkin', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, 
u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'jowilkin', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1126, u'groups': u'sudo', u'home': u'/home/bhubbard', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1126, '_ansible_item_label': {u'ovpn': u'brad@dhcp-1-165.bne.redhat.com 4oShQI9+vYtX5gA47np/Sw 3fc7df5afa772752d8eee15c01d550cc1dcc88b6e940abc9f9f8f26102d239d4', u'name': u'bhubbard'}, 'item': {u'ovpn': u'brad@dhcp-1-165.bne.redhat.com 4oShQI9+vYtX5gA47np/Sw 3fc7df5afa772752d8eee15c01d550cc1dcc88b6e940abc9f9f8f26102d239d4', u'name': u'bhubbard'}, u'changed': False, u'name': u'bhubbard', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'bhubbard', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1127, u'groups': u'sudo', u'home': u'/home/yehudasa', u'move_home': False, u'append': True, 
'_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1127, '_ansible_item_label': {u'ovpn': u'yehudasa@yehuda-2 NGXHeO0lAvRFfYLalffzEw 0d3489b09f48ad5fa9d1ec5944d9c8020daf852488f389f3d44fe63d3f651f34\nyehuda@yehuda-940X3G shisK5LjI6fr3ZBJy/xX8A 49522899bd26130086ce668079f0062d987d85dfa5767dd5c34e5953db97997a\nyehudasa@yehudasa-desktop OT1MhoO0WihhvkKztqW0Uw 12a4d6b54390b9df7f5af3bd6b533f3c1fee0c7b9fbb79f0a87bcb28b182c7d4\n', u'name': u'yehudasa'}, 'item': {u'ovpn': u'yehudasa@yehuda-2 NGXHeO0lAvRFfYLalffzEw 0d3489b09f48ad5fa9d1ec5944d9c8020daf852488f389f3d44fe63d3f651f34\nyehuda@yehuda-940X3G shisK5LjI6fr3ZBJy/xX8A 49522899bd26130086ce668079f0062d987d85dfa5767dd5c34e5953db97997a\nyehudasa@yehudasa-desktop OT1MhoO0WihhvkKztqW0Uw 12a4d6b54390b9df7f5af3bd6b533f3c1fee0c7b9fbb79f0a87bcb28b182c7d4\n', u'name': u'yehudasa'}, u'changed': False, u'name': u'yehudasa', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'yehudasa', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1128, u'groups': u'sudo', u'home': u'/home/dang', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1128, '_ansible_item_label': {u'ovpn': u'dang@sidious w0CNW2g9K1WiRenVGYWNUA 
4f59d761bfab3659115da2b3b80a486266f77b09d8527983217d15648b4f92b4', u'name': u'dang'}, 'item': {u'ovpn': u'dang@sidious w0CNW2g9K1WiRenVGYWNUA 4f59d761bfab3659115da2b3b80a486266f77b09d8527983217d15648b4f92b4', u'name': u'dang'}, u'changed': False, u'name': u'dang', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'dang', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1129, u'groups': u'sudo', u'home': u'/home/branto', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1129, '_ansible_item_label': {u'ovpn': u'branto@work ganye+HpG3dkMEik6WtTng 018f3f9b9d49dcefa701ea304a8e58f002c46f0650edae220a0a7ab1bce36aeb', u'name': u'branto'}, 'item': {u'ovpn': u'branto@work ganye+HpG3dkMEik6WtTng 018f3f9b9d49dcefa701ea304a8e58f002c46f0650edae220a0a7ab1bce36aeb', u'name': u'branto'}, u'changed': False, u'name': u'branto', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': 
False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'branto', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1130, u'groups': u'sudo', u'home': u'/home/xiaoxichen', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1130, '_ansible_item_label': {u'ovpn': u'xiaoxichen@ebay RvJJ7BhIehpoPtggrwnskQ 862ecfe7e15dfab86d61df86856bfe06cbb99f240f6f03851f7f9e1a255327d6', u'name': u'xiaoxichen'}, 'item': {u'ovpn': u'xiaoxichen@ebay RvJJ7BhIehpoPtggrwnskQ 862ecfe7e15dfab86d61df86856bfe06cbb99f240f6f03851f7f9e1a255327d6', u'name': u'xiaoxichen'}, u'changed': False, u'name': u'xiaoxichen', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'xiaoxichen', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1131, u'groups': u'sudo', u'home': u'/home/ffilz', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1131, '_ansible_item_label': {u'ovpn': u'ffilz@redhat.com 
6YdvqxkKfmDWGD2s0wA7Ww 4ce64d08686e34e559ccec2498df433b155b70c9ebccaec616b6b34f0f0c246e', u'name': u'ffilz'}, 'item': {u'ovpn': u'ffilz@redhat.com 6YdvqxkKfmDWGD2s0wA7Ww 4ce64d08686e34e559ccec2498df433b155b70c9ebccaec616b6b34f0f0c246e', u'name': u'ffilz'}, u'changed': False, u'name': u'ffilz', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'ffilz', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1132, u'groups': u'sudo', u'home': u'/home/joao', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1132, '_ansible_item_label': {u'ovpn': u'joao@magrathea eSS2/gvK7ALE6L9bITfuMA c3caaeecee3f43e39b7a81fad50e0d874359c70a9c41b77c661511c71f733909\njoao@timesink 9S3oER36HheVupjRpnLz6A 9dbc964184244e9da269942dc73ec9ebba6594bcccfdc0eb09562b58b4542162\n', u'name': u'joao'}, 'item': {u'ovpn': u'joao@magrathea eSS2/gvK7ALE6L9bITfuMA c3caaeecee3f43e39b7a81fad50e0d874359c70a9c41b77c661511c71f733909\njoao@timesink 9S3oER36HheVupjRpnLz6A 9dbc964184244e9da269942dc73ec9ebba6594bcccfdc0eb09562b58b4542162\n', u'name': u'joao'}, u'changed': False, u'name': u'joao', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, 
u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'joao', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1133, u'groups': u'sudo', u'home': u'/home/nhm', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1133, '_ansible_item_label': {u'ovpn': u'nh@latte JCH9icAtDPj951rgbPKJyw 9a3d30ec1ec9467ccdc8bdbbfacffd7396fb38e046199ae37b2b7b69dbf37480\nnhm@espresso +YYZPT29wYzY5ooaRzabCQ 1ee041dd58b9ec6eb678c47632ece7cf6c24e23bcbac28a77a82af05ba6cc148\nnhm@mocha HgOGOfkBEzJihFsKmPRfKQ 2e17f3ba0b90df7a36f19a7c8f64d2aa8f966b2794c94caa110d313e927a1c1b\n', u'name': u'nhm'}, 'item': {u'ovpn': u'nh@latte JCH9icAtDPj951rgbPKJyw 9a3d30ec1ec9467ccdc8bdbbfacffd7396fb38e046199ae37b2b7b69dbf37480\nnhm@espresso +YYZPT29wYzY5ooaRzabCQ 1ee041dd58b9ec6eb678c47632ece7cf6c24e23bcbac28a77a82af05ba6cc148\nnhm@mocha HgOGOfkBEzJihFsKmPRfKQ 2e17f3ba0b90df7a36f19a7c8f64d2aa8f966b2794c94caa110d313e927a1c1b\n', u'name': u'nhm'}, u'changed': False, u'name': u'nhm', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, 
u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'nhm', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1134, u'groups': u'sudo', u'home': u'/home/jj', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1134, '_ansible_item_label': {u'ovpn': u'jj@aurora PFzIfG7NjrCYxJvxd//soA 395884ddf2caa0043f97e4f98530cc3ad646d6f269802a2b79ca1ea7278ee006\njj@metatron iQogIw/KQtewT7oj9Mkivw 0881e5ceb5897e8a370bacee69ad58eb5098090ea4b0d53972214ea7c751e35a\njj@laptop O1e31whZbQ0S7MUtglCRLg 96e39257989ce36e240b5d368e0308d38009d3d923ec398dc9cc6eba371acaa4\njj@aurora2 EtAvlrozxiL3PLYp6mvATg 1018928736c33ed06246f208cd02aa10c0a6efa5b4e34e32408d7a6c72c32e11\n', u'name': u'jj'}, 'item': {u'ovpn': u'jj@aurora PFzIfG7NjrCYxJvxd//soA 395884ddf2caa0043f97e4f98530cc3ad646d6f269802a2b79ca1ea7278ee006\njj@metatron iQogIw/KQtewT7oj9Mkivw 0881e5ceb5897e8a370bacee69ad58eb5098090ea4b0d53972214ea7c751e35a\njj@laptop O1e31whZbQ0S7MUtglCRLg 96e39257989ce36e240b5d368e0308d38009d3d923ec398dc9cc6eba371acaa4\njj@aurora2 EtAvlrozxiL3PLYp6mvATg 1018928736c33ed06246f208cd02aa10c0a6efa5b4e34e32408d7a6c72c32e11\n', u'name': u'jj'}, u'changed': False, u'name': u'jj', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, 
u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'jj', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1135, u'groups': u'sudo', u'home': u'/home/nwatkins', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1135, '_ansible_item_label': {u'ovpn': u'nwatkins@daq unHfXkUpAHOnzeptznuLLA 33b2003c30d0cc7a6b194e76be92c7b5d270c2d2a4b4a8b6e673f0f0dc1db313', u'name': u'nwatkins'}, 'item': {u'ovpn': u'nwatkins@daq unHfXkUpAHOnzeptznuLLA 33b2003c30d0cc7a6b194e76be92c7b5d270c2d2a4b4a8b6e673f0f0dc1db313', u'name': u'nwatkins'}, u'changed': False, u'name': u'nwatkins', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'nwatkins', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1136, u'groups': u'sudo', u'home': u'/home/mkidd', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1136, '_ansible_item_label': {u'ovpn': u'linuxkidd@zenbook oYp1WWV0JwpikHCWCV52Lg 9aca455b601bf3a365d31068154150ac63dd76f32cef29a55f9685dd1a88aa22', u'name': u'mkidd'}, 'item': {u'ovpn': u'linuxkidd@zenbook 
oYp1WWV0JwpikHCWCV52Lg 9aca455b601bf3a365d31068154150ac63dd76f32cef29a55f9685dd1a88aa22', u'name': u'mkidd'}, u'changed': False, u'name': u'mkidd', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'mkidd', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1137, u'groups': u'sudo', u'home': u'/home/jlopez', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1137, '_ansible_item_label': {u'ovpn': u'JCLO@oh-mbp-jcl vZhmBh/1LjLFEu+atRec6w 1f13f591373b4dc798a9b701fabf1eb99bf4aa58f87b6420d6c916716f0965af', u'name': u'jlopez'}, 'item': {u'ovpn': u'JCLO@oh-mbp-jcl vZhmBh/1LjLFEu+atRec6w 1f13f591373b4dc798a9b701fabf1eb99bf4aa58f87b6420d6c916716f0965af', u'name': u'jlopez'}, u'changed': False, u'name': u'jlopez', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, 
u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'jlopez', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1138, u'groups': u'sudo', u'home': u'/home/haomaiwang', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1138, '_ansible_item_label': {u'ovpn': u'yuyuyu101@desktop XAUS/1Geh1T2WY//5mRahw fda03bdaf79c2f39ac3ba6cd9c3a1cb2e66b842a921169f20a00481a4cd3d9cb', u'name': u'haomaiwang'}, 'item': {u'ovpn': u'yuyuyu101@desktop XAUS/1Geh1T2WY//5mRahw fda03bdaf79c2f39ac3ba6cd9c3a1cb2e66b842a921169f20a00481a4cd3d9cb', u'name': u'haomaiwang'}, u'changed': False, u'name': u'haomaiwang', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'haomaiwang', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1139, u'groups': u'sudo', u'home': u'/home/jdillaman', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1139, '_ansible_item_label': {u'ovpn': u'jdillaman@ceph1 kQ96pIpKTocwIj9H1Udb1g 6c087451af94ac18c144712bcc9329799d86f6d90376839dcd6fa41cd73e3608', u'name': 
u'jdillaman'}, 'item': {u'ovpn': u'jdillaman@ceph1 kQ96pIpKTocwIj9H1Udb1g 6c087451af94ac18c144712bcc9329799d86f6d90376839dcd6fa41cd73e3608', u'name': u'jdillaman'}, u'changed': False, u'name': u'jdillaman', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'jdillaman', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1140, u'groups': u'sudo', u'home': u'/home/kchai', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1140, '_ansible_item_label': {u'ovpn': u'kefu@gen8 HVoNrB5C8+bYxuqIfByeEQ 4dddde1890af2d6df367d3d832cc3b9b660160a1db69f0135e0d09364b2cb9b3', u'name': u'kchai'}, 'item': {u'ovpn': u'kefu@gen8 HVoNrB5C8+bYxuqIfByeEQ 4dddde1890af2d6df367d3d832cc3b9b660160a1db69f0135e0d09364b2cb9b3', u'name': u'kchai'}, u'changed': False, u'name': u'kchai', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, 
u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'kchai', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1141, u'groups': u'sudo', u'home': u'/home/vumrao', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1141, '_ansible_item_label': {u'ovpn': u'vumrao@vumrao VaMHdnIGTl6y9LIkurxfjQ 71de53c4a0f212b8f919437d7d433d24a33d7a33bc6fe5df5d047e24499994b2', u'name': u'vumrao'}, 'item': {u'ovpn': u'vumrao@vumrao VaMHdnIGTl6y9LIkurxfjQ 71de53c4a0f212b8f919437d7d433d24a33d7a33bc6fe5df5d047e24499994b2', u'name': u'vumrao'}, u'changed': False, u'name': u'vumrao', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'vumrao', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1142, u'groups': u'sudo', u'home': u'/home/dfuller', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1142, '_ansible_item_label': {u'ovpn': u'dfuller@laptop 6U0lNVidRy7Ye/5q6cGN1A 
aea3d019f68a95094c99385aff099818224455a829615cfc774587e4519398a7\ndfuller@rex001 6Z8bfQDgPXVSGMeeQHoItg 3abd41920b72683fbba7f25be88ff992fcd753119c4d2086c12daaf20798e684\n', u'name': u'dfuller'}, 'item': {u'ovpn': u'dfuller@laptop 6U0lNVidRy7Ye/5q6cGN1A aea3d019f68a95094c99385aff099818224455a829615cfc774587e4519398a7\ndfuller@rex001 6Z8bfQDgPXVSGMeeQHoItg 3abd41920b72683fbba7f25be88ff992fcd753119c4d2086c12daaf20798e684\n', u'name': u'dfuller'}, u'changed': False, u'name': u'dfuller', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'dfuller', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1143, u'groups': u'sudo', u'home': u'/home/owasserm', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1143, '_ansible_item_label': {u'ovpn': u'owasserm@owasserm.redhat.com hsdVTbVub6eRnhlO9B02rQ 7c9baf41670ff9ab612f75d4be42d0aaf0d7ecaa3c8928032b61f1be91725890\n', u'name': u'owasserm'}, 'item': {u'ovpn': u'owasserm@owasserm.redhat.com hsdVTbVub6eRnhlO9B02rQ 7c9baf41670ff9ab612f75d4be42d0aaf0d7ecaa3c8928032b61f1be91725890\n', u'name': u'owasserm'}, u'changed': False, u'name': u'owasserm', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': 
u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'owasserm', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1144, u'groups': u'sudo', u'home': u'/home/abhishekvrshny', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1144, '_ansible_item_label': {u'ovpn': u'abhishekvrshny@flipkart QTqbWHaqvXwB+yBy6CVO7A 25d026c49dc49b3a1f445d2dc0099d5ed916645b0adb8d0306269ace7a2096e9', u'name': u'abhishekvrshny'}, 'item': {u'ovpn': u'abhishekvrshny@flipkart QTqbWHaqvXwB+yBy6CVO7A 25d026c49dc49b3a1f445d2dc0099d5ed916645b0adb8d0306269ace7a2096e9', u'name': u'abhishekvrshny'}, u'changed': False, u'name': u'abhishekvrshny', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'abhishekvrshny', u'local': None, u'seuser': None, u'remove': False, 
u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1145, u'groups': u'sudo', u'home': u'/home/vasu', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1145, '_ansible_item_label': {u'ovpn': u'vasu@ceph1 +1D1qNUAk8h7OF9LF9qkrQ 963aa256fb99bc4b77e7085f57cf910a04c7f143603f81252331411eb8e37ec3\nvakulkar@vakulkar.sjc.csb O8ac1k0Dh3xkIFX8NFyIZw 471538eeb22384b58921e4f11af272c00c0a953dc7fe8d95ba057e65d141fbd2\nvasu@vasuSrv waJqYAARY/LnfuP1x/KQzQ 68915d3a1eb3dd00a562c149791cec5f43a96f5fd0b851ec855ec3f5dab496b4\n', u'name': u'vasu'}, 'item': {u'ovpn': u'vasu@ceph1 +1D1qNUAk8h7OF9LF9qkrQ 963aa256fb99bc4b77e7085f57cf910a04c7f143603f81252331411eb8e37ec3\nvakulkar@vakulkar.sjc.csb O8ac1k0Dh3xkIFX8NFyIZw 471538eeb22384b58921e4f11af272c00c0a953dc7fe8d95ba057e65d141fbd2\nvasu@vasuSrv waJqYAARY/LnfuP1x/KQzQ 68915d3a1eb3dd00a562c149791cec5f43a96f5fd0b851ec855ec3f5dab496b4\n', u'name': u'vasu'}, u'changed': False, u'name': u'vasu', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'vasu', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1146, u'groups': u'sudo', u'home': u'/home/smohan', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': 
u'item', u'group': 1146, '_ansible_item_label': {u'ovpn': u'shmohan@laptop 7wwHLZZNa4ShUV1imXDDjw 0aca19a8ff6dbeee2821dd75a329a05b8052170204b2b242ced9b1a68ca8df37', u'name': u'smohan'}, 'item': {u'ovpn': u'shmohan@laptop 7wwHLZZNa4ShUV1imXDDjw 0aca19a8ff6dbeee2821dd75a329a05b8052170204b2b242ced9b1a68ca8df37', u'name': u'smohan'}, u'changed': False, u'name': u'smohan', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'smohan', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1147, u'groups': u'sudo', u'home': u'/home/abhi', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1147, '_ansible_item_label': {u'ovpn': u'abhi@trusty SKarQTpBigBobP9sLjdLiw 868a74ed21b46f7f64255897d824f4e3eb21f8dde844bbdaa386681c942d8114', u'name': u'abhi'}, 'item': {u'ovpn': u'abhi@trusty SKarQTpBigBobP9sLjdLiw 868a74ed21b46f7f64255897d824f4e3eb21f8dde844bbdaa386681c942d8114', u'name': u'abhi'}, u'changed': False, u'name': u'abhi', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, 
u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'abhi', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1148, u'groups': u'sudo', u'home': u'/home/shehbazj', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1148, '_ansible_item_label': {u'ovpn': u'shehbazj@smrhost oEliXj/jPmfXv9AUAFKB5A 8bb3a682b4ff15de655c0fe610d350c5805d0a970471e4810b648f47e2811246', u'name': u'shehbazj'}, 'item': {u'ovpn': u'shehbazj@smrhost oEliXj/jPmfXv9AUAFKB5A 8bb3a682b4ff15de655c0fe610d350c5805d0a970471e4810b648f47e2811246', u'name': u'shehbazj'}, u'changed': False, u'name': u'shehbazj', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'shehbazj', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1149, u'groups': u'sudo', u'home': u'/home/cbodley', u'move_home': False, u'append': True, '_ansible_no_log': False, 
'ansible_loop_var': u'item', u'group': 1149, '_ansible_item_label': {u'ovpn': u'cbodley@redhat.com W4gFS4oU8PtUOrdHuBYwXQ af7dafd992687f5d85a79866838a78e5070a4934fb0a935e8094adb31ec28611', u'name': u'cbodley'}, 'item': {u'ovpn': u'cbodley@redhat.com W4gFS4oU8PtUOrdHuBYwXQ af7dafd992687f5d85a79866838a78e5070a4934fb0a935e8094adb31ec28611', u'name': u'cbodley'}, u'changed': False, u'name': u'cbodley', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'cbodley', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1150, u'groups': u'sudo', u'home': u'/home/fche', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1150, '_ansible_item_label': {u'ovpn': u'fche@redhat.com LiziWkWg2uWoEHb3Ln92dQ 9c5497793758b069adbba9284dd55944276ba4dac0bb95d9357c81b58174a3c3', u'name': u'fche'}, 'item': {u'ovpn': u'fche@redhat.com LiziWkWg2uWoEHb3Ln92dQ 9c5497793758b069adbba9284dd55944276ba4dac0bb95d9357c81b58174a3c3', u'name': u'fche'}, u'changed': False, u'name': u'fche', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, 
u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'fche', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1151, u'groups': u'sudo', u'home': u'/home/onyb', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1151, '_ansible_item_label': {u'ovpn': u'ani@orchid AgbLf9Dufuji5r+9WxM69g b73b7eacb9b628387a17cb1e0a84ff19c29d45dca8f0768e407aa599bc6996c4', u'name': u'onyb'}, 'item': {u'ovpn': u'ani@orchid AgbLf9Dufuji5r+9WxM69g b73b7eacb9b628387a17cb1e0a84ff19c29d45dca8f0768e407aa599bc6996c4', u'name': u'onyb'}, u'changed': False, u'name': u'onyb', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'onyb', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1152, u'groups': u'sudo', u'home': u'/home/mwatts', u'move_home': False, u'append': True, 
'_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1152, '_ansible_item_label': {u'ovpn': u'mdw@degu IPiZqcFT2BLuf2h3+tw58g 7af390a631ec11bddd7d1ac506d29af65e1e01e19f7dc931b4f459030cb7a195', u'name': u'mwatts'}, 'item': {u'ovpn': u'mdw@degu IPiZqcFT2BLuf2h3+tw58g 7af390a631ec11bddd7d1ac506d29af65e1e01e19f7dc931b4f459030cb7a195', u'name': u'mwatts'}, u'changed': False, u'name': u'mwatts', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'mwatts', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1153, u'groups': u'sudo', u'home': u'/home/oprypin', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1153, '_ansible_item_label': {u'ovpn': u'oprypin@bx-pc-arch 5WYD7PWKwdSQjfGo95ehWw f3c9f170d74c6c443cf0f82f0d87045e1d3a0dbcd01474f6e667ea20a00978b3', u'name': u'oprypin'}, 'item': {u'ovpn': u'oprypin@bx-pc-arch 5WYD7PWKwdSQjfGo95ehWw f3c9f170d74c6c443cf0f82f0d87045e1d3a0dbcd01474f6e667ea20a00978b3', u'name': u'oprypin'}, u'changed': False, u'name': u'oprypin', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, 
u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'oprypin', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1154, u'groups': u'sudo', u'home': u'/home/prsrivas', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1154, '_ansible_item_label': {u'ovpn': u'pritha@dhcp35-190.lab.eng.blr.redhat.com GCk3PIYzUNZ6/xrsKoq8VQ acbfc4279970b44c3d008990e0cf1bb5eb280299218441a0f25fda988bc555f6', u'name': u'prsrivas'}, 'item': {u'ovpn': u'pritha@dhcp35-190.lab.eng.blr.redhat.com GCk3PIYzUNZ6/xrsKoq8VQ acbfc4279970b44c3d008990e0cf1bb5eb280299218441a0f25fda988bc555f6', u'name': u'prsrivas'}, u'changed': False, u'name': u'prsrivas', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'prsrivas', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1155, 
u'groups': u'sudo', u'home': u'/home/pdonnell', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1155, '_ansible_item_label': {u'ovpn': u'pdonnell@redhat.com Q9j56CPPXipXScmKi57PlQ fb616603b6d27cf65bfa1da83fc0ca39399861ad1c02bfed37ce9be17cdfa8ea', u'name': u'pdonnell'}, 'item': {u'ovpn': u'pdonnell@redhat.com Q9j56CPPXipXScmKi57PlQ fb616603b6d27cf65bfa1da83fc0ca39399861ad1c02bfed37ce9be17cdfa8ea', u'name': u'pdonnell'}, u'changed': False, u'name': u'pdonnell', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'pdonnell', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1156, u'groups': u'sudo', u'home': u'/home/jlayton', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1156, '_ansible_item_label': {u'ovpn': u'jlayton@redhat aNfzMdXOhhmWRb25hwXJIg f51fee42c5268f7b8e00d57092dc522b0a07b31154ea52cf542da9cac5885868', u'name': u'jlayton'}, 'item': {u'ovpn': u'jlayton@redhat aNfzMdXOhhmWRb25hwXJIg f51fee42c5268f7b8e00d57092dc522b0a07b31154ea52cf542da9cac5885868', u'name': u'jlayton'}, u'changed': False, u'name': u'jlayton', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': 
u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'jlayton', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1157, u'groups': u'sudo', u'home': u'/home/rzarzynski', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1157, '_ansible_item_label': {u'ovpn': u'rzarzyns@redhat.com h4UkibgFG40ygfpfKTMBrg e20ca28c60144dbabc97953cd4c273c1b92cd45ebcddd0f0299679d7a5c87d7f', u'name': u'rzarzynski'}, 'item': {u'ovpn': u'rzarzyns@redhat.com h4UkibgFG40ygfpfKTMBrg e20ca28c60144dbabc97953cd4c273c1b92cd45ebcddd0f0299679d7a5c87d7f', u'name': u'rzarzynski'}, u'changed': False, u'name': u'rzarzynski', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'rzarzynski', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, 
u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1158, u'groups': u'sudo', u'home': u'/home/rdias', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1158, '_ansible_item_label': {u'ovpn': u'rdias@rdias-suse-laptop 0bh54sdB69mh95n5rWME5g 452e3338e48d04d4e816f4f1cb54d637746a7acc1ffe5e8ed4c1506c8e07a72e', u'name': u'rdias'}, 'item': {u'ovpn': u'rdias@rdias-suse-laptop 0bh54sdB69mh95n5rWME5g 452e3338e48d04d4e816f4f1cb54d637746a7acc1ffe5e8ed4c1506c8e07a72e', u'name': u'rdias'}, u'changed': False, u'name': u'rdias', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'rdias', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1159, u'groups': u'sudo', u'home': u'/home/asheplyakov', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1159, '_ansible_item_label': {u'ovpn': u'asheplyakov@asheplyakov.srt.mirantis.net wFW0ZgT4cNhKRAGXiUtevQ 1b11f0702b2db42a42aae6579737ece2caad3b80a8186b971686575cb76b3051', u'name': u'asheplyakov'}, 'item': {u'ovpn': u'asheplyakov@asheplyakov.srt.mirantis.net wFW0ZgT4cNhKRAGXiUtevQ 1b11f0702b2db42a42aae6579737ece2caad3b80a8186b971686575cb76b3051', u'name': u'asheplyakov'}, u'changed': False, u'name': 
u'asheplyakov', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'asheplyakov', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1160, u'groups': u'sudo', u'home': u'/home/vshankar', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1160, '_ansible_item_label': {u'ovpn': u'vshankar@h3ckers-pride r39frQXXj1GUJZwq1GS7fw 1170ef4c918c5ff15334d10f666441b0dfe0bb869a5e15218fdfad2e8cc4e953', u'name': u'vshankar'}, 'item': {u'ovpn': u'vshankar@h3ckers-pride r39frQXXj1GUJZwq1GS7fw 1170ef4c918c5ff15334d10f666441b0dfe0bb869a5e15218fdfad2e8cc4e953', u'name': u'vshankar'}, u'changed': False, u'name': u'vshankar', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], 
u'move_home': False, u'password': None, u'name': u'vshankar', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1161, u'groups': u'sudo', u'home': u'/home/akupczyk', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1161, '_ansible_item_label': {u'ovpn': u'adam@TP50 C0YuBT9bYaNhdDmjbF56xg 5d298b33b9dbaef364b037561aa5c5de374405bb8afead5280db5b212506ea58', u'name': u'akupczyk'}, 'item': {u'ovpn': u'adam@TP50 C0YuBT9bYaNhdDmjbF56xg 5d298b33b9dbaef364b037561aa5c5de374405bb8afead5280db5b212506ea58', u'name': u'akupczyk'}, u'changed': False, u'name': u'akupczyk', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'akupczyk', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1162, u'groups': u'sudo', u'home': u'/home/nojha', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1162, '_ansible_item_label': {u'ovpn': u'nojha@localhost YQTw/h6ZMgdJn7EPBmbEnw 253574eae62759c4c5d3bc4bf949c59948a0488e4dfe4c91ee754a3b5494847e', u'name': u'nojha'}, 'item': {u'ovpn': u'nojha@localhost YQTw/h6ZMgdJn7EPBmbEnw 253574eae62759c4c5d3bc4bf949c59948a0488e4dfe4c91ee754a3b5494847e', 
u'name': u'nojha'}, u'changed': False, u'name': u'nojha', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'nojha', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1163, u'groups': u'sudo', u'home': u'/home/ifed01', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1163, '_ansible_item_label': {u'ovpn': u'ifed01@LocalSUSE1 6g6hX1bzTGBTCnDevAn0+w f18c9354f6de3f371c3b51521b62375e474802ac21adb3d71e09d8d5bf9d0c43', u'name': u'ifed01'}, 'item': {u'ovpn': u'ifed01@LocalSUSE1 6g6hX1bzTGBTCnDevAn0+w f18c9354f6de3f371c3b51521b62375e474802ac21adb3d71e09d8d5bf9d0c43', u'name': u'ifed01'}, u'changed': False, u'name': u'ifed01', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, 
u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'ifed01', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1164, u'groups': u'sudo', u'home': u'/home/myoungwon', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1164, '_ansible_item_label': {u'ovpn': u'myoungwon@omw-dev 78twXwYRU+MeH+yZ9Rw9Zg 2dd66fa122e5cf3b8bfa835cefac7c6e4e66d70643a3819813104c2057e597e4', u'name': u'myoungwon'}, 'item': {u'ovpn': u'myoungwon@omw-dev 78twXwYRU+MeH+yZ9Rw9Zg 2dd66fa122e5cf3b8bfa835cefac7c6e4e66d70643a3819813104c2057e597e4', u'name': u'myoungwon'}, u'changed': False, u'name': u'myoungwon', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'myoungwon', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1165, u'groups': u'sudo', u'home': u'/home/jwilliamson', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1165, '_ansible_item_label': {u'ovpn': u'jwilliamson@glaurung +d028NM7xCxkVdxxO+b1Lw fece65125073fdc287af724ee4724ad84d2864e758d50dcb23c07b05c3595fe0', u'name': u'jwilliamson'}, 'item': {u'ovpn': u'jwilliamson@glaurung +d028NM7xCxkVdxxO+b1Lw 
fece65125073fdc287af724ee4724ad84d2864e758d50dcb23c07b05c3595fe0', u'name': u'jwilliamson'}, u'changed': False, u'name': u'jwilliamson', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'jwilliamson', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1166, u'groups': u'sudo', u'home': u'/home/gabrioux', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1166, '_ansible_item_label': {u'ovpn': u'gabrioux@elisheba 80kx1htp39RsFrlGONcp+A a95579ef6f90694cd6fd390302adf8532237a8ea65bd5544d9b561654d712ba2', u'name': u'gabrioux'}, 'item': {u'ovpn': u'gabrioux@elisheba 80kx1htp39RsFrlGONcp+A a95579ef6f90694cd6fd390302adf8532237a8ea65bd5544d9b561654d712ba2', u'name': u'gabrioux'}, u'changed': False, u'name': u'gabrioux', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': 
None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'gabrioux', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1167, u'groups': u'sudo', u'home': u'/home/leseb', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1167, '_ansible_item_label': {u'ovpn': u'leseb@mbp cCx1v5/FfaQ/IQHujqtG9Q 6121d11f9abfa6b1b36330eafaa2196249a9c92f989be25c9fac1558292c920f', u'name': u'leseb'}, 'item': {u'ovpn': u'leseb@mbp cCx1v5/FfaQ/IQHujqtG9Q 6121d11f9abfa6b1b36330eafaa2196249a9c92f989be25c9fac1558292c920f', u'name': u'leseb'}, u'changed': False, u'name': u'leseb', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'leseb', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1168, u'groups': u'sudo', u'home': u'/home/hchen', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1168, '_ansible_item_label': {u'ovpn': u'hchen@host12 hUr3k0rugStZMjvIxIvCOg 9d57e14d49901f18b24ee4076ae7e6a2f9eb6fd9fbce786660c448486c966fca', u'name': u'hchen'}, 'item': {u'ovpn': u'hchen@host12 
hUr3k0rugStZMjvIxIvCOg 9d57e14d49901f18b24ee4076ae7e6a2f9eb6fd9fbce786660c448486c966fca', u'name': u'hchen'}, u'changed': False, u'name': u'hchen', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'hchen', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1169, u'groups': u'sudo', u'home': u'/home/jcollin', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1169, '_ansible_item_label': {u'ovpn': u'jcollin@earth +H4Hk4WcNuqdQj7ch/Nulw 8426545e6457c9e1e8adca2af5ddf836fbcfb433cdc5359fd135afdf4e0f7d2a\njcollin@stratocaster jbjV3FsrsTJwyKUA3Y8VVQ 0439745f795fef1399636bd550040d45445d1607b471284c5c9b9dbccc86a987\n', u'name': u'jcollin'}, 'item': {u'ovpn': u'jcollin@earth +H4Hk4WcNuqdQj7ch/Nulw 8426545e6457c9e1e8adca2af5ddf836fbcfb433cdc5359fd135afdf4e0f7d2a\njcollin@stratocaster jbjV3FsrsTJwyKUA3Y8VVQ 0439745f795fef1399636bd550040d45445d1607b471284c5c9b9dbccc86a987\n', u'name': u'jcollin'}, u'changed': False, u'name': u'jcollin', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, 
u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'jcollin', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1170, u'groups': u'sudo', u'home': u'/home/xxg', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1170, '_ansible_item_label': {u'ovpn': u'xxg@zte Y2d/Ov201XMivDNwo4nUoQ 5e5da8d579793601699af628300430c1e5dd469c8bcff7c3ee11d23ec004bdcc', u'name': u'xxg'}, 'item': {u'ovpn': u'xxg@zte Y2d/Ov201XMivDNwo4nUoQ 5e5da8d579793601699af628300430c1e5dd469c8bcff7c3ee11d23ec004bdcc', u'name': u'xxg'}, u'changed': False, u'name': u'xxg', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'xxg', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1171, u'groups': u'sudo', u'home': u'/home/pcuzner', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', 
u'group': 1171, '_ansible_item_label': {u'ovpn': u'pcuzner@rh460p oK28wU5DSabvEL4VjDRhEg a449ed81d7e2970f418263fb3ce10dd711d03925a0990ddf298f826aae1caa53', u'name': u'pcuzner'}, 'item': {u'ovpn': u'pcuzner@rh460p oK28wU5DSabvEL4VjDRhEg a449ed81d7e2970f418263fb3ce10dd711d03925a0990ddf298f826aae1caa53', u'name': u'pcuzner'}, u'changed': False, u'name': u'pcuzner', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'pcuzner', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1172, u'groups': u'sudo', u'home': u'/home/liupan', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1172, '_ansible_item_label': {u'ovpn': u'liupan@ceph-dev-2 9+Is4mIZgNkYyJLwHvSNOA 5a8fafc187d52041daf4365125692d4619fc557b75560913130c0596f83bbb77', u'name': u'liupan'}, 'item': {u'ovpn': u'liupan@ceph-dev-2 9+Is4mIZgNkYyJLwHvSNOA 5a8fafc187d52041daf4365125692d4619fc557b75560913130c0596f83bbb77', u'name': u'liupan'}, u'changed': False, u'name': u'liupan', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, 
u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'liupan', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1173, u'groups': u'sudo', u'home': u'/home/mkogan', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1173, '_ansible_item_label': {u'ovpn': u'mkogan@mkP50 f/LENRJbsyepmvZA23F7Fg b908f1c0237a7ee56b73dc42f2df79b49ca83d6f4573f5229e7cfe6b4ad7b6a2', u'name': u'mkogan'}, 'item': {u'ovpn': u'mkogan@mkP50 f/LENRJbsyepmvZA23F7Fg b908f1c0237a7ee56b73dc42f2df79b49ca83d6f4573f5229e7cfe6b4ad7b6a2', u'name': u'mkogan'}, u'changed': False, u'name': u'mkogan', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'mkogan', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1174, u'groups': u'sudo', u'home': u'/home/amarangone', u'move_home': False, u'append': True, '_ansible_no_log': False, 
'ansible_loop_var': u'item', u'group': 1174, '_ansible_item_label': {u'ovpn': u'amarangone@macair.local 9Fslt44BqONCYNhf+uhcnQ 12d46ec6815378a12abc5df00e65235ccbc06ffb0fe5d1db75540a4805cb58b6', u'name': u'amarangone'}, 'item': {u'ovpn': u'amarangone@macair.local 9Fslt44BqONCYNhf+uhcnQ 12d46ec6815378a12abc5df00e65235ccbc06ffb0fe5d1db75540a4805cb58b6', u'name': u'amarangone'}, u'changed': False, u'name': u'amarangone', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'amarangone', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1141, u'groups': u'sudo', u'home': u'/home/vumrao', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1141, '_ansible_item_label': {u'ovpn': u'vumrao@redhat TMNqzMvbJS8Va/8nT9QUQw ab386c2bd7c6796d5413e4d841a16dda2504cca6d95df831a652a30d2e5655ed', u'name': u'vumrao'}, 'item': {u'ovpn': u'vumrao@redhat TMNqzMvbJS8Va/8nT9QUQw ab386c2bd7c6796d5413e4d841a16dda2504cca6d95df831a652a30d2e5655ed', u'name': u'vumrao'}, u'changed': False, u'name': u'vumrao', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': 
None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'vumrao', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1175, u'groups': u'sudo', u'home': u'/home/kmroz', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1175, '_ansible_item_label': {u'ovpn': u'kmroz@suse /JbrIs2mKL5exdmcDnhRgg db4d19ab99c7174429d5ae7b6ca3cf4e04e9bf7810e1826d90f4627643628d57', u'name': u'kmroz'}, 'item': {u'ovpn': u'kmroz@suse /JbrIs2mKL5exdmcDnhRgg db4d19ab99c7174429d5ae7b6ca3cf4e04e9bf7810e1826d90f4627643628d57', u'name': u'kmroz'}, u'changed': False, u'name': u'kmroz', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'kmroz', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1176, u'groups': u'sudo', u'home': u'/home/henrix', u'move_home': False, u'append': 
True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1176, '_ansible_item_label': {u'ovpn': u'henrix@hermes iPPDBfnLzP5Pe5FuTcJBmw b26aefb8a61451066f984e074f708ea9ca6b2c5d7cca35996c08b0b2bb2c2736', u'name': u'henrix'}, 'item': {u'ovpn': u'henrix@hermes iPPDBfnLzP5Pe5FuTcJBmw b26aefb8a61451066f984e074f708ea9ca6b2c5d7cca35996c08b0b2bb2c2736', u'name': u'henrix'}, u'changed': False, u'name': u'henrix', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'henrix', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1177, u'groups': u'sudo', u'home': u'/home/pbs1108', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1177, '_ansible_item_label': {u'ovpn': u'pbs1108@bspark_cli0 nSCINdeTLTLEO5JP/GIwRQ 76372ad6f7ad731556ff13605c3729eacaf59dcf7f9ac82dd9a8501bd95d3b26', u'name': u'pbs1108'}, 'item': {u'ovpn': u'pbs1108@bspark_cli0 nSCINdeTLTLEO5JP/GIwRQ 76372ad6f7ad731556ff13605c3729eacaf59dcf7f9ac82dd9a8501bd95d3b26', u'name': u'pbs1108'}, u'changed': False, u'name': u'pbs1108', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, 
u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'pbs1108', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1178, u'groups': u'sudo', u'home': u'/home/clacroix', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1178, '_ansible_item_label': {u'ovpn': u'clacroix@redhat.com ZGY9sgvjT0BuJRi9zrULzg c3311aba4025aa42cd78c999dcee4e2c16415a3ac44ac8c95b77838459ef3315', u'name': u'clacroix'}, 'item': {u'ovpn': u'clacroix@redhat.com ZGY9sgvjT0BuJRi9zrULzg c3311aba4025aa42cd78c999dcee4e2c16415a3ac44ac8c95b77838459ef3315', u'name': u'clacroix'}, u'changed': False, u'name': u'clacroix', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'clacroix', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1179, u'groups': u'sudo', 
u'home': u'/home/epuertat', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1179, '_ansible_item_label': {u'ovpn': u'epuertat@private BnUoirwxGRWXtLxulJU5xA da2cfc4995bed82ef85db3633edad0a7eb2c32ba559a48259b10be94a8fdf006', u'name': u'epuertat'}, 'item': {u'ovpn': u'epuertat@private BnUoirwxGRWXtLxulJU5xA da2cfc4995bed82ef85db3633edad0a7eb2c32ba559a48259b10be94a8fdf006', u'name': u'epuertat'}, u'changed': False, u'name': u'epuertat', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'epuertat', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1180, u'groups': u'sudo', u'home': u'/home/tdehler', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1180, '_ansible_item_label': {u'ovpn': u'tdehler@think 7e0WC4Vh86XWZviZ9WBMgw 4dc8477db6e4f40312e6b2b9db293dc009e49e518015ace20431c0fb69025461', u'name': u'tdehler'}, 'item': {u'ovpn': u'tdehler@think 7e0WC4Vh86XWZviZ9WBMgw 4dc8477db6e4f40312e6b2b9db293dc009e49e518015ace20431c0fb69025461', u'name': u'tdehler'}, u'changed': False, u'name': u'tdehler', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, 
u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'tdehler', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1181, u'groups': u'sudo', u'home': u'/home/laura', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1181, '_ansible_item_label': {u'ovpn': u'laura@flab 2DGHIAxD39eNKOPkn3M67w ab1ae304abed3824a68b5c0ecf4f92fca76a4f8b9fcbcc0ca43388a85b7f9305', u'name': u'laura'}, 'item': {u'ovpn': u'laura@flab 2DGHIAxD39eNKOPkn3M67w ab1ae304abed3824a68b5c0ecf4f92fca76a4f8b9fcbcc0ca43388a85b7f9305', u'name': u'laura'}, u'changed': False, u'name': u'laura', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'laura', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1182, 
u'groups': u'sudo', u'home': u'/home/adamyanova', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1182, '_ansible_item_label': {u'ovpn': u'adamyanova@centos k2nfk8hmGW77nys3R/JkOg 7695b95c2f157d60622e0b0b7ab06fa2cb18661a190d839f7ea587bc44aa0e3c\nadamyanova@ubuntu FmlKgjVzPUxfNDnHeU9vLQ ef7d5524863dfa0787fc5e249873c1a5ea58e7fd5aee27e1d1d33d6f87388a2d\n', u'name': u'adamyanova'}, 'item': {u'ovpn': u'adamyanova@centos k2nfk8hmGW77nys3R/JkOg 7695b95c2f157d60622e0b0b7ab06fa2cb18661a190d839f7ea587bc44aa0e3c\nadamyanova@ubuntu FmlKgjVzPUxfNDnHeU9vLQ ef7d5524863dfa0787fc5e249873c1a5ea58e7fd5aee27e1d1d33d6f87388a2d\n', u'name': u'adamyanova'}, u'changed': False, u'name': u'adamyanova', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'adamyanova', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1183, u'groups': u'sudo', u'home': u'/home/yaarit', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1183, '_ansible_item_label': {u'ovpn': u'yaarit@centos TBuWjkAsj1GB/V9eWc/R1Q 7bd86a857dec48dc25850ecf0c00486d9a89c2ff5f88b2f28c3e36bdeb139fce', u'name': u'yaarit'}, 'item': {u'ovpn': u'yaarit@centos TBuWjkAsj1GB/V9eWc/R1Q 
7bd86a857dec48dc25850ecf0c00486d9a89c2ff5f88b2f28c3e36bdeb139fce', u'name': u'yaarit'}, u'changed': False, u'name': u'yaarit', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'yaarit', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1184, u'groups': u'sudo', u'home': u'/home/rpavani1998', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1184, '_ansible_item_label': {u'ovpn': u'admin1@rajula 31GbDo9d1YnW5BQ8u3utvw a2da13cb840f848846023c85442ba7bcce97dc186056a0ecc036a220d7eb7fc3', u'name': u'rpavani1998'}, 'item': {u'ovpn': u'admin1@rajula 31GbDo9d1YnW5BQ8u3utvw a2da13cb840f848846023c85442ba7bcce97dc186056a0ecc036a220d7eb7fc3', u'name': u'rpavani1998'}, u'changed': False, u'name': u'rpavani1998', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, 
u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'rpavani1998', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1185, u'groups': u'sudo', u'home': u'/home/rishabh', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1185, '_ansible_item_label': {u'ovpn': u'rishabh@p50 zdJ4XsBdVugwMrqJOSBi3Q c78bb28ba5cf2bf9c8edb80fe57814d60cd2ffdbd874cf9a271e5adf171bb0c4', u'name': u'rishabh'}, 'item': {u'ovpn': u'rishabh@p50 zdJ4XsBdVugwMrqJOSBi3Q c78bb28ba5cf2bf9c8edb80fe57814d60cd2ffdbd874cf9a271e5adf171bb0c4', u'name': u'rishabh'}, u'changed': False, u'name': u'rishabh', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'rishabh', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1186, u'groups': u'sudo', u'home': u'/home/skrah', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1186, '_ansible_item_label': {u'ovpn': u'skrah@thinkpad IboPmnUdsLqqRXlHQ1RT5w bf85db9e916dceaf84a1e6ea33c59eb4adb424cb4e727ce0a903a3498b799ed2', u'name': u'skrah'}, 'item': {u'ovpn': 
u'skrah@thinkpad IboPmnUdsLqqRXlHQ1RT5w bf85db9e916dceaf84a1e6ea33c59eb4adb424cb4e727ce0a903a3498b799ed2', u'name': u'skrah'}, u'changed': False, u'name': u'skrah', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'skrah', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1187, u'groups': u'sudo', u'home': u'/home/smanjara', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1187, '_ansible_item_label': {u'ovpn': u'smanjara@fedora 5oorMoVYD3sT0nmOTBDh9w 83be007f68694c9463ef46e4ce223221d639d78f11d5b68449598de77e8e0ce8', u'name': u'smanjara'}, 'item': {u'ovpn': u'smanjara@fedora 5oorMoVYD3sT0nmOTBDh9w 83be007f68694c9463ef46e4ce223221d639d78f11d5b68449598de77e8e0ce8', u'name': u'smanjara'}, u'changed': False, u'name': u'smanjara', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', 
u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'smanjara', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1188, u'groups': u'sudo', u'home': u'/home/bengland', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1188, '_ansible_item_label': {u'ovpn': u'bengland@bene-laptop ud1gWgoNggJTS7LQtPTZTA d3ebd084ce385cb450ce2f83c02dc66a1637dedbc7a8b191dab68acfc935af41', u'name': u'bengland'}, 'item': {u'ovpn': u'bengland@bene-laptop ud1gWgoNggJTS7LQtPTZTA d3ebd084ce385cb450ce2f83c02dc66a1637dedbc7a8b191dab68acfc935af41', u'name': u'bengland'}, u'changed': False, u'name': u'bengland', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'bengland', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1189, u'groups': u'sudo', u'home': u'/home/pnawracay', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1189, '_ansible_item_label': {u'ovpn': u'pnawracay@flab LIurEXLa7xbr+jzf2wRJVg 
04062ba385602b385fd17f14de3a0cad83c685b8078fd2f18cc9ad77a4f4762d', u'name': u'pnawracay'}, 'item': {u'ovpn': u'pnawracay@flab LIurEXLa7xbr+jzf2wRJVg 04062ba385602b385fd17f14de3a0cad83c685b8078fd2f18cc9ad77a4f4762d', u'name': u'pnawracay'}, u'changed': False, u'name': u'pnawracay', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'pnawracay', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1190, u'groups': u'sudo', u'home': u'/home/alfonsomthd', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1190, '_ansible_item_label': {u'ovpn': u'alfonsomthd@localhost eUvx+7UEx5IYGmS0lNIscQ 2f1bfd4874280b9f525a46e93e767504df80a9b09a83a2fea387dcd6e34bc0f8', u'name': u'alfonsomthd'}, 'item': {u'ovpn': u'alfonsomthd@localhost eUvx+7UEx5IYGmS0lNIscQ 2f1bfd4874280b9f525a46e93e767504df80a9b09a83a2fea387dcd6e34bc0f8', u'name': u'alfonsomthd'}, u'changed': False, u'name': u'alfonsomthd', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': 
u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'alfonsomthd', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1191, u'groups': u'sudo', u'home': u'/home/oliveiradan', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1191, '_ansible_item_label': {u'ovpn': u'oliveiradan@opensuse-dev twDqMYwYsdYmbgyCpUnElw ec3ee80ddc747c3ca5e1455a122279f8e1e642c5c09aa9c2ca7fec142f55089e', u'name': u'oliveiradan'}, 'item': {u'ovpn': u'oliveiradan@opensuse-dev twDqMYwYsdYmbgyCpUnElw ec3ee80ddc747c3ca5e1455a122279f8e1e642c5c09aa9c2ca7fec142f55089e', u'name': u'oliveiradan'}, u'changed': False, u'name': u'oliveiradan', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'oliveiradan', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1192, u'groups': u'sudo', u'home': u'/home/swagner', u'move_home': False, u'append': True, '_ansible_no_log': False, 
'ansible_loop_var': u'item', u'group': 1192, '_ansible_item_label': {u'ovpn': u'swagner@ubuntu 64V1h0Se0FmBQNH7KLibbQ ad7c91e9e2f7f3999492d5e41fbbc993327d37929bd09606227367d75e5556ba', u'name': u'swagner'}, 'item': {u'ovpn': u'swagner@ubuntu 64V1h0Se0FmBQNH7KLibbQ ad7c91e9e2f7f3999492d5e41fbbc993327d37929bd09606227367d75e5556ba', u'name': u'swagner'}, u'changed': False, u'name': u'swagner', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'swagner', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1193, u'groups': u'sudo', u'home': u'/home/yuvalif', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1193, '_ansible_item_label': {u'ovpn': u'ylifshit@localhost dyc2NU2pMz8NF8/uR2kMxA 3a6f1f9e55b5116f74d01ffbabdc339054088d257a16cf9fafcfe05b27fa678e', u'name': u'yuvalif'}, 'item': {u'ovpn': u'ylifshit@localhost dyc2NU2pMz8NF8/uR2kMxA 3a6f1f9e55b5116f74d01ffbabdc339054088d257a16cf9fafcfe05b27fa678e', u'name': u'yuvalif'}, u'changed': False, u'name': u'yuvalif', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, 
u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'yuvalif', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1194, u'groups': u'sudo', u'home': u'/home/kkeithle', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1194, '_ansible_item_label': {u'ovpn': u'kkeithle@kkeithle.usersys.redhat.com FPnVevv1sp5hlWoJeDCe/g e5a1fa7ccf678b91ed570983d5420c98f109d507442c8e4dcd50803e0d71c852', u'name': u'kkeithle'}, 'item': {u'ovpn': u'kkeithle@kkeithle.usersys.redhat.com FPnVevv1sp5hlWoJeDCe/g e5a1fa7ccf678b91ed570983d5420c98f109d507442c8e4dcd50803e0d71c852', u'name': u'kkeithle'}, u'changed': False, u'name': u'kkeithle', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'kkeithle', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1195, u'groups': 
u'sudo', u'home': u'/home/emmericp', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1195, '_ansible_item_label': {u'ovpn': u'paul.emmerich@croit.io RN4hOorvA7irUg/3ViM9KQ 3bd06194186d2624cadf255fa1c38ddf7dded0a6d83dc6001cd55fcc0a899130', u'name': u'emmericp'}, 'item': {u'ovpn': u'paul.emmerich@croit.io RN4hOorvA7irUg/3ViM9KQ 3bd06194186d2624cadf255fa1c38ddf7dded0a6d83dc6001cd55fcc0a899130', u'name': u'emmericp'}, u'changed': False, u'name': u'emmericp', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi178', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'emmericp', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1196, u'groups': u'sudo', u'home': u'/home/mchangir', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1196, '_ansible_item_label': {u'ovpn': u'mchangir@indraprastha junlGKNU/xzt4OIaGHKBLA e8c67fd935fca490af3fe17453ccae3176268c4bfe1db4a2a879a2ab7ea6bfa5', u'name': u'mchangir'}, 'item': {u'ovpn': u'mchangir@indraprastha junlGKNU/xzt4OIaGHKBLA e8c67fd935fca490af3fe17453ccae3176268c4bfe1db4a2a879a2ab7ea6bfa5', u'name': u'mchangir'}, u'changed': False, u'name': u'mchangir', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, 
u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi155.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'mchangir', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1197, u'groups': u'sudo', u'home': u'/home/sidharthanup', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1197, '_ansible_item_label': {u'ovpn': u'sidharthanup@strawberryfields IZq91vqA+RG1Rtn3JFZb6Q a2873481cac9b8b4a0bd8bebe0248b3dccb370dd18b56a4dae713ca1fb0c4286', u'name': u'sidharthanup'}, 'item': {u'ovpn': u'sidharthanup@strawberryfields IZq91vqA+RG1Rtn3JFZb6Q a2873481cac9b8b4a0bd8bebe0248b3dccb370dd18b56a4dae713ca1fb0c4286', u'name': u'sidharthanup'}, u'changed': False, u'name': u'sidharthanup', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi155.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': 
u'sidharthanup', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1198, u'groups': u'sudo', u'home': u'/home/varsha', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1198, '_ansible_item_label': {u'ovpn': u'varsha@local q7QjtBqj3duVVKubHLpzjw a358a0d6cd132a451a910abcbcf3070e4144c92638e0487622ae040a3410c07f', u'name': u'varsha'}, 'item': {u'ovpn': u'varsha@local q7QjtBqj3duVVKubHLpzjw a358a0d6cd132a451a910abcbcf3070e4144c92638e0487622ae040a3410c07f', u'name': u'varsha'}, u'changed': False, u'name': u'varsha', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi155.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'varsha', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1199, u'groups': u'sudo', u'home': u'/home/sjust', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1199, '_ansible_item_label': {u'ovpn': u'sjust@pondermatic sCd6606QpID1PnHn0AxFag 46da7d4c77cb1238f83a34f99e774707110f997d88f3a0fd240aac9b7b7bbc85\nsjust@rex004 YS0in6YtQJHx5aUo7ZHi8Q bdd5977d05171a365539b19fae283ec2e3c7389516664692b9bbbaf98c7b61f4\nsjust@office w19UilyC/xu7uCzv0DnWRg 
ab20efc114b769bf4c2cf313eb30db09c2e2f8234992f120cfc3d1b8b347ed3c\nsam@deepthought 44sCi+GEfY0zjKo5M/4FiQ ed1eedd14ca68116a2000477fa078f8f736d0a15640723c32204bb30f14cb888\n', u'name': u'sjust'}, 'item': {u'ovpn': u'sjust@pondermatic sCd6606QpID1PnHn0AxFag 46da7d4c77cb1238f83a34f99e774707110f997d88f3a0fd240aac9b7b7bbc85\nsjust@rex004 YS0in6YtQJHx5aUo7ZHi8Q bdd5977d05171a365539b19fae283ec2e3c7389516664692b9bbbaf98c7b61f4\nsjust@office w19UilyC/xu7uCzv0DnWRg ab20efc114b769bf4c2cf313eb30db09c2e2f8234992f120cfc3d1b8b347ed3c\nsam@deepthought 44sCi+GEfY0zjKo5M/4FiQ ed1eedd14ca68116a2000477fa078f8f736d0a15640723c32204bb30f14cb888\n', u'name': u'sjust'}, u'changed': False, u'name': u'sjust', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi155.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'sjust', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1200, u'groups': u'sudo', u'home': u'/home/ideepika', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1200, '_ansible_item_label': {u'ovpn': u'deepika@asus hHsL1xugca0LzY52gKqfqQ 312e0f2680f72d9459c707fcd0ccfb777617f00017f0511839e9b7e3167d590f', u'name': u'ideepika'}, 'item': {u'ovpn': u'deepika@asus hHsL1xugca0LzY52gKqfqQ 312e0f2680f72d9459c707fcd0ccfb777617f00017f0511839e9b7e3167d590f', u'name': 
[... remaining per-user results for smithi155 elided: ideepika, gsalomon, soumyakoduri, kyr, sseshasa, and rfriedma were already present (changed=False); lmb and ksirivad were created (changed=True) ...]

Traceback (most recent call last):
  File "/home/teuthworker/src/git.ceph.com_git_ceph-cm-ansible_master/callback_plugins/failure_log.py", line 44, in log_failure
    log.error(yaml.safe_dump(failure))
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 309, in safe_dump
    return dump_all([data], stream, Dumper=SafeDumper, **kwds)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 281, in dump_all
    dumper.represent(data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 29, in represent
    node = self.represent_data(data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data
    node = self.yaml_representers[data_types[0]](self, data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 227, in represent_dict
    return self.represent_mapping(u'tag:yaml.org,2002:map', data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 125, in represent_mapping
    node_value = self.represent_data(item_value)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data
    node = self.yaml_representers[data_types[0]](self, data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 227, in represent_dict
    return self.represent_mapping(u'tag:yaml.org,2002:map', data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 125, in represent_mapping
    node_value = self.represent_data(item_value)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data
    node = self.yaml_representers[data_types[0]](self, data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 219, in represent_list
    return self.represent_sequence(u'tag:yaml.org,2002:seq', data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 102, in represent_sequence
    node_item = self.represent_data(item)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data
    node = self.yaml_representers[data_types[0]](self, data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 227, in represent_dict
    return self.represent_mapping(u'tag:yaml.org,2002:map', data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 125, in represent_mapping
    node_value = self.represent_data(item_value)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 68, in represent_data
    node = self.yaml_representers[None](self, data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 251, in represent_undefined
    raise RepresenterError("cannot represent an object", data)
RepresenterError: ('cannot represent an object', u'')

Failure object was: {'smithi168.front.sepia.ceph.com': {'msg': u'All items completed', 'changed': True, 'results': [...]}}
[... per-user results for smithi168 elided: andrewschoen, zack, dgalloway, trociny, smithfarm, mbenjamin, aemerson, sbillah, amaredia, tserlin, dis, and gregf were already present (changed=False); akraitma was created (changed=True); the kdreyer, adeza, dmick, and sage items failed as unreachable with: SSH Error: data could not be sent to remote host "smithi168.front.sepia.ceph.com". Make sure this host can be reached over ssh ...]
42113e1156382fde866d691f30584f6b30c3dfc21317ae89b4267efb177d982c\n', u'name': u'joshd'}, u'changed': False, u'name': u'joshd', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'joshd', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1118, u'groups': u'sudo', u'home': u'/home/davidz', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1118, '_ansible_item_label': {u'ovpn': u'dzafman@ubuntu-laptop-vm NY9y9tqLY1beEcXDwMavsQ c869d42fae1890574577a8014d97d1247f1a13cb6337037d2714f1d236fc65d2\ndzafman@ubuntu16 2a0rAy5QmNFSEcATNz2h9A b7c11fbb0911fc4ac0216a1a8eac8359a9e8f43d69126db6b45cbeabd358c2b4\ndzafman@ubuntu-1804 PN1pkeGHGloB0K+IZrfB0g f1c01b447b9ec3fc048c32f606a33fb488ff621e11aa305ac979501030202658\n', u'name': u'davidz'}, 'item': {u'ovpn': u'dzafman@ubuntu-laptop-vm NY9y9tqLY1beEcXDwMavsQ c869d42fae1890574577a8014d97d1247f1a13cb6337037d2714f1d236fc65d2\ndzafman@ubuntu16 2a0rAy5QmNFSEcATNz2h9A b7c11fbb0911fc4ac0216a1a8eac8359a9e8f43d69126db6b45cbeabd358c2b4\ndzafman@ubuntu-1804 PN1pkeGHGloB0K+IZrfB0g f1c01b447b9ec3fc048c32f606a33fb488ff621e11aa305ac979501030202658\n', u'name': u'davidz'}, u'changed': False, u'name': u'davidz', 'failed': False, u'state': u'present', u'invocation': 
{u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'davidz', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1119, u'groups': u'sudo', u'home': u'/home/gmeno', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1119, '_ansible_item_label': {u'ovpn': u'gmeno@gmeno-virtual-machine FKFu8B2pMqotpmEVAO1few 8229574e499eaf767a408909f5afdf2e2a0bb8f3e61b18d63a651f7102c68dbc', u'name': u'gmeno'}, 'item': {u'ovpn': u'gmeno@gmeno-virtual-machine FKFu8B2pMqotpmEVAO1few 8229574e499eaf767a408909f5afdf2e2a0bb8f3e61b18d63a651f7102c68dbc', u'name': u'gmeno'}, u'changed': False, u'name': u'gmeno', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': 
None, u'name': u'gmeno', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1120, u'groups': u'sudo', u'home': u'/home/ivancich', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1120, '_ansible_item_label': {u'ovpn': u'ivancich@ann.arbor Kt2vxZ3Ge609mHfjx0W4Cw aaa55a9e2b5276b62a21cd3c401b365c5c2693e39efccb2f9edefafefa1dc8b1', u'name': u'ivancich'}, 'item': {u'ovpn': u'ivancich@ann.arbor Kt2vxZ3Ge609mHfjx0W4Cw aaa55a9e2b5276b62a21cd3c401b365c5c2693e39efccb2f9edefafefa1dc8b1', u'name': u'ivancich'}, u'changed': False, u'name': u'ivancich', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'ivancich', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1121, u'groups': u'sudo', u'home': u'/home/wusui', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1121, '_ansible_item_label': {u'ovpn': u'wusui@ubuntu /nElLTGqxiOr/hD6eziF5A 2e244e2b62fa42dadf3a3a1dbcc29475410cee6550b0c6b3603c135853937551\nwusui@thinkpad tu2DxDcllIdwb5ewldgT0g 1590a7d9f1377b0094e9ba4277e7bcbe6374791f0b3d3df93026345c058c93f5\n', u'name': u'wusui'}, 'item': {u'ovpn': 
u'wusui@ubuntu /nElLTGqxiOr/hD6eziF5A 2e244e2b62fa42dadf3a3a1dbcc29475410cee6550b0c6b3603c135853937551\nwusui@thinkpad tu2DxDcllIdwb5ewldgT0g 1590a7d9f1377b0094e9ba4277e7bcbe6374791f0b3d3df93026345c058c93f5\n', u'name': u'wusui'}, u'changed': False, u'name': u'wusui', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'wusui', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1122, u'groups': u'sudo', u'home': u'/home/zyan', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1122, '_ansible_item_label': {u'ovpn': u'zyan@redhat OP16IkReCatMfA4Mf3pkdQ b0262be71ef008e2f7d585e34431dc2757c1e22ac1aa844863be533bf873d304', u'name': u'zyan'}, 'item': {u'ovpn': u'zyan@redhat OP16IkReCatMfA4Mf3pkdQ b0262be71ef008e2f7d585e34431dc2757c1e22ac1aa844863be533bf873d304', u'name': u'zyan'}, u'changed': False, u'name': u'zyan', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on 
smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'zyan', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1123, u'groups': u'sudo', u'home': u'/home/yuriw', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1123, '_ansible_item_label': {u'ovpn': u'yuriweinstein@Yuris-MacBook-Pro.local wKA8mCcsdsk/CE+1d9GiPw caaf5bf5eb31ef269e3d0bc34d45dd761c0bb0907172eea6e754434de4603db7\nyuriw@home 02TZyR3JHJMxEQob80ICNA 85b4aa0f69f6d469dae0bb3dca4baaf222e164927ed7eed2082caae8f4717e48\nyuriweinstein@xenon1 C9eVdLb/i18lMcMG20rGPw eaddd0e831a77de3f35cb19e307bd6f9aeb0cc03eff1ae58490d7db33ced4311\nyuriw@yuriw-RH 5ivdxgFO4eIkbXVhl8xkvw 59212d29b8b42d9fe457c1b2c43d774e1d25807be10dcc1252d4aec63b97a467\n', u'name': u'yuriw'}, 'item': {u'ovpn': u'yuriweinstein@Yuris-MacBook-Pro.local wKA8mCcsdsk/CE+1d9GiPw caaf5bf5eb31ef269e3d0bc34d45dd761c0bb0907172eea6e754434de4603db7\nyuriw@home 02TZyR3JHJMxEQob80ICNA 85b4aa0f69f6d469dae0bb3dca4baaf222e164927ed7eed2082caae8f4717e48\nyuriweinstein@xenon1 C9eVdLb/i18lMcMG20rGPw eaddd0e831a77de3f35cb19e307bd6f9aeb0cc03eff1ae58490d7db33ced4311\nyuriw@yuriw-RH 5ivdxgFO4eIkbXVhl8xkvw 59212d29b8b42d9fe457c1b2c43d774e1d25807be10dcc1252d4aec63b97a467\n', u'name': u'yuriw'}, u'changed': False, u'name': u'yuriw', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, 
u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'yuriw', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1124, u'groups': u'sudo', u'home': u'/home/tamil', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1124, '_ansible_item_label': {u'ovpn': u'tmuthamizhan@mac /1CBLC6rxqzJzPspffZV0g 4d1dfff2e097a7fc2a83ea73eccad2f0e453a6338e18c87b4d89bf370733e29c\ntamil@tamil-VirtualBox M22QdmhkSPj9aEcTiuIVfg 8e76e06b14de99318441c75a96e635a92f5bddc54a40b45276191f6829c6b239\n', u'name': u'tamil'}, 'item': {u'ovpn': u'tmuthamizhan@mac /1CBLC6rxqzJzPspffZV0g 4d1dfff2e097a7fc2a83ea73eccad2f0e453a6338e18c87b4d89bf370733e29c\ntamil@tamil-VirtualBox M22QdmhkSPj9aEcTiuIVfg 8e76e06b14de99318441c75a96e635a92f5bddc54a40b45276191f6829c6b239\n', u'name': u'tamil'}, u'changed': False, u'name': u'tamil', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'tamil', u'local': None, u'seuser': None, 
u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1125, u'groups': u'sudo', u'home': u'/home/jowilkin', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1125, '_ansible_item_label': {u'ovpn': u'jowilkin@jowilkin 2r8xOv/eCTcHr+HSsMkPYg 0ac416d2dc139628144dfa0822af8cc0a455f5f5f3e4d1b9713c14115c062218\njohn@osd-host 7zjDTxAYhCmTX+Az4SJaoA 7d924233fdef168e2c5c01258aa349de108629ef2ff90d17c0b96acf22dac7c2\njohn@admin-host 7cpk7iJ1Hg2vk4bPDovKmA 05765178f27af6dc4e43e47f52d773aac3bc1b3f1dd998bdbf479b951bfd2efb\n', u'name': u'jowilkin'}, 'item': {u'ovpn': u'jowilkin@jowilkin 2r8xOv/eCTcHr+HSsMkPYg 0ac416d2dc139628144dfa0822af8cc0a455f5f5f3e4d1b9713c14115c062218\njohn@osd-host 7zjDTxAYhCmTX+Az4SJaoA 7d924233fdef168e2c5c01258aa349de108629ef2ff90d17c0b96acf22dac7c2\njohn@admin-host 7cpk7iJ1Hg2vk4bPDovKmA 05765178f27af6dc4e43e47f52d773aac3bc1b3f1dd998bdbf479b951bfd2efb\n', u'name': u'jowilkin'}, u'changed': False, u'name': u'jowilkin', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'jowilkin', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1126, u'groups': u'sudo', u'home': u'/home/bhubbard', u'move_home': False, u'append': 
True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1126, '_ansible_item_label': {u'ovpn': u'brad@dhcp-1-165.bne.redhat.com 4oShQI9+vYtX5gA47np/Sw 3fc7df5afa772752d8eee15c01d550cc1dcc88b6e940abc9f9f8f26102d239d4', u'name': u'bhubbard'}, 'item': {u'ovpn': u'brad@dhcp-1-165.bne.redhat.com 4oShQI9+vYtX5gA47np/Sw 3fc7df5afa772752d8eee15c01d550cc1dcc88b6e940abc9f9f8f26102d239d4', u'name': u'bhubbard'}, u'changed': False, u'name': u'bhubbard', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'bhubbard', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1127, u'groups': u'sudo', u'home': u'/home/yehudasa', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1127, '_ansible_item_label': {u'ovpn': u'yehudasa@yehuda-2 NGXHeO0lAvRFfYLalffzEw 0d3489b09f48ad5fa9d1ec5944d9c8020daf852488f389f3d44fe63d3f651f34\nyehuda@yehuda-940X3G shisK5LjI6fr3ZBJy/xX8A 49522899bd26130086ce668079f0062d987d85dfa5767dd5c34e5953db97997a\nyehudasa@yehudasa-desktop OT1MhoO0WihhvkKztqW0Uw 12a4d6b54390b9df7f5af3bd6b533f3c1fee0c7b9fbb79f0a87bcb28b182c7d4\n', u'name': u'yehudasa'}, 'item': {u'ovpn': u'yehudasa@yehuda-2 NGXHeO0lAvRFfYLalffzEw 
0d3489b09f48ad5fa9d1ec5944d9c8020daf852488f389f3d44fe63d3f651f34\nyehuda@yehuda-940X3G shisK5LjI6fr3ZBJy/xX8A 49522899bd26130086ce668079f0062d987d85dfa5767dd5c34e5953db97997a\nyehudasa@yehudasa-desktop OT1MhoO0WihhvkKztqW0Uw 12a4d6b54390b9df7f5af3bd6b533f3c1fee0c7b9fbb79f0a87bcb28b182c7d4\n', u'name': u'yehudasa'}, u'changed': False, u'name': u'yehudasa', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'yehudasa', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1128, u'groups': u'sudo', u'home': u'/home/dang', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1128, '_ansible_item_label': {u'ovpn': u'dang@sidious w0CNW2g9K1WiRenVGYWNUA 4f59d761bfab3659115da2b3b80a486266f77b09d8527983217d15648b4f92b4', u'name': u'dang'}, 'item': {u'ovpn': u'dang@sidious w0CNW2g9K1WiRenVGYWNUA 4f59d761bfab3659115da2b3b80a486266f77b09d8527983217d15648b4f92b4', u'name': u'dang'}, u'changed': False, u'name': u'dang', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, 
u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'dang', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1129, u'groups': u'sudo', u'home': u'/home/branto', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1129, '_ansible_item_label': {u'ovpn': u'branto@work ganye+HpG3dkMEik6WtTng 018f3f9b9d49dcefa701ea304a8e58f002c46f0650edae220a0a7ab1bce36aeb', u'name': u'branto'}, 'item': {u'ovpn': u'branto@work ganye+HpG3dkMEik6WtTng 018f3f9b9d49dcefa701ea304a8e58f002c46f0650edae220a0a7ab1bce36aeb', u'name': u'branto'}, u'changed': False, u'name': u'branto', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'branto', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1130, u'groups': u'sudo', u'home': u'/home/xiaoxichen', u'move_home': False, u'append': 
True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1130, '_ansible_item_label': {u'ovpn': u'xiaoxichen@ebay RvJJ7BhIehpoPtggrwnskQ 862ecfe7e15dfab86d61df86856bfe06cbb99f240f6f03851f7f9e1a255327d6', u'name': u'xiaoxichen'}, 'item': {u'ovpn': u'xiaoxichen@ebay RvJJ7BhIehpoPtggrwnskQ 862ecfe7e15dfab86d61df86856bfe06cbb99f240f6f03851f7f9e1a255327d6', u'name': u'xiaoxichen'}, u'changed': False, u'name': u'xiaoxichen', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'xiaoxichen', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1131, u'groups': u'sudo', u'home': u'/home/ffilz', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1131, '_ansible_item_label': {u'ovpn': u'ffilz@redhat.com 6YdvqxkKfmDWGD2s0wA7Ww 4ce64d08686e34e559ccec2498df433b155b70c9ebccaec616b6b34f0f0c246e', u'name': u'ffilz'}, 'item': {u'ovpn': u'ffilz@redhat.com 6YdvqxkKfmDWGD2s0wA7Ww 4ce64d08686e34e559ccec2498df433b155b70c9ebccaec616b6b34f0f0c246e', u'name': u'ffilz'}, u'changed': False, u'name': u'ffilz', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, 
u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'ffilz', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1132, u'groups': u'sudo', u'home': u'/home/joao', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1132, '_ansible_item_label': {u'ovpn': u'joao@magrathea eSS2/gvK7ALE6L9bITfuMA c3caaeecee3f43e39b7a81fad50e0d874359c70a9c41b77c661511c71f733909\njoao@timesink 9S3oER36HheVupjRpnLz6A 9dbc964184244e9da269942dc73ec9ebba6594bcccfdc0eb09562b58b4542162\n', u'name': u'joao'}, 'item': {u'ovpn': u'joao@magrathea eSS2/gvK7ALE6L9bITfuMA c3caaeecee3f43e39b7a81fad50e0d874359c70a9c41b77c661511c71f733909\njoao@timesink 9S3oER36HheVupjRpnLz6A 9dbc964184244e9da269942dc73ec9ebba6594bcccfdc0eb09562b58b4542162\n', u'name': u'joao'}, u'changed': False, u'name': u'joao', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], 
u'move_home': False, u'password': None, u'name': u'joao', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1133, u'groups': u'sudo', u'home': u'/home/nhm', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1133, '_ansible_item_label': {u'ovpn': u'nh@latte JCH9icAtDPj951rgbPKJyw 9a3d30ec1ec9467ccdc8bdbbfacffd7396fb38e046199ae37b2b7b69dbf37480\nnhm@espresso +YYZPT29wYzY5ooaRzabCQ 1ee041dd58b9ec6eb678c47632ece7cf6c24e23bcbac28a77a82af05ba6cc148\nnhm@mocha HgOGOfkBEzJihFsKmPRfKQ 2e17f3ba0b90df7a36f19a7c8f64d2aa8f966b2794c94caa110d313e927a1c1b\n', u'name': u'nhm'}, 'item': {u'ovpn': u'nh@latte JCH9icAtDPj951rgbPKJyw 9a3d30ec1ec9467ccdc8bdbbfacffd7396fb38e046199ae37b2b7b69dbf37480\nnhm@espresso +YYZPT29wYzY5ooaRzabCQ 1ee041dd58b9ec6eb678c47632ece7cf6c24e23bcbac28a77a82af05ba6cc148\nnhm@mocha HgOGOfkBEzJihFsKmPRfKQ 2e17f3ba0b90df7a36f19a7c8f64d2aa8f966b2794c94caa110d313e927a1c1b\n', u'name': u'nhm'}, u'changed': False, u'name': u'nhm', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'nhm', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1134, u'groups': u'sudo', u'home': u'/home/jj', 
u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1134, '_ansible_item_label': {u'ovpn': u'jj@aurora PFzIfG7NjrCYxJvxd//soA 395884ddf2caa0043f97e4f98530cc3ad646d6f269802a2b79ca1ea7278ee006\njj@metatron iQogIw/KQtewT7oj9Mkivw 0881e5ceb5897e8a370bacee69ad58eb5098090ea4b0d53972214ea7c751e35a\njj@laptop O1e31whZbQ0S7MUtglCRLg 96e39257989ce36e240b5d368e0308d38009d3d923ec398dc9cc6eba371acaa4\njj@aurora2 EtAvlrozxiL3PLYp6mvATg 1018928736c33ed06246f208cd02aa10c0a6efa5b4e34e32408d7a6c72c32e11\n', u'name': u'jj'}, 'item': {u'ovpn': u'jj@aurora PFzIfG7NjrCYxJvxd//soA 395884ddf2caa0043f97e4f98530cc3ad646d6f269802a2b79ca1ea7278ee006\njj@metatron iQogIw/KQtewT7oj9Mkivw 0881e5ceb5897e8a370bacee69ad58eb5098090ea4b0d53972214ea7c751e35a\njj@laptop O1e31whZbQ0S7MUtglCRLg 96e39257989ce36e240b5d368e0308d38009d3d923ec398dc9cc6eba371acaa4\njj@aurora2 EtAvlrozxiL3PLYp6mvATg 1018928736c33ed06246f208cd02aa10c0a6efa5b4e34e32408d7a6c72c32e11\n', u'name': u'jj'}, u'changed': False, u'name': u'jj', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'jj', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1135, u'groups': u'sudo', u'home': u'/home/nwatkins', u'move_home': False, u'append': True, '_ansible_no_log': 
False, 'ansible_loop_var': u'item', u'group': 1135, '_ansible_item_label': {u'ovpn': u'nwatkins@daq unHfXkUpAHOnzeptznuLLA 33b2003c30d0cc7a6b194e76be92c7b5d270c2d2a4b4a8b6e673f0f0dc1db313', u'name': u'nwatkins'}, 'item': {u'ovpn': u'nwatkins@daq unHfXkUpAHOnzeptznuLLA 33b2003c30d0cc7a6b194e76be92c7b5d270c2d2a4b4a8b6e673f0f0dc1db313', u'name': u'nwatkins'}, u'changed': False, u'name': u'nwatkins', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'nwatkins', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1136, u'groups': u'sudo', u'home': u'/home/mkidd', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1136, '_ansible_item_label': {u'ovpn': u'linuxkidd@zenbook oYp1WWV0JwpikHCWCV52Lg 9aca455b601bf3a365d31068154150ac63dd76f32cef29a55f9685dd1a88aa22', u'name': u'mkidd'}, 'item': {u'ovpn': u'linuxkidd@zenbook oYp1WWV0JwpikHCWCV52Lg 9aca455b601bf3a365d31068154150ac63dd76f32cef29a55f9685dd1a88aa22', u'name': u'mkidd'}, u'changed': False, u'name': u'mkidd', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, 
u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'mkidd', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1137, u'groups': u'sudo', u'home': u'/home/jlopez', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1137, '_ansible_item_label': {u'ovpn': u'JCLO@oh-mbp-jcl vZhmBh/1LjLFEu+atRec6w 1f13f591373b4dc798a9b701fabf1eb99bf4aa58f87b6420d6c916716f0965af', u'name': u'jlopez'}, 'item': {u'ovpn': u'JCLO@oh-mbp-jcl vZhmBh/1LjLFEu+atRec6w 1f13f591373b4dc798a9b701fabf1eb99bf4aa58f87b6420d6c916716f0965af', u'name': u'jlopez'}, u'changed': False, u'name': u'jlopez', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'jlopez', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1138, 
u'groups': u'sudo', u'home': u'/home/haomaiwang', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1138, '_ansible_item_label': {u'ovpn': u'yuyuyu101@desktop XAUS/1Geh1T2WY//5mRahw fda03bdaf79c2f39ac3ba6cd9c3a1cb2e66b842a921169f20a00481a4cd3d9cb', u'name': u'haomaiwang'}, 'item': {u'ovpn': u'yuyuyu101@desktop XAUS/1Geh1T2WY//5mRahw fda03bdaf79c2f39ac3ba6cd9c3a1cb2e66b842a921169f20a00481a4cd3d9cb', u'name': u'haomaiwang'}, u'changed': False, u'name': u'haomaiwang', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'haomaiwang', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1139, u'groups': u'sudo', u'home': u'/home/jdillaman', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1139, '_ansible_item_label': {u'ovpn': u'jdillaman@ceph1 kQ96pIpKTocwIj9H1Udb1g 6c087451af94ac18c144712bcc9329799d86f6d90376839dcd6fa41cd73e3608', u'name': u'jdillaman'}, 'item': {u'ovpn': u'jdillaman@ceph1 kQ96pIpKTocwIj9H1Udb1g 6c087451af94ac18c144712bcc9329799d86f6d90376839dcd6fa41cd73e3608', u'name': u'jdillaman'}, u'changed': False, u'name': u'jdillaman', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, 
u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'jdillaman', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1140, u'groups': u'sudo', u'home': u'/home/kchai', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1140, '_ansible_item_label': {u'ovpn': u'kefu@gen8 HVoNrB5C8+bYxuqIfByeEQ 4dddde1890af2d6df367d3d832cc3b9b660160a1db69f0135e0d09364b2cb9b3', u'name': u'kchai'}, 'item': {u'ovpn': u'kefu@gen8 HVoNrB5C8+bYxuqIfByeEQ 4dddde1890af2d6df367d3d832cc3b9b660160a1db69f0135e0d09364b2cb9b3', u'name': u'kchai'}, u'changed': False, u'name': u'kchai', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'kchai', u'local': None, u'seuser': None, u'remove': 
False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1141, u'groups': u'sudo', u'home': u'/home/vumrao', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1141, '_ansible_item_label': {u'ovpn': u'vumrao@vumrao VaMHdnIGTl6y9LIkurxfjQ 71de53c4a0f212b8f919437d7d433d24a33d7a33bc6fe5df5d047e24499994b2', u'name': u'vumrao'}, 'item': {u'ovpn': u'vumrao@vumrao VaMHdnIGTl6y9LIkurxfjQ 71de53c4a0f212b8f919437d7d433d24a33d7a33bc6fe5df5d047e24499994b2', u'name': u'vumrao'}, u'changed': False, u'name': u'vumrao', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'vumrao', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1142, u'groups': u'sudo', u'home': u'/home/dfuller', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1142, '_ansible_item_label': {u'ovpn': u'dfuller@laptop 6U0lNVidRy7Ye/5q6cGN1A aea3d019f68a95094c99385aff099818224455a829615cfc774587e4519398a7\ndfuller@rex001 6Z8bfQDgPXVSGMeeQHoItg 3abd41920b72683fbba7f25be88ff992fcd753119c4d2086c12daaf20798e684\n', u'name': u'dfuller'}, 'item': {u'ovpn': u'dfuller@laptop 6U0lNVidRy7Ye/5q6cGN1A 
aea3d019f68a95094c99385aff099818224455a829615cfc774587e4519398a7\ndfuller@rex001 6Z8bfQDgPXVSGMeeQHoItg 3abd41920b72683fbba7f25be88ff992fcd753119c4d2086c12daaf20798e684\n', u'name': u'dfuller'}, u'changed': False, u'name': u'dfuller', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'dfuller', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1143, u'groups': u'sudo', u'home': u'/home/owasserm', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1143, '_ansible_item_label': {u'ovpn': u'owasserm@owasserm.redhat.com hsdVTbVub6eRnhlO9B02rQ 7c9baf41670ff9ab612f75d4be42d0aaf0d7ecaa3c8928032b61f1be91725890\n', u'name': u'owasserm'}, 'item': {u'ovpn': u'owasserm@owasserm.redhat.com hsdVTbVub6eRnhlO9B02rQ 7c9baf41670ff9ab612f75d4be42d0aaf0d7ecaa3c8928032b61f1be91725890\n', u'name': u'owasserm'}, u'changed': False, u'name': u'owasserm', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on 
smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'owasserm', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1144, u'groups': u'sudo', u'home': u'/home/abhishekvrshny', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1144, '_ansible_item_label': {u'ovpn': u'abhishekvrshny@flipkart QTqbWHaqvXwB+yBy6CVO7A 25d026c49dc49b3a1f445d2dc0099d5ed916645b0adb8d0306269ace7a2096e9', u'name': u'abhishekvrshny'}, 'item': {u'ovpn': u'abhishekvrshny@flipkart QTqbWHaqvXwB+yBy6CVO7A 25d026c49dc49b3a1f445d2dc0099d5ed916645b0adb8d0306269ace7a2096e9', u'name': u'abhishekvrshny'}, u'changed': False, u'name': u'abhishekvrshny', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'abhishekvrshny', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1145, u'groups': u'sudo', u'home': u'/home/vasu', u'move_home': False, u'append': True, 
'_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1145, '_ansible_item_label': {u'ovpn': u'vasu@ceph1 +1D1qNUAk8h7OF9LF9qkrQ 963aa256fb99bc4b77e7085f57cf910a04c7f143603f81252331411eb8e37ec3\nvakulkar@vakulkar.sjc.csb O8ac1k0Dh3xkIFX8NFyIZw 471538eeb22384b58921e4f11af272c00c0a953dc7fe8d95ba057e65d141fbd2\nvasu@vasuSrv waJqYAARY/LnfuP1x/KQzQ 68915d3a1eb3dd00a562c149791cec5f43a96f5fd0b851ec855ec3f5dab496b4\n', u'name': u'vasu'}, 'item': {u'ovpn': u'vasu@ceph1 +1D1qNUAk8h7OF9LF9qkrQ 963aa256fb99bc4b77e7085f57cf910a04c7f143603f81252331411eb8e37ec3\nvakulkar@vakulkar.sjc.csb O8ac1k0Dh3xkIFX8NFyIZw 471538eeb22384b58921e4f11af272c00c0a953dc7fe8d95ba057e65d141fbd2\nvasu@vasuSrv waJqYAARY/LnfuP1x/KQzQ 68915d3a1eb3dd00a562c149791cec5f43a96f5fd0b851ec855ec3f5dab496b4\n', u'name': u'vasu'}, u'changed': False, u'name': u'vasu', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'vasu', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1146, u'groups': u'sudo', u'home': u'/home/smohan', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1146, '_ansible_item_label': {u'ovpn': u'shmohan@laptop 7wwHLZZNa4ShUV1imXDDjw 0aca19a8ff6dbeee2821dd75a329a05b8052170204b2b242ced9b1a68ca8df37', u'name': 
u'smohan'}, 'item': {u'ovpn': u'shmohan@laptop 7wwHLZZNa4ShUV1imXDDjw 0aca19a8ff6dbeee2821dd75a329a05b8052170204b2b242ced9b1a68ca8df37', u'name': u'smohan'}, u'changed': False, u'name': u'smohan', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'smohan', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1147, u'groups': u'sudo', u'home': u'/home/abhi', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1147, '_ansible_item_label': {u'ovpn': u'abhi@trusty SKarQTpBigBobP9sLjdLiw 868a74ed21b46f7f64255897d824f4e3eb21f8dde844bbdaa386681c942d8114', u'name': u'abhi'}, 'item': {u'ovpn': u'abhi@trusty SKarQTpBigBobP9sLjdLiw 868a74ed21b46f7f64255897d824f4e3eb21f8dde844bbdaa386681c942d8114', u'name': u'abhi'}, u'changed': False, u'name': u'abhi', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', 
u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'abhi', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1148, u'groups': u'sudo', u'home': u'/home/shehbazj', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1148, '_ansible_item_label': {u'ovpn': u'shehbazj@smrhost oEliXj/jPmfXv9AUAFKB5A 8bb3a682b4ff15de655c0fe610d350c5805d0a970471e4810b648f47e2811246', u'name': u'shehbazj'}, 'item': {u'ovpn': u'shehbazj@smrhost oEliXj/jPmfXv9AUAFKB5A 8bb3a682b4ff15de655c0fe610d350c5805d0a970471e4810b648f47e2811246', u'name': u'shehbazj'}, u'changed': False, u'name': u'shehbazj', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'shehbazj', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1149, u'groups': u'sudo', u'home': u'/home/cbodley', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1149, '_ansible_item_label': {u'ovpn': u'cbodley@redhat.com 
W4gFS4oU8PtUOrdHuBYwXQ af7dafd992687f5d85a79866838a78e5070a4934fb0a935e8094adb31ec28611', u'name': u'cbodley'}, 'item': {u'ovpn': u'cbodley@redhat.com W4gFS4oU8PtUOrdHuBYwXQ af7dafd992687f5d85a79866838a78e5070a4934fb0a935e8094adb31ec28611', u'name': u'cbodley'}, u'changed': False, u'name': u'cbodley', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'cbodley', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1150, u'groups': u'sudo', u'home': u'/home/fche', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1150, '_ansible_item_label': {u'ovpn': u'fche@redhat.com LiziWkWg2uWoEHb3Ln92dQ 9c5497793758b069adbba9284dd55944276ba4dac0bb95d9357c81b58174a3c3', u'name': u'fche'}, 'item': {u'ovpn': u'fche@redhat.com LiziWkWg2uWoEHb3Ln92dQ 9c5497793758b069adbba9284dd55944276ba4dac0bb95d9357c81b58174a3c3', u'name': u'fche'}, u'changed': False, u'name': u'fche', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': 
u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'fche', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1151, u'groups': u'sudo', u'home': u'/home/onyb', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1151, '_ansible_item_label': {u'ovpn': u'ani@orchid AgbLf9Dufuji5r+9WxM69g b73b7eacb9b628387a17cb1e0a84ff19c29d45dca8f0768e407aa599bc6996c4', u'name': u'onyb'}, 'item': {u'ovpn': u'ani@orchid AgbLf9Dufuji5r+9WxM69g b73b7eacb9b628387a17cb1e0a84ff19c29d45dca8f0768e407aa599bc6996c4', u'name': u'onyb'}, u'changed': False, u'name': u'onyb', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'onyb', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1152, u'groups': u'sudo', u'home': u'/home/mwatts', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', 
u'group': 1152, '_ansible_item_label': {u'ovpn': u'mdw@degu IPiZqcFT2BLuf2h3+tw58g 7af390a631ec11bddd7d1ac506d29af65e1e01e19f7dc931b4f459030cb7a195', u'name': u'mwatts'}, 'item': {u'ovpn': u'mdw@degu IPiZqcFT2BLuf2h3+tw58g 7af390a631ec11bddd7d1ac506d29af65e1e01e19f7dc931b4f459030cb7a195', u'name': u'mwatts'}, u'changed': False, u'name': u'mwatts', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'mwatts', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1153, u'groups': u'sudo', u'home': u'/home/oprypin', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1153, '_ansible_item_label': {u'ovpn': u'oprypin@bx-pc-arch 5WYD7PWKwdSQjfGo95ehWw f3c9f170d74c6c443cf0f82f0d87045e1d3a0dbcd01474f6e667ea20a00978b3', u'name': u'oprypin'}, 'item': {u'ovpn': u'oprypin@bx-pc-arch 5WYD7PWKwdSQjfGo95ehWw f3c9f170d74c6c443cf0f82f0d87045e1d3a0dbcd01474f6e667ea20a00978b3', u'name': u'oprypin'}, u'changed': False, u'name': u'oprypin', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, 
u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'oprypin', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1154, u'groups': u'sudo', u'home': u'/home/prsrivas', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1154, '_ansible_item_label': {u'ovpn': u'pritha@dhcp35-190.lab.eng.blr.redhat.com GCk3PIYzUNZ6/xrsKoq8VQ acbfc4279970b44c3d008990e0cf1bb5eb280299218441a0f25fda988bc555f6', u'name': u'prsrivas'}, 'item': {u'ovpn': u'pritha@dhcp35-190.lab.eng.blr.redhat.com GCk3PIYzUNZ6/xrsKoq8VQ acbfc4279970b44c3d008990e0cf1bb5eb280299218441a0f25fda988bc555f6', u'name': u'prsrivas'}, u'changed': False, u'name': u'prsrivas', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'prsrivas', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 
1155, u'groups': u'sudo', u'home': u'/home/pdonnell', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1155, '_ansible_item_label': {u'ovpn': u'pdonnell@redhat.com Q9j56CPPXipXScmKi57PlQ fb616603b6d27cf65bfa1da83fc0ca39399861ad1c02bfed37ce9be17cdfa8ea', u'name': u'pdonnell'}, 'item': {u'ovpn': u'pdonnell@redhat.com Q9j56CPPXipXScmKi57PlQ fb616603b6d27cf65bfa1da83fc0ca39399861ad1c02bfed37ce9be17cdfa8ea', u'name': u'pdonnell'}, u'changed': False, u'name': u'pdonnell', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'pdonnell', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1156, u'groups': u'sudo', u'home': u'/home/jlayton', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1156, '_ansible_item_label': {u'ovpn': u'jlayton@redhat aNfzMdXOhhmWRb25hwXJIg f51fee42c5268f7b8e00d57092dc522b0a07b31154ea52cf542da9cac5885868', u'name': u'jlayton'}, 'item': {u'ovpn': u'jlayton@redhat aNfzMdXOhhmWRb25hwXJIg f51fee42c5268f7b8e00d57092dc522b0a07b31154ea52cf542da9cac5885868', u'name': u'jlayton'}, u'changed': False, u'name': u'jlayton', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, 
u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'jlayton', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1157, u'groups': u'sudo', u'home': u'/home/rzarzynski', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1157, '_ansible_item_label': {u'ovpn': u'rzarzyns@redhat.com h4UkibgFG40ygfpfKTMBrg e20ca28c60144dbabc97953cd4c273c1b92cd45ebcddd0f0299679d7a5c87d7f', u'name': u'rzarzynski'}, 'item': {u'ovpn': u'rzarzyns@redhat.com h4UkibgFG40ygfpfKTMBrg e20ca28c60144dbabc97953cd4c273c1b92cd45ebcddd0f0299679d7a5c87d7f', u'name': u'rzarzynski'}, u'changed': False, u'name': u'rzarzynski', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'rzarzynski', u'local': None, 
u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}},

The remaining items in this result list are identical apart from the account identity. Each reports u'changed': False, 'failed': False, u'state': u'present', u'shell': u'/bin/bash', u'groups': u'sudo', u'comment': u'', u'move_home': False, u'append': True, a group id equal to the uid, and the same u'invocation' module_args (u'append': True, u'create_home': True, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'state': u'present', u'update_password': u'always', all other arguments unset/default). The per-account values are:

Name | UID | Home | OVPN key(s)
---|---|---|---
rdias | 1158 | /home/rdias | rdias@rdias-suse-laptop 0bh54sdB69mh95n5rWME5g 452e3338e48d04d4e816f4f1cb54d637746a7acc1ffe5e8ed4c1506c8e07a72e
asheplyakov | 1159 | /home/asheplyakov | asheplyakov@asheplyakov.srt.mirantis.net wFW0ZgT4cNhKRAGXiUtevQ 1b11f0702b2db42a42aae6579737ece2caad3b80a8186b971686575cb76b3051
vshankar | 1160 | /home/vshankar | vshankar@h3ckers-pride r39frQXXj1GUJZwq1GS7fw 1170ef4c918c5ff15334d10f666441b0dfe0bb869a5e15218fdfad2e8cc4e953
akupczyk | 1161 | /home/akupczyk | adam@TP50 C0YuBT9bYaNhdDmjbF56xg 5d298b33b9dbaef364b037561aa5c5de374405bb8afead5280db5b212506ea58
nojha | 1162 | /home/nojha | nojha@localhost YQTw/h6ZMgdJn7EPBmbEnw 253574eae62759c4c5d3bc4bf949c59948a0488e4dfe4c91ee754a3b5494847e
ifed01 | 1163 | /home/ifed01 | ifed01@LocalSUSE1 6g6hX1bzTGBTCnDevAn0+w f18c9354f6de3f371c3b51521b62375e474802ac21adb3d71e09d8d5bf9d0c43
myoungwon | 1164 | /home/myoungwon | myoungwon@omw-dev 78twXwYRU+MeH+yZ9Rw9Zg 2dd66fa122e5cf3b8bfa835cefac7c6e4e66d70643a3819813104c2057e597e4
jwilliamson | 1165 | /home/jwilliamson | jwilliamson@glaurung +d028NM7xCxkVdxxO+b1Lw fece65125073fdc287af724ee4724ad84d2864e758d50dcb23c07b05c3595fe0
gabrioux | 1166 | /home/gabrioux | gabrioux@elisheba 80kx1htp39RsFrlGONcp+A a95579ef6f90694cd6fd390302adf8532237a8ea65bd5544d9b561654d712ba2
leseb | 1167 | /home/leseb | leseb@mbp cCx1v5/FfaQ/IQHujqtG9Q 6121d11f9abfa6b1b36330eafaa2196249a9c92f989be25c9fac1558292c920f
hchen | 1168 | /home/hchen | hchen@host12 hUr3k0rugStZMjvIxIvCOg 9d57e14d49901f18b24ee4076ae7e6a2f9eb6fd9fbce786660c448486c966fca
jcollin | 1169 | /home/jcollin | jcollin@earth +H4Hk4WcNuqdQj7ch/Nulw 8426545e6457c9e1e8adca2af5ddf836fbcfb433cdc5359fd135afdf4e0f7d2a; jcollin@stratocaster jbjV3FsrsTJwyKUA3Y8VVQ 0439745f795fef1399636bd550040d45445d1607b471284c5c9b9dbccc86a987
xxg | 1170 | /home/xxg | xxg@zte Y2d/Ov201XMivDNwo4nUoQ 5e5da8d579793601699af628300430c1e5dd469c8bcff7c3ee11d23ec004bdcc
pcuzner | 1171 | /home/pcuzner | pcuzner@rh460p oK28wU5DSabvEL4VjDRhEg a449ed81d7e2970f418263fb3ce10dd711d03925a0990ddf298f826aae1caa53
liupan | 1172 | /home/liupan | liupan@ceph-dev-2 9+Is4mIZgNkYyJLwHvSNOA 5a8fafc187d52041daf4365125692d4619fc557b75560913130c0596f83bbb77
mkogan | 1173 | /home/mkogan | mkogan@mkP50 f/LENRJbsyepmvZA23F7Fg b908f1c0237a7ee56b73dc42f2df79b49ca83d6f4573f5229e7cfe6b4ad7b6a2
amarangone | 1174 | /home/amarangone | amarangone@macair.local 9Fslt44BqONCYNhf+uhcnQ 12d46ec6815378a12abc5df00e65235ccbc06ffb0fe5d1db75540a4805cb58b6
vumrao | 1141 | /home/vumrao | vumrao@redhat TMNqzMvbJS8Va/8nT9QUQw ab386c2bd7c6796d5413e4d841a16dda2504cca6d95df831a652a30d2e5655ed
kmroz | 1175 | /home/kmroz | kmroz@suse /JbrIs2mKL5exdmcDnhRgg db4d19ab99c7174429d5ae7b6ca3cf4e04e9bf7810e1826d90f4627643628d57
henrix | 1176 | /home/henrix | henrix@hermes iPPDBfnLzP5Pe5FuTcJBmw b26aefb8a61451066f984e074f708ea9ca6b2c5d7cca35996c08b0b2bb2c2736
pbs1108 | 1177 | /home/pbs1108 | pbs1108@bspark_cli0 nSCINdeTLTLEO5JP/GIwRQ 76372ad6f7ad731556ff13605c3729eacaf59dcf7f9ac82dd9a8501bd95d3b26
clacroix | 1178 | /home/clacroix | clacroix@redhat.com ZGY9sgvjT0BuJRi9zrULzg c3311aba4025aa42cd78c999dcee4e2c16415a3ac44ac8c95b77838459ef3315
epuertat | 1179 | /home/epuertat | epuertat@private BnUoirwxGRWXtLxulJU5xA da2cfc4995bed82ef85db3633edad0a7eb2c32ba559a48259b10be94a8fdf006
tdehler | 1180 | /home/tdehler | tdehler@think 7e0WC4Vh86XWZviZ9WBMgw 4dc8477db6e4f40312e6b2b9db293dc009e49e518015ace20431c0fb69025461
laura | 1181 | /home/laura | laura@flab 2DGHIAxD39eNKOPkn3M67w ab1ae304abed3824a68b5c0ecf4f92fca76a4f8b9fcbcc0ca43388a85b7f9305
adamyanova | 1182 | /home/adamyanova | adamyanova@centos k2nfk8hmGW77nys3R/JkOg 7695b95c2f157d60622e0b0b7ab06fa2cb18661a190d839f7ea587bc44aa0e3c; adamyanova@ubuntu FmlKgjVzPUxfNDnHeU9vLQ ef7d5524863dfa0787fc5e249873c1a5ea58e7fd5aee27e1d1d33d6f87388a2d
yaarit | 1183 | /home/yaarit | yaarit@centos TBuWjkAsj1GB/V9eWc/R1Q 7bd86a857dec48dc25850ecf0c00486d9a89c2ff5f88b2f28c3e36bdeb139fce
rpavani1998 | 1184 | /home/rpavani1998 | admin1@rajula 31GbDo9d1YnW5BQ8u3utvw a2da13cb840f848846023c85442ba7bcce97dc186056a0ecc036a220d7eb7fc3
rishabh | 1185 | /home/rishabh | rishabh@p50 zdJ4XsBdVugwMrqJOSBi3Q c78bb28ba5cf2bf9c8edb80fe57814d60cd2ffdbd874cf9a271e5adf171bb0c4
skrah | 1186 | /home/skrah | skrah@thinkpad IboPmnUdsLqqRXlHQ1RT5w bf85db9e916dceaf84a1e6ea33c59eb4adb424cb4e727ce0a903a3498b799ed2

{u'comment': u'', u'shell': u'/bin/bash', u'uid': 1187, u'groups': u'sudo', u'home': u'/home/smanjara', u'move_home': False, u'append': True, '_ansible_no_log': False, 
'ansible_loop_var': u'item', u'group': 1187, '_ansible_item_label': {u'ovpn': u'smanjara@fedora 5oorMoVYD3sT0nmOTBDh9w 83be007f68694c9463ef46e4ce223221d639d78f11d5b68449598de77e8e0ce8', u'name': u'smanjara'}, 'item': {u'ovpn': u'smanjara@fedora 5oorMoVYD3sT0nmOTBDh9w 83be007f68694c9463ef46e4ce223221d639d78f11d5b68449598de77e8e0ce8', u'name': u'smanjara'}, u'changed': False, u'name': u'smanjara', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'smanjara', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1188, u'groups': u'sudo', u'home': u'/home/bengland', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1188, '_ansible_item_label': {u'ovpn': u'bengland@bene-laptop ud1gWgoNggJTS7LQtPTZTA d3ebd084ce385cb450ce2f83c02dc66a1637dedbc7a8b191dab68acfc935af41', u'name': u'bengland'}, 'item': {u'ovpn': u'bengland@bene-laptop ud1gWgoNggJTS7LQtPTZTA d3ebd084ce385cb450ce2f83c02dc66a1637dedbc7a8b191dab68acfc935af41', u'name': u'bengland'}, u'changed': False, u'name': u'bengland', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, 
u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'bengland', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1189, u'groups': u'sudo', u'home': u'/home/pnawracay', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1189, '_ansible_item_label': {u'ovpn': u'pnawracay@flab LIurEXLa7xbr+jzf2wRJVg 04062ba385602b385fd17f14de3a0cad83c685b8078fd2f18cc9ad77a4f4762d', u'name': u'pnawracay'}, 'item': {u'ovpn': u'pnawracay@flab LIurEXLa7xbr+jzf2wRJVg 04062ba385602b385fd17f14de3a0cad83c685b8078fd2f18cc9ad77a4f4762d', u'name': u'pnawracay'}, u'changed': False, u'name': u'pnawracay', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'pnawracay', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', 
u'uid': 1190, u'groups': u'sudo', u'home': u'/home/alfonsomthd', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1190, '_ansible_item_label': {u'ovpn': u'alfonsomthd@localhost eUvx+7UEx5IYGmS0lNIscQ 2f1bfd4874280b9f525a46e93e767504df80a9b09a83a2fea387dcd6e34bc0f8', u'name': u'alfonsomthd'}, 'item': {u'ovpn': u'alfonsomthd@localhost eUvx+7UEx5IYGmS0lNIscQ 2f1bfd4874280b9f525a46e93e767504df80a9b09a83a2fea387dcd6e34bc0f8', u'name': u'alfonsomthd'}, u'changed': False, u'name': u'alfonsomthd', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'alfonsomthd', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1191, u'groups': u'sudo', u'home': u'/home/oliveiradan', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1191, '_ansible_item_label': {u'ovpn': u'oliveiradan@opensuse-dev twDqMYwYsdYmbgyCpUnElw ec3ee80ddc747c3ca5e1455a122279f8e1e642c5c09aa9c2ca7fec142f55089e', u'name': u'oliveiradan'}, 'item': {u'ovpn': u'oliveiradan@opensuse-dev twDqMYwYsdYmbgyCpUnElw ec3ee80ddc747c3ca5e1455a122279f8e1e642c5c09aa9c2ca7fec142f55089e', u'name': u'oliveiradan'}, u'changed': False, u'name': u'oliveiradan', 'failed': False, u'state': u'present', 
u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'oliveiradan', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1192, u'groups': u'sudo', u'home': u'/home/swagner', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1192, '_ansible_item_label': {u'ovpn': u'swagner@ubuntu 64V1h0Se0FmBQNH7KLibbQ ad7c91e9e2f7f3999492d5e41fbbc993327d37929bd09606227367d75e5556ba', u'name': u'swagner'}, 'item': {u'ovpn': u'swagner@ubuntu 64V1h0Se0FmBQNH7KLibbQ ad7c91e9e2f7f3999492d5e41fbbc993327d37929bd09606227367d75e5556ba', u'name': u'swagner'}, u'changed': False, u'name': u'swagner', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': 
None, u'name': u'swagner', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1193, u'groups': u'sudo', u'home': u'/home/yuvalif', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1193, '_ansible_item_label': {u'ovpn': u'ylifshit@localhost dyc2NU2pMz8NF8/uR2kMxA 3a6f1f9e55b5116f74d01ffbabdc339054088d257a16cf9fafcfe05b27fa678e', u'name': u'yuvalif'}, 'item': {u'ovpn': u'ylifshit@localhost dyc2NU2pMz8NF8/uR2kMxA 3a6f1f9e55b5116f74d01ffbabdc339054088d257a16cf9fafcfe05b27fa678e', u'name': u'yuvalif'}, u'changed': False, u'name': u'yuvalif', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'yuvalif', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1194, u'groups': u'sudo', u'home': u'/home/kkeithle', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1194, '_ansible_item_label': {u'ovpn': u'kkeithle@kkeithle.usersys.redhat.com FPnVevv1sp5hlWoJeDCe/g e5a1fa7ccf678b91ed570983d5420c98f109d507442c8e4dcd50803e0d71c852', u'name': u'kkeithle'}, 'item': {u'ovpn': u'kkeithle@kkeithle.usersys.redhat.com FPnVevv1sp5hlWoJeDCe/g 
e5a1fa7ccf678b91ed570983d5420c98f109d507442c8e4dcd50803e0d71c852', u'name': u'kkeithle'}, u'changed': False, u'name': u'kkeithle', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'kkeithle', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1195, u'groups': u'sudo', u'home': u'/home/emmericp', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1195, '_ansible_item_label': {u'ovpn': u'paul.emmerich@croit.io RN4hOorvA7irUg/3ViM9KQ 3bd06194186d2624cadf255fa1c38ddf7dded0a6d83dc6001cd55fcc0a899130', u'name': u'emmericp'}, 'item': {u'ovpn': u'paul.emmerich@croit.io RN4hOorvA7irUg/3ViM9KQ 3bd06194186d2624cadf255fa1c38ddf7dded0a6d83dc6001cd55fcc0a899130', u'name': u'emmericp'}, u'changed': False, u'name': u'emmericp', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, 
u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'emmericp', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1196, u'groups': u'sudo', u'home': u'/home/mchangir', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1196, '_ansible_item_label': {u'ovpn': u'mchangir@indraprastha junlGKNU/xzt4OIaGHKBLA e8c67fd935fca490af3fe17453ccae3176268c4bfe1db4a2a879a2ab7ea6bfa5', u'name': u'mchangir'}, 'item': {u'ovpn': u'mchangir@indraprastha junlGKNU/xzt4OIaGHKBLA e8c67fd935fca490af3fe17453ccae3176268c4bfe1db4a2a879a2ab7ea6bfa5', u'name': u'mchangir'}, u'changed': False, u'name': u'mchangir', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'mchangir', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1197, u'groups': u'sudo', u'home': u'/home/sidharthanup', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1197, '_ansible_item_label': {u'ovpn': u'sidharthanup@strawberryfields 
IZq91vqA+RG1Rtn3JFZb6Q a2873481cac9b8b4a0bd8bebe0248b3dccb370dd18b56a4dae713ca1fb0c4286', u'name': u'sidharthanup'}, 'item': {u'ovpn': u'sidharthanup@strawberryfields IZq91vqA+RG1Rtn3JFZb6Q a2873481cac9b8b4a0bd8bebe0248b3dccb370dd18b56a4dae713ca1fb0c4286', u'name': u'sidharthanup'}, u'changed': False, u'name': u'sidharthanup', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'sidharthanup', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1198, u'groups': u'sudo', u'home': u'/home/varsha', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1198, '_ansible_item_label': {u'ovpn': u'varsha@local q7QjtBqj3duVVKubHLpzjw a358a0d6cd132a451a910abcbcf3070e4144c92638e0487622ae040a3410c07f', u'name': u'varsha'}, 'item': {u'ovpn': u'varsha@local q7QjtBqj3duVVKubHLpzjw a358a0d6cd132a451a910abcbcf3070e4144c92638e0487622ae040a3410c07f', u'name': u'varsha'}, u'changed': False, u'name': u'varsha', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, 
u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'varsha', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1199, u'groups': u'sudo', u'home': u'/home/sjust', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1199, '_ansible_item_label': {u'ovpn': u'sjust@pondermatic sCd6606QpID1PnHn0AxFag 46da7d4c77cb1238f83a34f99e774707110f997d88f3a0fd240aac9b7b7bbc85\nsjust@rex004 YS0in6YtQJHx5aUo7ZHi8Q bdd5977d05171a365539b19fae283ec2e3c7389516664692b9bbbaf98c7b61f4\nsjust@office w19UilyC/xu7uCzv0DnWRg ab20efc114b769bf4c2cf313eb30db09c2e2f8234992f120cfc3d1b8b347ed3c\nsam@deepthought 44sCi+GEfY0zjKo5M/4FiQ ed1eedd14ca68116a2000477fa078f8f736d0a15640723c32204bb30f14cb888\n', u'name': u'sjust'}, 'item': {u'ovpn': u'sjust@pondermatic sCd6606QpID1PnHn0AxFag 46da7d4c77cb1238f83a34f99e774707110f997d88f3a0fd240aac9b7b7bbc85\nsjust@rex004 YS0in6YtQJHx5aUo7ZHi8Q bdd5977d05171a365539b19fae283ec2e3c7389516664692b9bbbaf98c7b61f4\nsjust@office w19UilyC/xu7uCzv0DnWRg ab20efc114b769bf4c2cf313eb30db09c2e2f8234992f120cfc3d1b8b347ed3c\nsam@deepthought 44sCi+GEfY0zjKo5M/4FiQ ed1eedd14ca68116a2000477fa078f8f736d0a15640723c32204bb30f14cb888\n', u'name': u'sjust'}, u'changed': False, u'name': u'sjust', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, 
u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'sjust', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1200, u'groups': u'sudo', u'home': u'/home/ideepika', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1200, '_ansible_item_label': {u'ovpn': u'deepika@asus hHsL1xugca0LzY52gKqfqQ 312e0f2680f72d9459c707fcd0ccfb777617f00017f0511839e9b7e3167d590f', u'name': u'ideepika'}, 'item': {u'ovpn': u'deepika@asus hHsL1xugca0LzY52gKqfqQ 312e0f2680f72d9459c707fcd0ccfb777617f00017f0511839e9b7e3167d590f', u'name': u'ideepika'}, u'changed': False, u'name': u'ideepika', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'ideepika', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1201, u'groups': u'sudo', u'home': u'/home/gsalomon', u'move_home': False, u'append': True, 
'_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1201, '_ansible_item_label': {u'ovpn': u'gsalomon@home BrlCVxStNlEZGf5yYYAz1w 4b0c8d5b57dae1328c16d3017b59b632ccdfebe4135209fa97748c70ff00cc46', u'name': u'gsalomon'}, 'item': {u'ovpn': u'gsalomon@home BrlCVxStNlEZGf5yYYAz1w 4b0c8d5b57dae1328c16d3017b59b632ccdfebe4135209fa97748c70ff00cc46', u'name': u'gsalomon'}, u'changed': False, u'name': u'gsalomon', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'gsalomon', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1202, u'groups': u'sudo', u'home': u'/home/soumyakoduri', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1202, '_ansible_item_label': {u'ovpn': u'skoduri@rgw_vm fAskaxmJYtm4TDcModIkrQ 884b57c3b5d56493da361dc9f1cc4e06e766628fcb0f916090f2096edc5ce7de', u'name': u'soumyakoduri'}, 'item': {u'ovpn': u'skoduri@rgw_vm fAskaxmJYtm4TDcModIkrQ 884b57c3b5d56493da361dc9f1cc4e06e766628fcb0f916090f2096edc5ce7de', u'name': u'soumyakoduri'}, u'changed': False, u'name': u'soumyakoduri', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': 
None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'soumyakoduri', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1203, u'groups': u'sudo', u'home': u'/home/kyr', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1203, '_ansible_item_label': {u'ovpn': u'kyr@suse Xu4+Yi98Il4ETOAav6okqA 03bc46e5fac6346cd82ff681f756860f98e0c61168633ce23325efde11a1964a', u'name': u'kyr'}, 'item': {u'ovpn': u'kyr@suse Xu4+Yi98Il4ETOAav6okqA 03bc46e5fac6346cd82ff681f756860f98e0c61168633ce23325efde11a1964a', u'name': u'kyr'}, u'changed': False, u'name': u'kyr', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'kyr', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 
1204, u'groups': u'sudo', u'home': u'/home/sseshasa', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1204, '_ansible_item_label': {u'ovpn': u'sseshasa@thinkpad jyB1pr0I3qsDknkTnJTMjg 72ac1456e344c22fd940d0ba0e035aa3819ef7cd3891e53308aa92ba2dec8849', u'name': u'sseshasa'}, 'item': {u'ovpn': u'sseshasa@thinkpad jyB1pr0I3qsDknkTnJTMjg 72ac1456e344c22fd940d0ba0e035aa3819ef7cd3891e53308aa92ba2dec8849', u'name': u'sseshasa'}, u'changed': False, u'name': u'sseshasa', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'sseshasa', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 1205, u'groups': u'sudo', u'home': u'/home/rfriedma', u'move_home': False, u'append': True, '_ansible_no_log': False, 'ansible_loop_var': u'item', u'group': 1205, '_ansible_item_label': {u'ovpn': u'rfriedma@rflap.redhat.com 5+OUPoyz8K0M0kcymdQOjA 40ce705001f31d7156c965228938cd4b02ae1a2c43dac1bbcd1b538e70312189', u'name': u'rfriedma'}, 'item': {u'ovpn': u'rfriedma@rflap.redhat.com 5+OUPoyz8K0M0kcymdQOjA 40ce705001f31d7156c965228938cd4b02ae1a2c43dac1bbcd1b538e70312189', u'name': u'rfriedma'}, u'changed': False, u'name': u'rfriedma', 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': 
None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'rfriedma', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 10103, '_ansible_no_log': False, u'create_home': True, u'groups': u'sudo', u'home': u'/home/lmb', '_ansible_item_label': {u'ovpn': u'lmb@hermes LMS8kAikL0iqw2S6IbXa3Q f57a493b31e7ed02a2563dd4295278d4842dc698b4c635d011a8d2b4b1fd5c2b', u'name': u'lmb'}, 'ansible_loop_var': u'item', u'group': 10104, u'name': u'lmb', 'item': {u'ovpn': u'lmb@hermes LMS8kAikL0iqw2S6IbXa3Q f57a493b31e7ed02a2563dd4295278d4842dc698b4c635d011a8d2b4b1fd5c2b', u'name': u'lmb'}, u'changed': True, u'system': False, 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'lmb', u'local': None, u'seuser': None, u'remove': 
False, u'login_class': None, u'generate_ssh_key': None}}}, {u'comment': u'', u'shell': u'/bin/bash', u'uid': 10104, '_ansible_no_log': False, u'create_home': True, u'groups': u'sudo', u'home': u'/home/ksirivad', '_ansible_item_label': {u'ovpn': u'ksirivad@ksirivad prZzE08FqnY6hFtGk0Q6XQ 2ef0878b0050cf28775813fe0991f9a746c07e61920280ce29ee69088eda5efc', u'name': u'ksirivad'}, 'ansible_loop_var': u'item', u'group': 10105, u'name': u'ksirivad', 'item': {u'ovpn': u'ksirivad@ksirivad prZzE08FqnY6hFtGk0Q6XQ 2ef0878b0050cf28775813fe0991f9a746c07e61920280ce29ee69088eda5efc', u'name': u'ksirivad'}, u'changed': True, u'system': False, 'failed': False, u'state': u'present', u'invocation': {u'module_args': {u'comment': None, u'ssh_key_bits': 0, u'update_password': u'always', u'non_unique': False, u'force': False, u'skeleton': None, u'create_home': True, u'password_lock': None, u'ssh_key_passphrase': None, u'home': None, u'append': True, u'uid': None, u'ssh_key_comment': u'ansible-generated on smithi168.front.sepia.ceph.com', u'group': None, u'system': False, u'state': u'present', u'role': None, u'hidden': None, u'ssh_key_type': u'rsa', u'authorization': None, u'profile': None, u'shell': u'/bin/bash', u'expires': None, u'ssh_key_file': None, u'groups': [u'sudo'], u'move_home': False, u'password': None, u'name': u'ksirivad', u'local': None, u'seuser': None, u'remove': False, u'login_class': None, u'generate_ssh_key': None}}}]}}
Traceback (most recent call last):
  File "/home/teuthworker/src/git.ceph.com_git_ceph-cm-ansible_master/callback_plugins/failure_log.py", line 44, in log_failure
    log.error(yaml.safe_dump(failure))
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 309, in safe_dump
    return dump_all([data], stream, Dumper=SafeDumper, **kwds)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 281, in dump_all
    dumper.represent(data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 29, in represent
    node = self.represent_data(data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data
    node = self.yaml_representers[data_types[0]](self, data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 227, in represent_dict
    return self.represent_mapping(u'tag:yaml.org,2002:map', data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 125, in represent_mapping
    node_value = self.represent_data(item_value)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data
    node = self.yaml_representers[data_types[0]](self, data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 227, in represent_dict
    return self.represent_mapping(u'tag:yaml.org,2002:map', data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 125, in represent_mapping
    node_value = self.represent_data(item_value)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data
    node = self.yaml_representers[data_types[0]](self, data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 219, in represent_list
    return self.represent_sequence(u'tag:yaml.org,2002:seq', data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 102, in represent_sequence
    node_item = self.represent_data(item)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 58, in represent_data
    node = self.yaml_representers[data_types[0]](self, data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 227, in represent_dict
    return self.represent_mapping(u'tag:yaml.org,2002:map', data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 125, in represent_mapping
    node_value = self.represent_data(item_value)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 68, in represent_data
    node = self.yaml_representers[None](self, data)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/virtualenv/local/lib/python2.7/site-packages/yaml/representer.py", line 251, in represent_undefined
    raise RepresenterError("cannot represent an object", data)
RepresenterError: ('cannot represent an object', u'') |
||||||||||||||
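The `RepresenterError` in the failure log above is PyYAML's `SafeDumper` refusing to serialize a type it does not recognize: `safe_dump` matches only exact built-in types, so subclasses of `str`/`unicode` (such as Ansible's tagged-string values) fall through to `represent_undefined`. A minimal sketch of the failure mode, assuming PyYAML is installed; `UnsafeText` here is a hypothetical stand-in, not Ansible's actual class:

```python
import yaml


class UnsafeText(str):
    """Stand-in for an Ansible-style str subclass; not the real type."""


# SafeDumper looks up the exact type of the value; a str subclass is not
# registered, so PyYAML falls back to represent_undefined and raises.
caught = None
try:
    yaml.safe_dump({"msg": UnsafeText("boom")})
except yaml.representer.RepresenterError as err:
    caught = err

print(type(caught).__name__, caught)
```

Dumping the same structure after coercing values to plain `str` (or using `yaml.dump` with a permissive dumper) avoids the error, which is why callback plugins that log arbitrary task results tend to sanitize them first.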
fail | 4201218 | 2019-08-08 23:35:01 | 2019-08-10 21:46:00 | 2019-08-10 22:05:59 | 0:19:59 | 0:11:49 | 0:08:10 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi094 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
pass | 4201219 | 2019-08-08 23:35:02 | 2019-08-10 21:46:03 | 2019-08-11 00:20:04 | 2:34:01 | 2:11:26 | 0:22:35 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4201220 | 2019-08-08 23:35:03 | 2019-08-10 21:46:42 | 2019-08-10 22:16:42 | 0:30:00 | 0:11:31 | 0:18:29 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201221 | 2019-08-08 23:35:04 | 2019-08-10 21:46:52 | 2019-08-10 22:06:51 | 0:19:59 | 0:06:48 | 0:13:11 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi104 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
fail | 4201222 | 2019-08-08 23:35:05 | 2019-08-10 21:46:59 | 2019-08-10 22:30:58 | 0:43:59 | 0:15:31 | 0:28:28 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi068 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
fail | 4201223 | 2019-08-08 23:35:05 | 2019-08-10 21:47:19 | 2019-08-10 22:37:19 | 0:50:00 | 0:16:51 | 0:33:09 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi176 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
pass | 4201224 | 2019-08-08 23:35:06 | 2019-08-10 21:47:28 | 2019-08-10 22:23:28 | 0:36:00 | 0:21:34 | 0:14:26 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
fail | 4201225 | 2019-08-08 23:35:07 | 2019-08-10 21:49:21 | 2019-08-10 23:03:21 | 1:14:00 | 0:14:16 | 0:59:44 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi196 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
fail | 4201226 | 2019-08-08 23:35:08 | 2019-08-10 21:50:43 | 2019-08-10 22:40:43 | 0:50:00 | 0:17:17 | 0:32:43 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi183 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
fail | 4201227 | 2019-08-08 23:35:09 | 2019-08-10 21:51:09 | 2019-08-10 22:41:09 | 0:50:00 | 0:15:44 | 0:34:16 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi200 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
pass | 4201228 | 2019-08-08 23:35:10 | 2019-08-10 21:51:31 | 2019-08-10 23:01:31 | 1:10:00 | 0:16:01 | 0:53:59 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4201229 | 2019-08-08 23:35:10 | 2019-08-10 21:52:24 | 2019-08-11 00:26:26 | 2:34:02 | 0:10:49 | 2:23:13 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi080 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
pass | 4201230 | 2019-08-08 23:35:11 | 2019-08-10 21:52:24 | 2019-08-11 00:38:26 | 2:46:02 | 0:28:09 | 2:17:53 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
fail | 4201231 | 2019-08-08 23:35:12 | 2019-08-10 21:52:27 | 2019-08-11 02:10:30 | 4:18:03 | 0:10:20 | 4:07:43 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi065 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
pass | 4201232 | 2019-08-08 23:35:13 | 2019-08-10 21:54:07 | 2019-08-10 22:20:06 | 0:25:59 | 0:12:47 | 0:13:12 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4201233 | 2019-08-08 23:35:14 | 2019-08-10 21:54:35 | 2019-08-10 23:12:35 | 1:18:00 | 0:15:10 | 1:02:50 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4201234 | 2019-08-08 23:35:15 | 2019-08-10 21:54:35 | 2019-08-10 23:04:35 | 1:10:00 | 0:54:14 | 0:15:46 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4201235 | 2019-08-08 23:35:15 | 2019-08-10 21:55:21 | 2019-08-10 22:25:20 | 0:29:59 | 0:06:53 | 0:23:06 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi106 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
fail | 4201236 | 2019-08-08 23:35:16 | 2019-08-10 21:56:04 | 2019-08-10 22:32:04 | 0:36:00 | 0:14:45 | 0:21:15 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi093 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
fail | 4201237 | 2019-08-08 23:35:17 | 2019-08-10 21:56:57 | 2019-08-10 22:18:56 | 0:21:59 | 0:11:32 | 0:10:27 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi153 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
pass | 4201238 | 2019-08-08 23:35:18 | 2019-08-10 21:56:57 | 2019-08-10 23:04:57 | 1:08:00 | 0:56:07 | 0:11:53 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
pass | 4201239 | 2019-08-08 23:35:18 | 2019-08-10 21:57:10 | 2019-08-10 23:07:10 | 1:10:00 | 0:13:35 | 0:56:25 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
fail | 4201240 | 2019-08-08 23:35:19 | 2019-08-10 21:57:23 | 2019-08-10 22:33:22 | 0:35:59 | 0:15:13 | 0:20:46 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi006 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
fail | 4201241 | 2019-08-08 23:35:20 | 2019-08-10 21:58:15 | 2019-08-10 23:02:15 | 1:04:00 | 0:14:31 | 0:49:29 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi069 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
pass | 4201242 | 2019-08-08 23:35:21 | 2019-08-10 21:58:43 | 2019-08-10 23:14:43 | 1:16:00 | 0:48:53 | 0:27:07 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
pass | 4201243 | 2019-08-08 23:35:22 | 2019-08-10 21:59:49 | 2019-08-10 22:41:48 | 0:41:59 | 0:28:38 | 0:13:21 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
fail | 4201244 | 2019-08-08 23:35:22 | 2019-08-10 22:00:07 | 2019-08-10 22:22:06 | 0:21:59 | 0:11:07 | 0:10:52 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi094 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
pass | 4201245 | 2019-08-08 23:35:23 | 2019-08-10 22:01:58 | 2019-08-10 23:59:59 | 1:58:01 | 0:12:04 | 1:45:57 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201246 | 2019-08-08 23:35:24 | 2019-08-10 22:03:33 | 2019-08-10 23:29:33 | 1:26:00 | 0:06:53 | 1:19:07 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi067 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
pass | 4201247 | 2019-08-08 23:35:25 | 2019-08-10 22:03:52 | 2019-08-10 22:59:52 | 0:56:00 | 0:24:32 | 0:31:28 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
fail | 4201248 | 2019-08-08 23:35:26 | 2019-08-10 22:05:11 | 2019-08-10 22:41:10 | 0:35:59 | 0:06:58 | 0:29:01 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi121 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
fail | 4201249 | 2019-08-08 23:35:26 | 2019-08-10 22:06:01 | 2019-08-10 23:06:01 | 1:00:00 | 0:11:54 | 0:48:06 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi086 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
fail | 4201250 | 2019-08-08 23:35:27 | 2019-08-10 22:06:27 | 2019-08-10 22:24:26 | 0:17:59 | 0:07:00 | 0:10:59 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi202 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
fail | 4201251 | 2019-08-08 23:35:28 | 2019-08-10 22:06:35 | 2019-08-10 22:56:34 | 0:49:59 | 0:26:56 | 0:23:03 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-10T22:42:29.794670+0000 mon.b (mon.0) 608 : cluster [WRN] Health check failed: 3 daemons have recently crashed (RECENT_CRASH)" in cluster log |
||||||||||||||
pass | 4201252 | 2019-08-08 23:35:29 | 2019-08-10 22:06:53 | 2019-08-10 22:58:52 | 0:51:59 | 0:40:20 | 0:11:39 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
fail | 4201253 | 2019-08-08 23:35:30 | 2019-08-10 22:08:48 | 2019-08-11 00:18:50 | 2:10:02 | 0:13:47 | 1:56:15 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi057 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201254 | 2019-08-08 23:35:30 | 2019-08-10 22:09:08 | 2019-08-10 23:01:08 | 0:52:00 | 0:14:42 | 0:37:18 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi041 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201255 | 2019-08-08 23:35:31 | 2019-08-10 22:09:33 | 2019-08-10 22:59:32 | 0:49:59 | 0:16:29 | 0:33:30 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
fail | 4201256 | 2019-08-08 23:35:32 | 2019-08-10 22:10:17 | 2019-08-10 22:38:16 | 0:27:59 | 0:17:49 | 0:10:10 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi089 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201257 | 2019-08-08 23:35:33 | 2019-08-10 22:10:30 | 2019-08-10 22:46:29 | 0:35:59 | 0:16:35 | 0:19:24 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4201258 | 2019-08-08 23:35:34 | 2019-08-10 22:11:02 | 2019-08-10 23:15:02 | 1:04:00 | 0:55:36 | 0:08:24 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4201259 | 2019-08-08 23:35:34 | 2019-08-10 22:11:41 | 2019-08-10 22:37:40 | 0:25:59 | 0:16:51 | 0:09:08 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi205 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201260 | 2019-08-08 23:35:35 | 2019-08-10 22:12:58 | 2019-08-11 00:16:59 | 2:04:01 | 0:45:33 | 1:18:28 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
fail | 4201261 | 2019-08-08 23:35:36 | 2019-08-10 22:12:58 | 2019-08-10 22:34:57 | 0:21:59 | 0:06:50 | 0:15:09 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi153 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201262 | 2019-08-08 23:35:37 | 2019-08-10 22:12:58 | 2019-08-10 23:02:57 | 0:49:59 | 0:15:21 | 0:34:38 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi059 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201263 | 2019-08-08 23:35:38 | 2019-08-10 22:13:01 | 2019-08-10 22:41:01 | 0:28:00 | 0:15:56 | 0:12:04 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi154 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201264 | 2019-08-08 23:35:38 | 2019-08-10 22:15:36 | 2019-08-10 22:55:36 | 0:40:00 | 0:11:10 | 0:28:50 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
pass | 4201265 | 2019-08-08 23:35:39 | 2019-08-10 22:15:40 | 2019-08-10 23:17:40 | 1:02:00 | 0:13:41 | 0:48:19 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
fail | 4201266 | 2019-08-08 23:35:40 | 2019-08-10 22:16:57 | 2019-08-10 22:40:56 | 0:23:59 | 0:14:49 | 0:09:10 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi112 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201267 | 2019-08-08 23:35:41 | 2019-08-10 22:19:12 | 2019-08-10 22:43:11 | 0:23:59 | 0:13:36 | 0:10:23 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi094 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201268 | 2019-08-08 23:35:42 | 2019-08-10 22:19:12 | 2019-08-11 00:29:13 | 2:10:01 | 1:56:49 | 0:13:12 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4201269 | 2019-08-08 23:35:42 | 2019-08-10 22:19:12 | 2019-08-10 22:45:11 | 0:25:59 | 0:12:53 | 0:13:06 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201270 | 2019-08-08 23:35:43 | 2019-08-10 22:20:08 | 2019-08-10 23:00:07 | 0:39:59 | 0:06:49 | 0:33:10 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi033 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201271 | 2019-08-08 23:35:44 | 2019-08-10 22:20:37 | 2019-08-10 23:26:37 | 1:06:00 | 0:36:36 | 0:29:24 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
fail | 4201272 | 2019-08-08 23:35:45 | 2019-08-10 22:20:38 | 2019-08-10 23:04:38 | 0:44:00 | 0:12:27 | 0:31:33 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi120 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201273 | 2019-08-08 23:35:45 | 2019-08-10 22:21:02 | 2019-08-10 23:13:01 | 0:51:59 | 0:21:00 | 0:30:59 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
fail | 4201274 | 2019-08-08 23:35:46 | 2019-08-10 22:22:22 | 2019-08-10 23:08:21 | 0:45:59 | 0:11:38 | 0:34:21 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi037 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201275 | 2019-08-08 23:35:47 | 2019-08-10 22:22:25 | 2019-08-11 00:00:26 | 1:38:01 | 0:56:13 | 0:41:48 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
fail | 4201276 | 2019-08-08 23:35:48 | 2019-08-10 22:22:26 | 2019-08-10 22:52:25 | 0:29:59 | 0:11:56 | 0:18:03 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi153 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201277 | 2019-08-08 23:35:49 | 2019-08-10 22:23:42 | 2019-08-10 22:51:42 | 0:28:00 | 0:16:28 | 0:11:32 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4201278 | 2019-08-08 23:35:50 | 2019-08-10 22:24:28 | 2019-08-10 23:12:27 | 0:47:59 | 0:10:13 | 0:37:46 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi079 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201279 | 2019-08-08 23:35:50 | 2019-08-10 22:24:30 | 2019-08-10 22:58:29 | 0:33:59 | 0:18:21 | 0:15:38 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
fail | 4201280 | 2019-08-08 23:35:51 | 2019-08-10 22:25:13 | 2019-08-11 00:41:19 | 2:16:06 | 0:13:10 | 2:02:56 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi059 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201281 | 2019-08-08 23:35:52 | 2019-08-10 22:25:22 | 2019-08-11 00:35:23 | 2:10:01 | 0:13:35 | 1:56:26 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4201282 | 2019-08-08 23:35:53 | 2019-08-10 22:27:07 | 2019-08-10 23:21:07 | 0:54:00 | 0:16:00 | 0:38:00 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4201283 | 2019-08-08 23:35:54 | 2019-08-10 22:27:07 | 2019-08-11 00:05:08 | 1:38:01 | 0:57:26 | 0:40:35 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/lockdep.yaml} | 3 | |||
fail | 4201284 | 2019-08-08 23:35:54 | 2019-08-10 22:28:24 | 2019-08-11 00:38:25 | 2:10:01 | 0:06:51 | 2:03:10 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi136 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201285 | 2019-08-08 23:35:55 | 2019-08-10 22:30:47 | 2019-08-10 22:56:46 | 0:25:59 | 0:12:57 | 0:13:02 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi132 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201286 | 2019-08-08 23:35:56 | 2019-08-10 22:31:00 | 2019-08-10 22:58:59 | 0:27:59 | 0:06:57 | 0:21:02 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi074 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201287 | 2019-08-08 23:35:57 | 2019-08-10 22:32:19 | 2019-08-11 00:14:20 | 1:42:01 | 1:06:06 | 0:35:55 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
fail | 4201288 | 2019-08-08 23:35:58 | 2019-08-10 22:33:37 | 2019-08-10 23:01:37 | 0:28:00 | 0:15:19 | 0:12:41 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi159 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201289 | 2019-08-08 23:35:58 | 2019-08-10 22:33:59 | 2019-08-10 22:55:59 | 0:22:00 | 0:12:45 | 0:09:15 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi204 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201290 | 2019-08-08 23:35:59 | 2019-08-10 22:35:01 | 2019-08-10 23:01:00 | 0:25:59 | 0:14:33 | 0:11:26 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi112 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201291 | 2019-08-08 23:36:00 | 2019-08-10 22:35:01 | 2019-08-11 00:37:02 | 2:02:01 | 0:51:56 | 1:10:05 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
fail | 4201292 | 2019-08-08 23:36:01 | 2019-08-10 22:37:34 | 2019-08-10 23:15:34 | 0:38:00 | 0:06:53 | 0:31:07 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi102 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201293 | 2019-08-08 23:36:02 | 2019-08-10 22:37:42 | 2019-08-11 00:15:42 | 1:38:00 | 0:13:18 | 1:24:42 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi065 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201294 | 2019-08-08 23:36:02 | 2019-08-10 22:37:47 | 2019-08-10 23:25:47 | 0:48:00 | 0:11:40 | 0:36:20 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201295 | 2019-08-08 23:36:03 | 2019-08-10 22:38:03 | 2019-08-10 22:58:02 | 0:19:59 | 0:06:49 | 0:13:10 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi052 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201296 | 2019-08-08 23:36:04 | 2019-08-10 22:38:20 | 2019-08-10 23:28:19 | 0:49:59 | 0:26:22 | 0:23:37 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
pass | 4201297 | 2019-08-08 23:36:05 | 2019-08-10 22:40:53 | 2019-08-10 23:22:52 | 0:41:59 | 0:12:29 | 0:29:30 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
fail | 4201298 | 2019-08-08 23:36:06 | 2019-08-10 22:40:53 | 2019-08-11 00:16:59 | 1:36:06 | 0:13:29 | 1:22:37 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi191 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201299 | 2019-08-08 23:36:06 | 2019-08-10 22:40:58 | 2019-08-10 23:08:57 | 0:27:59 | 0:07:03 | 0:20:56 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi153 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201300 | 2019-08-08 23:36:07 | 2019-08-10 22:41:02 | 2019-08-11 00:21:03 | 1:40:01 | 0:22:18 | 1:17:43 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-11T00:12:51.653779+0000 mon.a (mon.0) 880 : cluster [WRN] Health check failed: 4 daemons have recently crashed (RECENT_CRASH)" in cluster log |
fail | 4201301 | 2019-08-08 23:36:08 | 2019-08-10 22:41:11 | 2019-08-10 23:35:11 | 0:54:00 | 0:14:37 | 0:39:23 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi198 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201302 | 2019-08-08 23:36:09 | 2019-08-10 22:41:12 | 2019-08-10 23:29:11 | 0:47:59 | 0:12:25 | 0:35:34 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi200 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201303 | 2019-08-08 23:36:10 | 2019-08-10 22:41:51 | 2019-08-10 23:21:50 | 0:39:59 | 0:10:55 | 0:29:04 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi120 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201304 | 2019-08-08 23:36:11 | 2019-08-10 22:41:51 | 2019-08-10 23:11:50 | 0:29:59 | 0:16:13 | 0:13:46 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
fail | 4201305 | 2019-08-08 23:36:11 | 2019-08-10 22:43:24 | 2019-08-10 23:17:23 | 0:33:59 | 0:06:56 | 0:27:03 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi112 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201306 | 2019-08-08 23:36:12 | 2019-08-10 22:44:41 | 2019-08-10 23:58:41 | 1:14:00 | 0:19:17 | 0:54:43 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4201307 | 2019-08-08 23:36:13 | 2019-08-10 22:44:45 | 2019-08-10 23:20:44 | 0:35:59 | 0:15:24 | 0:20:35 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4201308 | 2019-08-08 23:36:14 | 2019-08-10 22:45:12 | 2019-08-10 23:43:12 | 0:58:00 | 0:11:50 | 0:46:10 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi166 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201309 | 2019-08-08 23:36:15 | 2019-08-10 22:46:03 | 2019-08-10 23:44:02 | 0:57:59 | 0:32:07 | 0:25:52 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
pass | 4201310 | 2019-08-08 23:36:15 | 2019-08-10 22:46:31 | 2019-08-10 23:22:30 | 0:35:59 | 0:15:31 | 0:20:28 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 4201311 | 2019-08-08 23:36:16 | 2019-08-10 22:46:49 | 2019-08-10 23:26:48 | 0:39:59 | 0:12:35 | 0:27:24 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi125 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201312 | 2019-08-08 23:36:17 | 2019-08-10 22:46:58 | 2019-08-10 23:14:57 | 0:27:59 | 0:10:28 | 0:17:31 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi114 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201313 | 2019-08-08 23:36:18 | 2019-08-10 22:48:12 | 2019-08-10 23:24:17 | 0:36:05 | 0:12:21 | 0:23:44 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
pass | 4201314 | 2019-08-08 23:36:19 | 2019-08-10 22:51:44 | 2019-08-11 00:25:44 | 1:34:00 | 0:13:43 | 1:20:17 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
fail | 4201315 | 2019-08-08 23:36:20 | 2019-08-10 22:52:27 | 2019-08-11 00:16:27 | 1:24:00 | 0:13:47 | 1:10:13 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi202 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201316 | 2019-08-08 23:36:20 | 2019-08-10 22:53:59 | 2019-08-10 23:59:59 | 1:06:00 | 0:10:19 | 0:55:41 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi137 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201317 | 2019-08-08 23:36:21 | 2019-08-10 22:54:43 | 2019-08-11 01:32:45 | 2:38:02 | 1:56:28 | 0:41:34 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4201318 | 2019-08-08 23:36:22 | 2019-08-10 22:55:35 | 2019-08-11 00:05:35 | 1:10:00 | 0:11:21 | 0:58:39 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201319 | 2019-08-08 23:36:23 | 2019-08-10 22:55:37 | 2019-08-10 23:13:37 | 0:18:00 | 0:06:50 | 0:11:10 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi176 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201320 | 2019-08-08 23:36:24 | 2019-08-10 22:56:00 | 2019-08-11 01:44:02 | 2:48:02 | 1:03:19 | 1:44:43 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
fail | 4201321 | 2019-08-08 23:36:24 | 2019-08-10 22:56:02 | 2019-08-10 23:18:02 | 0:22:00 | 0:10:48 | 0:11:12 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi181 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201322 | 2019-08-08 23:36:25 | 2019-08-10 22:56:51 | 2019-08-10 23:48:51 | 0:52:00 | 0:21:23 | 0:30:37 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
pass | 4201323 | 2019-08-08 23:36:26 | 2019-08-10 22:56:51 | 2019-08-11 04:18:56 | 5:22:05 | 0:18:07 | 5:03:58 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
fail | 4201324 | 2019-08-08 23:36:27 | 2019-08-10 22:58:17 | 2019-08-10 23:32:16 | 0:33:59 | 0:14:27 | 0:19:32 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi091 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201325 | 2019-08-08 23:36:27 | 2019-08-10 22:58:31 | 2019-08-11 00:34:31 | 1:36:00 | 0:12:13 | 1:23:47 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi205 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
||||||||||||||
pass | 4201326 | 2019-08-08 23:36:28 | 2019-08-10 22:58:48 | 2019-08-11 00:52:49 | 1:54:01 | 0:17:54 | 1:36:07 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4201327 | 2019-08-08 23:36:29 | 2019-08-10 22:58:54 | 2019-08-10 23:22:53 | 0:23:59 | 0:10:52 | 0:13:07 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi166 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201328 | 2019-08-08 23:36:30 | 2019-08-10 22:59:01 | 2019-08-10 23:19:00 | 0:19:59 | 0:07:01 | 0:12:58 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi121 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201329 | 2019-08-08 23:36:31 | 2019-08-10 22:59:47 | 2019-08-10 23:17:47 | 0:18:00 | 0:10:43 | 0:07:17 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi044 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201330 | 2019-08-08 23:36:31 | 2019-08-10 22:59:53 | 2019-08-10 23:55:53 | 0:56:00 | 0:12:45 | 0:43:15 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4201331 | 2019-08-08 23:36:32 | 2019-08-10 23:00:09 | 2019-08-10 23:34:08 | 0:33:59 | 0:20:24 | 0:13:35 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4201332 | 2019-08-08 23:36:33 | 2019-08-10 23:01:15 | 2019-08-11 00:29:16 | 1:28:01 | 0:16:10 | 1:11:51 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4201333 | 2019-08-08 23:36:34 | 2019-08-10 23:01:15 | 2019-08-11 01:25:17 | 2:24:02 | 0:22:47 | 2:01:15 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
"2019-08-11T01:16:54.302508+0000 mon.a (mon.0) 812 : cluster [WRN] Health check failed: 3 daemons have recently crashed (RECENT_CRASH)" in cluster log |
fail | 4201334 | 2019-08-08 23:36:35 | 2019-08-10 23:01:32 | 2019-08-10 23:41:32 | 0:40:00 | 0:13:10 | 0:26:50 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason: Command failed on smithi191 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4201335 | 2019-08-08 23:36:36 | 2019-08-10 23:01:38 | 2019-08-11 01:21:40 | 2:20:02 | 0:11:28 | 2:08:34 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason: Command failed on smithi026 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4201336 | 2019-08-08 23:36:36 | 2019-08-10 23:02:30 | 2019-08-11 00:16:30 | 1:14:00 | 0:50:26 | 0:23:34 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
pass | 4201337 | 2019-08-08 23:36:37 | 2019-08-10 23:02:59 | 2019-08-10 23:38:58 | 0:35:59 | 0:14:33 | 0:21:26 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
fail | 4201338 | 2019-08-08 23:36:38 | 2019-08-10 23:03:22 | 2019-08-11 00:53:23 | 1:50:01 | 0:10:28 | 1:39:33 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason: Command failed on smithi134 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4201339 | 2019-08-08 23:36:39 | 2019-08-10 23:04:51 | 2019-08-10 23:52:51 | 0:48:00 | 0:10:56 | 0:37:04 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi093 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4201340 | 2019-08-08 23:36:40 | 2019-08-10 23:04:51 | 2019-08-11 00:08:51 | 1:04:00 | 0:51:55 | 0:12:05 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
fail | 4201341 | 2019-08-08 23:36:41 | 2019-08-10 23:04:58 | 2019-08-10 23:36:58 | 0:32:00 | 0:14:26 | 0:17:34 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason: Command failed on smithi179 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4201342 | 2019-08-08 23:36:42 | 2019-08-10 23:06:18 | 2019-08-10 23:34:17 | 0:27:59 | 0:13:28 | 0:14:31 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi113 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4201343 | 2019-08-08 23:36:42 | 2019-08-10 23:06:19 | 2019-08-10 23:30:19 | 0:24:00 | 0:11:21 | 0:12:39 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201344 | 2019-08-08 23:36:43 | 2019-08-10 23:07:12 | 2019-08-11 01:17:13 | 2:10:01 | 0:06:47 | 2:03:14 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason: Command failed on smithi069 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4201345 | 2019-08-08 23:36:44 | 2019-08-10 23:08:37 | 2019-08-10 23:54:37 | 0:46:00 | 0:31:40 | 0:14:20 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
fail | 4201346 | 2019-08-08 23:36:45 | 2019-08-10 23:08:59 | 2019-08-10 23:40:58 | 0:31:59 | 0:14:30 | 0:17:29 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason: Command failed on smithi202 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4201347 | 2019-08-08 23:36:46 | 2019-08-10 23:11:01 | 2019-08-10 23:41:00 | 0:29:59 | 0:13:10 | 0:16:49 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason: Command failed on smithi133 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4201348 | 2019-08-08 23:36:46 | 2019-08-10 23:11:52 | 2019-08-11 00:25:52 | 1:14:00 | 0:06:46 | 1:07:14 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi169 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4201349 | 2019-08-08 23:36:47 | 2019-08-10 23:12:43 | 2019-08-11 00:16:43 | 1:04:00 | 0:26:38 | 0:37:22 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason: "2019-08-11T00:03:55.245681+0000 mon.b (mon.0) 634 : cluster [WRN] Health check failed: 3 daemons have recently crashed (RECENT_CRASH)" in cluster log
pass | 4201350 | 2019-08-08 23:36:48 | 2019-08-10 23:12:43 | 2019-08-11 00:00:43 | 0:48:00 | 0:37:00 | 0:11:00 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
fail | 4201351 | 2019-08-08 23:36:49 | 2019-08-10 23:13:03 | 2019-08-10 23:47:02 | 0:33:59 | 0:11:08 | 0:22:51 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason: Command failed on smithi136 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4201352 | 2019-08-08 23:36:50 | 2019-08-10 23:13:38 | 2019-08-10 23:39:37 | 0:25:59 | 0:13:28 | 0:12:31 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason: Command failed on smithi139 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4201353 | 2019-08-08 23:36:50 | 2019-08-10 23:14:58 | 2019-08-10 23:52:58 | 0:38:00 | 0:14:02 | 0:23:58 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
pass | 4201354 | 2019-08-08 23:36:51 | 2019-08-10 23:14:59 | 2019-08-10 23:52:58 | 0:37:59 | 0:28:24 | 0:09:35 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
pass | 4201355 | 2019-08-08 23:36:52 | 2019-08-10 23:15:04 | 2019-08-11 00:45:09 | 1:30:05 | 0:19:55 | 1:10:10 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4201356 | 2019-08-08 23:36:53 | 2019-08-10 23:15:35 | 2019-08-10 23:57:35 | 0:42:00 | 0:16:58 | 0:25:02 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4201357 | 2019-08-08 23:36:54 | 2019-08-10 23:15:53 | 2019-08-11 00:19:53 | 1:04:00 | 0:13:05 | 0:50:55 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason: Command failed on smithi069 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4201358 | 2019-08-08 23:36:54 | 2019-08-10 23:17:05 | 2019-08-11 00:11:05 | 0:54:00 | 0:37:52 | 0:16:08 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
fail | 4201359 | 2019-08-08 23:36:55 | 2019-08-10 23:17:24 | 2019-08-10 23:41:24 | 0:24:00 | 0:14:13 | 0:09:47 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi121 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4201360 | 2019-08-08 23:36:56 | 2019-08-10 23:17:42 | 2019-08-10 23:41:41 | 0:23:59 | 0:12:39 | 0:11:20 | smithi | master | ubuntu | 18.04 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
fail | 4201361 | 2019-08-08 23:36:57 | 2019-08-10 23:17:48 | 2019-08-11 00:53:49 | 1:36:01 | 0:10:32 | 1:25:29 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason: Command failed on smithi101 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4201362 | 2019-08-08 23:36:58 | 2019-08-10 23:18:03 | 2019-08-11 01:02:04 | 1:44:01 | 0:11:42 | 1:32:19 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201363 | 2019-08-08 23:36:58 | 2019-08-10 23:19:16 | 2019-08-11 00:29:16 | 1:10:00 | 0:06:50 | 1:03:10 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason: Command failed on smithi083 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4201364 | 2019-08-08 23:36:59 | 2019-08-10 23:20:59 | 2019-08-10 23:56:59 | 0:36:00 | 0:10:43 | 0:25:17 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi191 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4201365 | 2019-08-08 23:37:00 | 2019-08-10 23:21:08 | 2019-08-11 00:23:08 | 1:02:00 | 0:11:55 | 0:50:05 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason: Command failed on smithi181 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4201366 | 2019-08-08 23:37:01 | 2019-08-10 23:21:52 | 2019-08-11 01:39:53 | 2:18:01 | 2:05:09 | 0:12:52 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4201367 | 2019-08-08 23:37:02 | 2019-08-10 23:22:34 | 2019-08-11 00:00:34 | 0:38:00 | 0:11:19 | 0:26:41 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201368 | 2019-08-08 23:37:02 | 2019-08-10 23:22:54 | 2019-08-11 00:36:54 | 1:14:00 | 0:06:54 | 1:07:06 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason: Command failed on smithi161 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4201369 | 2019-08-08 23:37:03 | 2019-08-10 23:22:55 | 2019-08-10 23:48:54 | 0:25:59 | 0:12:21 | 0:13:38 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason: Command failed on smithi156 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4201370 | 2019-08-08 23:37:04 | 2019-08-10 23:23:24 | 2019-08-11 01:03:25 | 1:40:01 | 0:10:36 | 1:29:25 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason: Command failed on smithi012 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
pass | 4201371 | 2019-08-08 23:37:05 | 2019-08-10 23:24:21 | 2019-08-11 00:06:21 | 0:42:00 | 0:21:14 | 0:20:46 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
fail | 4201372 | 2019-08-08 23:37:06 | 2019-08-10 23:25:48 | 2019-08-11 00:15:48 | 0:50:00 | 0:14:29 | 0:35:31 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason: Command failed on smithi197 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel'
fail | 4201373 | 2019-08-08 23:37:07 | 2019-08-10 23:26:19 | 2019-08-11 00:04:18 | 0:37:59 | 0:06:45 | 0:31:14 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi059 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201374 | 2019-08-08 23:37:07 | 2019-08-10 23:26:53 | 2019-08-11 00:18:53 | 0:52:00 | 0:13:25 | 0:38:35 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi116 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201375 | 2019-08-08 23:37:08 | 2019-08-10 23:26:53 | 2019-08-11 00:24:53 | 0:58:00 | 0:16:40 | 0:41:20 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4201376 | 2019-08-08 23:37:09 | 2019-08-10 23:28:35 | 2019-08-11 00:38:35 | 1:10:00 | 0:13:29 | 0:56:31 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi191 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201377 | 2019-08-08 23:37:10 | 2019-08-10 23:29:13 | 2019-08-10 23:55:12 | 0:25:59 | 0:11:53 | 0:14:06 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi019 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201378 | 2019-08-08 23:37:11 | 2019-08-10 23:29:49 | 2019-08-11 00:57:49 | 1:28:00 | 0:10:54 | 1:17:06 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi196 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201379 | 2019-08-08 23:37:11 | 2019-08-10 23:30:20 | 2019-08-11 00:04:19 | 0:33:59 | 0:12:42 | 0:21:17 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |||
pass | 4201380 | 2019-08-08 23:37:12 | 2019-08-10 23:30:22 | 2019-08-11 00:04:21 | 0:33:59 | 0:19:15 | 0:14:44 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4201381 | 2019-08-08 23:37:13 | 2019-08-10 23:31:21 | 2019-08-11 01:03:22 | 1:32:01 | 0:16:53 | 1:15:08 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4201382 | 2019-08-08 23:37:14 | 2019-08-10 23:32:18 | 2019-08-11 00:50:18 | 1:18:00 | 0:06:49 | 1:11:11 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi202 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201383 | 2019-08-08 23:37:15 | 2019-08-10 23:34:09 | 2019-08-11 03:10:12 | 3:36:03 | 0:10:04 | 3:25:59 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
Failure Reason:
Command failed on smithi018 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201384 | 2019-08-08 23:37:15 | 2019-08-10 23:34:10 | 2019-08-11 00:26:09 | 0:51:59 | 0:11:47 | 0:40:12 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi204 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201385 | 2019-08-08 23:37:16 | 2019-08-10 23:34:19 | 2019-08-11 02:10:20 | 2:36:01 | 1:21:17 | 1:14:44 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |||
fail | 4201386 | 2019-08-08 23:37:17 | 2019-08-10 23:34:21 | 2019-08-11 00:20:20 | 0:45:59 | 0:07:01 | 0:38:58 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi074 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201387 | 2019-08-08 23:37:18 | 2019-08-10 23:34:44 | 2019-08-10 23:56:43 | 0:21:59 | 0:11:02 | 0:10:57 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi018 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201388 | 2019-08-08 23:37:19 | 2019-08-10 23:35:26 | 2019-08-11 00:37:26 | 1:02:00 | 0:13:22 | 0:48:38 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi143 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201389 | 2019-08-08 23:37:19 | 2019-08-10 23:37:12 | 2019-08-11 00:39:13 | 1:02:01 | 0:50:24 | 0:11:37 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |||
fail | 4201390 | 2019-08-08 23:37:20 | 2019-08-10 23:37:13 | 2019-08-11 00:09:12 | 0:31:59 | 0:06:49 | 0:25:10 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi093 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201391 | 2019-08-08 23:37:21 | 2019-08-10 23:39:13 | 2019-08-10 23:59:12 | 0:19:59 | 0:10:33 | 0:09:26 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi143 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201392 | 2019-08-08 23:37:22 | 2019-08-10 23:39:39 | 2019-08-11 00:43:39 | 1:04:00 | 0:11:50 | 0:52:10 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201393 | 2019-08-08 23:37:22 | 2019-08-10 23:41:00 | 2019-08-11 00:08:59 | 0:27:59 | 0:06:48 | 0:21:11 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi079 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201394 | 2019-08-08 23:37:23 | 2019-08-10 23:41:01 | 2019-08-11 00:39:01 | 0:58:00 | 0:35:34 | 0:22:26 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |||
fail | 4201395 | 2019-08-08 23:37:24 | 2019-08-10 23:41:25 | 2019-08-11 00:41:25 | 1:00:00 | 0:13:53 | 0:46:07 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi200 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201396 | 2019-08-08 23:37:25 | 2019-08-10 23:41:33 | 2019-08-11 00:11:33 | 0:30:00 | 0:11:52 | 0:18:08 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi104 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201397 | 2019-08-08 23:37:26 | 2019-08-10 23:41:53 | 2019-08-11 00:11:52 | 0:29:59 | 0:12:44 | 0:17:15 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi041 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201398 | 2019-08-08 23:37:26 | 2019-08-10 23:43:27 | 2019-08-11 00:29:27 | 0:46:00 | 0:22:20 | 0:23:40 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |||
Failure Reason:
"2019-08-11T00:20:03.530065+0000 mon.b (mon.0) 844 : cluster [WRN] Health check failed: 7 daemons have recently crashed (RECENT_CRASH)" in cluster log |
pass | 4201399 | 2019-08-08 23:37:27 | 2019-08-10 23:44:04 | 2019-08-11 00:36:04 | 0:52:00 | 0:37:51 | 0:14:09 | smithi | master | ubuntu | 18.04 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{ubuntu_latest.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |
fail | 4201400 | 2019-08-08 23:37:28 | 2019-08-10 23:47:17 | 2019-08-11 00:23:17 | 0:36:00 | 0:12:04 | 0:23:56 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi140 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201401 | 2019-08-08 23:37:29 | 2019-08-10 23:49:00 | 2019-08-11 01:01:00 | 1:12:00 | 0:10:36 | 1:01:24 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi074 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201402 | 2019-08-08 23:37:29 | 2019-08-10 23:49:01 | 2019-08-11 01:05:01 | 1:16:00 | 0:15:31 | 1:00:29 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |||
fail | 4201403 | 2019-08-08 23:37:30 | 2019-08-10 23:49:00 | 2019-08-11 00:33:00 | 0:44:00 | 0:06:53 | 0:37:07 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi139 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201404 | 2019-08-08 23:37:31 | 2019-08-10 23:51:28 | 2019-08-11 00:59:28 | 1:08:00 | 0:19:55 | 0:48:05 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
pass | 4201405 | 2019-08-08 23:37:32 | 2019-08-10 23:53:06 | 2019-08-11 02:17:08 | 2:24:02 | 0:18:33 | 2:05:29 | smithi | master | multimds/verify/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{kernel-testing.yaml mount.yaml ms-die-on-skipped.yaml} objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_fsstress.yaml validater/lockdep.yaml} | 3 | |||
fail | 4201406 | 2019-08-08 23:37:33 | 2019-08-10 23:53:07 | 2019-08-11 00:43:06 | 0:49:59 | 0:11:19 | 0:38:40 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi079 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201407 | 2019-08-08 23:37:34 | 2019-08-10 23:53:07 | 2019-08-11 00:59:06 | 1:05:59 | 0:35:57 | 0:30:02 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |||
fail | 4201408 | 2019-08-08 23:37:34 | 2019-08-10 23:54:52 | 2019-08-11 02:12:53 | 2:18:01 | 0:06:57 | 2:11:04 | smithi | master | centos | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi089 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201409 | 2019-08-08 23:37:35 | 2019-08-10 23:55:13 | 2019-08-11 00:53:13 | 0:58:00 | 0:07:05 | 0:50:55 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi161 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201410 | 2019-08-08 23:37:36 | 2019-08-10 23:56:08 | 2019-08-11 00:38:08 | 0:42:00 | 0:13:35 | 0:28:25 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi047 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201411 | 2019-08-08 23:37:37 | 2019-08-10 23:56:45 | 2019-08-11 00:20:44 | 0:23:59 | 0:12:12 | 0:11:47 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201412 | 2019-08-08 23:37:37 | 2019-08-10 23:57:01 | 2019-08-11 01:27:01 | 1:30:00 | 0:11:07 | 1:18:53 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_exports.yaml} | 3 | |
Failure Reason:
Command failed on smithi044 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201413 | 2019-08-08 23:37:38 | 2019-08-10 23:57:50 | 2019-08-11 00:43:49 | 0:45:59 | 0:11:06 | 0:34:53 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi044 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201414 | 2019-08-08 23:37:39 | 2019-08-10 23:58:42 | 2019-08-11 01:58:45 | 2:00:03 | 0:10:02 | 1:50:01 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cephfs_test_snapshots.yaml} | 3 | |
Failure Reason:
Command failed on smithi053 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201415 | 2019-08-08 23:37:40 | 2019-08-10 23:59:26 | 2019-08-11 02:11:28 | 2:12:02 | 1:59:35 | 0:12:27 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_kernel_untar_build.yaml} | 3 | |||
pass | 4201416 | 2019-08-08 23:37:41 | 2019-08-11 00:00:00 | 2019-08-11 00:32:00 | 0:32:00 | 0:11:34 | 0:20:26 | smithi | master | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/none.yaml objectstore-ec/bluestore-comp.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |||
fail | 4201417 | 2019-08-08 23:37:41 | 2019-08-11 00:00:01 | 2019-08-11 00:52:00 | 0:51:59 | 0:06:49 | 0:45:10 | smithi | master | centos | multimds/verify/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml verify/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml}} tasks/cfuse_workunit_suites_dbench.yaml validater/valgrind.yaml} | 3 | ||
Failure Reason:
Command failed on smithi050 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse ceph-debuginfo python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201418 | 2019-08-08 23:37:42 | 2019-08-11 00:00:40 | 2019-08-11 00:40:39 | 0:39:59 | 0:14:10 | 0:25:49 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_misc.yaml} | 3 | |
Failure Reason:
Command failed on smithi163 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201419 | 2019-08-08 23:37:43 | 2019-08-11 00:00:40 | 2019-08-11 01:06:45 | 1:06:05 | 0:10:34 | 0:55:31 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_norstats.yaml} | 3 | |
Failure Reason:
Command failed on smithi202 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201420 | 2019-08-08 23:37:44 | 2019-08-11 00:00:44 | 2019-08-11 01:50:45 | 1:50:01 | 0:21:22 | 1:28:39 | smithi | master | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_blogbench.yaml} | 3 | |||
fail | 4201421 | 2019-08-08 23:37:45 | 2019-08-11 00:04:34 | 2019-08-11 00:32:33 | 0:27:59 | 0:06:57 | 0:21:02 | smithi | master | centos | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mon.yaml clusters/9-mds-3-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{centos_7.yaml}} ms-die-on-skipped.yaml}} msgr-failures/osd-mds-delay.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |
Failure Reason:
Command failed on smithi197 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201422 | 2019-08-08 23:37:46 | 2019-08-11 00:04:34 | 2019-08-11 01:16:34 | 1:12:00 | 0:11:13 | 1:00:47 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-bitmap.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_dbench.yaml} | 3 | |
Failure Reason:
Command failed on smithi086 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201423 | 2019-08-08 23:37:46 | 2019-08-11 00:04:34 | 2019-08-11 05:40:39 | 5:36:05 | 0:10:14 | 5:25:51 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-comp-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_ffsb.yaml} | 3 | |
Failure Reason:
Command failed on smithi195 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
pass | 4201424 | 2019-08-08 23:37:47 | 2019-08-11 00:05:09 | 2019-08-11 00:55:09 | 0:50:00 | 0:15:52 | 0:34:08 | smithi | master | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsstress.yaml} | 3 | |||
fail | 4201425 | 2019-08-08 23:37:48 | 2019-08-11 00:05:50 | 2019-08-11 00:33:50 | 0:28:00 | 0:12:21 | 0:15:39 | smithi | master | rhel | 7.6 | multimds/thrash/{begin.yaml ceph-thrash/mds.yaml clusters/3-mds-2-standby.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} msgr-failures/none.yaml objectstore-ec/filestore-xfs.yaml overrides/{fuse-default-perm-no.yaml thrash/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} thrash_debug.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi022 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201426 | 2019-08-08 23:37:49 | 2019-08-11 00:06:22 | 2019-08-11 02:32:24 | 2:26:02 | 0:10:55 | 2:15:07 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/9-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/kclient/{mount.yaml overrides/{distro/random/{k-testing.yaml supported$/{rhel_7.yaml}} ms-die-on-skipped.yaml}} objectstore-ec/bluestore-ec-root.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_fsx.yaml} | 3 | |
Failure Reason:
Command failed on smithi077 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |
fail | 4201427 | 2019-08-08 23:37:50 | 2019-08-11 00:09:07 | 2019-08-11 00:27:06 | 0:17:59 | 0:10:44 | 0:07:15 | smithi | master | rhel | 7.6 | multimds/basic/{begin.yaml clusters/3-mds.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/kclient/{mount.yaml overrides/{distro/rhel/{k-distro.yaml rhel_7.yaml} ms-die-on-skipped.yaml}} objectstore-ec/filestore-xfs.yaml overrides/{basic/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} fuse-default-perm-no.yaml} q_check_counter/check_counter.yaml tasks/cfuse_workunit_suites_pjd.yaml} | 3 | |
Failure Reason:
Command failed on smithi044 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-cloud ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-ssh ceph-fuse libcephfs2 libcephfs-devel librados2 librbd1 python-ceph rbd-fuse python36-cephfs bison flex elfutils-libelf-devel openssl-devel bison flex elfutils-libelf-devel openssl-devel' |