User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Pass | Fail | Dead | Unknown |
---|---|---|---|---|---|---|---|---|---|---|---|---|
kchai | 2022-08-27 04:01:16 | 2022-08-27 04:04:47 | 2022-08-27 17:13:26 | 13:08:39 | rados | wip-kefu-testing-2022-08-27-0813 | smithi | 4ce1c79 | 11 | 111 | 12 | 1 |
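Nearly all of the 111 failures below reduce to one of two commands: `sudo yum -y install ceph-test` on the centos/rhel jobs, or the `daemon-helper kill ceph-osd` wrapper on the ubuntu jobs. A minimal sketch for tallying the quoted commands from a plain-text dump of this page (the `tally` helper and the abridged sample text are ours, not part of the run):

```python
# Sketch: count failure reasons in a pulpito-style text dump, grouping by
# the quoted command after "Command failed on <host> with status N:".
# Assumes each "Failure Reason:" line is followed by one reason line.
from collections import Counter

dump = """\
Failure Reason:
Command failed on smithi152 with status 1: 'sudo yum -y install ceph-test' |
Failure Reason:
Command failed on smithi050 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
Failure Reason:
Command failed on smithi036 with status 1: 'sudo yum -y install ceph-test' |
"""

def tally(text: str) -> Counter:
    counts = Counter()
    lines = text.splitlines()
    for i, line in enumerate(lines):
        if line.strip() == "Failure Reason:" and i + 1 < len(lines):
            reason = lines[i + 1].rstrip(" |")
            # Drop the per-host prefix so identical commands group together.
            cmd = reason.split(": ", 1)[-1]
            counts[cmd] += 1
    return counts

for cmd, n in tally(dump).most_common():
    print(n, cmd)
```

Run against the full page text, this makes the split between the package-install failures and the ceph-osd startup failures immediately visible.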
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 6995422 | 2022-08-27 04:02:25 | 2022-08-27 04:04:46 | 2022-08-27 04:17:50 | 0:13:04 | 0:07:01 | 0:06:03 | smithi | main | centos | 8.stream | rados/singleton-nomsgr/{all/cache-fs-trunc mon_election/connectivity rados supported-random-distro$/{centos_8}} | 1 | |
Failure Reason:
Command failed on smithi152 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995423 | 2022-08-27 04:02:26 | 2022-08-27 04:04:47 | 2022-08-27 04:19:01 | 0:14:14 | 0:09:06 | 0:05:08 | smithi | main | rhel | 8.6 | rados/standalone/{supported-random-distro$/{rhel_8} workloads/mon} | 1 | |
Failure Reason:
Command failed on smithi036 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995424 | 2022-08-27 04:02:27 | 2022-08-27 04:04:47 | 2022-08-27 04:19:33 | 0:14:46 | 0:08:47 | 0:05:59 | smithi | main | rhel | 8.6 | rados/singleton/{all/mon-config-key-caps mon_election/classic msgr-failures/none msgr/async objectstore/bluestore-comp-snappy rados supported-random-distro$/{rhel_8}} | 1 | |
Failure Reason:
Command failed on smithi143 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995425 | 2022-08-27 04:02:28 | 2022-08-27 04:04:47 | 2022-08-27 04:22:30 | 0:17:43 | 0:07:20 | 0:10:23 | smithi | main | ubuntu | 20.04 | rados/monthrash/{ceph clusters/3-mons mon_election/connectivity msgr-failures/mon-delay msgr/async objectstore/bluestore-stupid rados supported-random-distro$/{ubuntu_latest} thrashers/many workloads/rados_5925} | 2 | |
Failure Reason:
Command failed on smithi050 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995426 | 2022-08-27 04:02:29 | 2022-08-27 04:05:18 | 2022-08-27 04:24:31 | 0:19:13 | 0:06:43 | 0:12:30 | smithi | main | ubuntu | 20.04 | rados/perf/{ceph mon_election/classic objectstore/bluestore-stupid openstack scheduler/dmclock_1Shard_16Threads settings/optimized ubuntu_latest workloads/sample_fio} | 1 | |
Failure Reason:
Command failed on smithi139 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995427 | 2022-08-27 04:02:31 | 2022-08-27 04:07:08 | 2022-08-27 04:28:58 | 0:21:50 | 0:07:28 | 0:14:22 | smithi | main | ubuntu | 20.04 | rados/thrash/{0-size-min-size-overrides/2-size-2-min-size 1-pg-log-overrides/normal_pg_log 2-recovery-overrides/{more-async-recovery} 3-scrub-overrides/{max-simultaneous-scrubs-2} backoff/peering_and_degraded ceph clusters/{fixed-2 openstack} crc-failures/bad_map_crc_failure d-balancer/crush-compat mon_election/classic msgr-failures/few msgr/async-v2only objectstore/bluestore-comp-zlib rados supported-random-distro$/{ubuntu_latest} thrashers/none thrashosds-health workloads/radosbench-high-concurrency} | 2 | |
Failure Reason:
Command failed on smithi114 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995428 | 2022-08-27 04:02:32 | 2022-08-27 04:11:09 | 2022-08-27 04:27:39 | 0:16:30 | 0:08:06 | 0:08:24 | smithi | main | rhel | 8.6 | rados/basic/{ceph clusters/{fixed-2 openstack} mon_election/classic msgr-failures/many msgr/async objectstore/bluestore-stupid rados supported-random-distro$/{rhel_8} tasks/libcephsqlite} | 2 | |
Failure Reason:
Command failed on smithi105 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse ceph-volume librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd sqlite-devel sqlite-devel sqlite-devel sqlite-devel' |
fail | 6995429 | 2022-08-27 04:02:33 | 2022-08-27 04:12:40 | 2022-08-27 04:28:14 | 0:15:34 | 0:07:08 | 0:08:26 | smithi | main | centos | 8.stream | rados/cephadm/workunits/{0-distro/centos_8.stream_container_tools_crun agent/on mon_election/classic task/test_adoption} | 1 | |
Failure Reason:
Command failed on smithi146 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995430 | 2022-08-27 04:02:34 | 2022-08-27 04:14:51 | 2022-08-27 04:32:54 | 0:18:03 | 0:06:50 | 0:11:13 | smithi | main | ubuntu | 20.04 | rados/singleton-nomsgr/{all/ceph-kvstore-tool mon_election/classic rados supported-random-distro$/{ubuntu_latest}} | 1 | |
Failure Reason:
Command failed on smithi066 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995431 | 2022-08-27 04:02:35 | 2022-08-27 04:14:51 | 2022-08-27 04:28:07 | 0:13:16 | 0:07:07 | 0:06:09 | smithi | main | centos | 8.stream | rados/thrash-erasure-code/{ceph clusters/{fixed-2 openstack} fast/normal mon_election/classic msgr-failures/fastclose objectstore/bluestore-comp-zlib rados recovery-overrides/{default} supported-random-distro$/{centos_8} thrashers/morepggrow thrashosds-health workloads/ec-radosbench} | 2 | |
Failure Reason:
Command failed on smithi085 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995432 | 2022-08-27 04:02:36 | 2022-08-27 04:14:51 | 2022-08-27 04:27:44 | 0:12:53 | 0:07:00 | 0:05:53 | smithi | main | centos | 8.stream | rados/singleton/{all/mon-config-keys mon_election/connectivity msgr-failures/few msgr/async-v1only objectstore/bluestore-comp-zlib rados supported-random-distro$/{centos_8}} | 1 | |
Failure Reason:
Command failed on smithi138 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995433 | 2022-08-27 04:02:38 | 2022-08-27 04:14:52 | 2022-08-27 04:32:29 | 0:17:37 | 0:07:35 | 0:10:02 | smithi | main | ubuntu | 20.04 | rados/thrash-erasure-code-overwrites/{bluestore-bitmap ceph clusters/{fixed-2 openstack} fast/fast mon_election/classic msgr-failures/fastclose rados recovery-overrides/{more-active-recovery} supported-random-distro$/{ubuntu_latest} thrashers/careful thrashosds-health workloads/ec-snaps-few-objects-overwrites} | 2 | |
Failure Reason:
Command failed on smithi137 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
pass | 6995434 | 2022-08-27 04:02:39 | 2022-08-27 04:15:12 | 2022-08-27 04:38:06 | 0:22:54 | 0:16:11 | 0:06:43 | smithi | main | centos | 8.stream | rados/cephadm/smoke/{0-distro/centos_8.stream_container_tools_crun 0-nvme-loop agent/on fixed-2 mon_election/connectivity start} | 2 | |
fail | 6995435 | 2022-08-27 04:02:40 | 2022-08-27 04:15:12 | 2022-08-27 04:30:28 | 0:15:16 | 0:08:47 | 0:06:29 | smithi | main | rhel | 8.6 | rados/thrash/{0-size-min-size-overrides/3-size-2-min-size 1-pg-log-overrides/short_pg_log 2-recovery-overrides/{more-active-recovery} 3-scrub-overrides/{max-simultaneous-scrubs-3} backoff/normal ceph clusters/{fixed-2 openstack} crc-failures/default d-balancer/on mon_election/connectivity msgr-failures/osd-delay msgr/async objectstore/bluestore-comp-zstd rados supported-random-distro$/{rhel_8} thrashers/pggrow thrashosds-health workloads/radosbench} | 2 | |
Failure Reason:
Command failed on smithi107 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995436 | 2022-08-27 04:02:41 | 2022-08-27 04:15:13 | 2022-08-27 04:35:10 | 0:19:57 | 0:07:15 | 0:12:42 | smithi | main | ubuntu | 20.04 | rados/mgr/{clusters/{2-node-mgr} debug/mgr mgr_ttl_cache/disable mon_election/connectivity random-objectstore$/{bluestore-bitmap} supported-random-distro$/{ubuntu_latest} tasks/failover} | 2 | |
Failure Reason:
Command failed on smithi157 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995437 | 2022-08-27 04:02:42 | 2022-08-27 04:17:13 | 2022-08-27 04:33:12 | 0:15:59 | 0:07:21 | 0:08:38 | smithi | main | centos | 8.stream | rados/thrash-erasure-code-shec/{ceph clusters/{fixed-4 openstack} mon_election/classic msgr-failures/osd-delay objectstore/bluestore-comp-zstd rados recovery-overrides/{default} supported-random-distro$/{centos_8} thrashers/default thrashosds-health workloads/ec-rados-plugin=shec-k=4-m=3-c=2} | 4 | |
Failure Reason:
Command failed on smithi130 with status 1: 'sudo yum -y install ceph-test' |
pass | 6995438 | 2022-08-27 04:02:43 | 2022-08-27 04:18:14 | 2022-08-27 04:35:52 | 0:17:38 | 0:07:15 | 0:10:23 | smithi | main | ubuntu | 20.04 | rados/singleton-nomsgr/{all/ceph-post-file mon_election/connectivity rados supported-random-distro$/{ubuntu_latest}} | 1 | |
fail | 6995439 | 2022-08-27 04:02:45 | 2022-08-27 04:19:04 | 2022-08-27 04:32:26 | 0:13:22 | 0:07:10 | 0:06:12 | smithi | main | centos | 8.stream | rados/verify/{centos_latest ceph clusters/{fixed-2 openstack} d-thrash/default/{default thrashosds-health} mon_election/classic msgr-failures/few msgr/async-v1only objectstore/bluestore-comp-zstd rados tasks/mon_recovery validater/lockdep} | 2 | |
Failure Reason:
Command failed on smithi143 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995440 | 2022-08-27 04:02:46 | 2022-08-27 04:19:35 | 2022-08-27 04:34:53 | 0:15:18 | 0:07:11 | 0:08:07 | smithi | main | centos | 8.stream | rados/singleton/{all/mon-config mon_election/classic msgr-failures/many msgr/async-v2only objectstore/bluestore-comp-zstd rados supported-random-distro$/{centos_8}} | 1 | |
Failure Reason:
Command failed on smithi039 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995441 | 2022-08-27 04:02:47 | 2022-08-27 04:21:15 | 2022-08-27 04:36:45 | 0:15:30 | 0:08:45 | 0:06:45 | smithi | main | rhel | 8.6 | rados/singleton-bluestore/{all/cephtool mon_election/connectivity msgr-failures/none msgr/async-v1only objectstore/bluestore-bitmap rados supported-random-distro$/{rhel_8}} | 1 | |
Failure Reason:
Command failed on smithi171 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995442 | 2022-08-27 04:02:48 | 2022-08-27 04:21:36 | 2022-08-27 04:37:18 | 0:15:42 | 0:09:20 | 0:06:22 | smithi | main | rhel | 8.6 | rados/cephadm/workunits/{0-distro/rhel_8.6_container_tools_3.0 agent/off mon_election/connectivity task/test_cephadm} | 1 | |
Failure Reason:
Command failed on smithi073 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995443 | 2022-08-27 04:02:49 | 2022-08-27 04:22:36 | 2022-08-27 04:39:20 | 0:16:44 | 0:07:05 | 0:09:39 | smithi | main | centos | 8.stream | rados/thrash-erasure-code-big/{ceph cluster/{12-osds openstack} mon_election/connectivity msgr-failures/few objectstore/bluestore-comp-zlib rados recovery-overrides/{more-async-partial-recovery} supported-random-distro$/{centos_8} thrashers/careful thrashosds-health workloads/ec-rados-plugin=lrc-k=4-m=2-l=3} | 3 | |
Failure Reason:
Command failed on smithi139 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995444 | 2022-08-27 04:02:50 | 2022-08-27 04:25:47 | 2022-08-27 04:42:48 | 0:17:01 | 0:08:48 | 0:08:13 | smithi | main | rhel | 8.6 | rados/thrash-erasure-code-isa/{arch/x86_64 ceph clusters/{fixed-2 openstack} mon_election/connectivity msgr-failures/few objectstore/bluestore-comp-zlib rados recovery-overrides/{more-async-recovery} supported-random-distro$/{rhel_8} thrashers/careful thrashosds-health workloads/ec-rados-plugin=isa-k=2-m=1} | 2 | |
Failure Reason:
Command failed on smithi138 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995445 | 2022-08-27 04:02:52 | 2022-08-27 04:27:48 | 2022-08-27 04:43:24 | 0:15:36 | 0:07:22 | 0:08:14 | smithi | main | centos | 8.stream | rados/thrash-old-clients/{0-distro$/{centos_8.stream_container_tools} 0-size-min-size-overrides/3-size-2-min-size 1-install/pacific backoff/peering ceph clusters/{openstack three-plus-one} d-balancer/crush-compat mon_election/connectivity msgr-failures/osd-delay rados thrashers/mapgap thrashosds-health workloads/test_rbd_api} | 3 | |
Failure Reason:
Command failed on smithi085 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995446 | 2022-08-27 04:02:53 | 2022-08-27 04:28:08 | 2022-08-27 04:42:18 | 0:14:10 | 0:07:10 | 0:07:00 | smithi | main | centos | 8.stream | rados/multimon/{clusters/9 mon_election/classic msgr-failures/few msgr/async-v2only no_pools objectstore/bluestore-comp-zlib rados supported-random-distro$/{centos_8} tasks/mon_clock_with_skews} | 3 | |
Failure Reason:
Command failed on smithi114 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995447 | 2022-08-27 04:02:54 | 2022-08-27 04:29:09 | 2022-08-27 04:41:31 | 0:12:22 | 0:07:14 | 0:05:08 | smithi | main | centos | 8.stream | rados/objectstore/{backends/objectcacher-stress supported-random-distro$/{centos_8}} | 1 | |
Failure Reason:
Command failed on smithi050 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995448 | 2022-08-27 04:02:55 | 2022-08-27 04:29:09 | 2022-08-27 04:47:38 | 0:18:29 | 0:06:56 | 0:11:33 | smithi | main | ubuntu | 20.04 | rados/perf/{ceph mon_election/connectivity objectstore/bluestore-basic-min-osd-mem-target openstack scheduler/dmclock_default_shards settings/optimized ubuntu_latest workloads/sample_radosbench} | 1 | |
Failure Reason:
Command failed on smithi179 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995449 | 2022-08-27 04:02:56 | 2022-08-27 04:30:10 | 2022-08-27 04:43:48 | 0:13:38 | 0:07:05 | 0:06:33 | smithi | main | centos | 8.stream | rados/singleton-nomsgr/{all/crushdiff mon_election/classic rados supported-random-distro$/{centos_8}} | 1 | |
Failure Reason:
Command failed on smithi163 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995450 | 2022-08-27 04:02:57 | 2022-08-27 04:30:10 | 2022-08-27 04:43:46 | 0:13:36 | 0:07:02 | 0:06:34 | smithi | main | centos | 8.stream | rados/thrash/{0-size-min-size-overrides/2-size-2-min-size 1-pg-log-overrides/normal_pg_log 2-recovery-overrides/{more-partial-recovery} 3-scrub-overrides/{max-simultaneous-scrubs-3} backoff/peering ceph clusters/{fixed-2 openstack} crc-failures/bad_map_crc_failure d-balancer/crush-compat mon_election/classic msgr-failures/osd-dispatch-delay msgr/async-v1only objectstore/bluestore-hybrid rados supported-random-distro$/{centos_8} thrashers/careful thrashosds-health workloads/redirect} | 2 | |
Failure Reason:
Command failed on smithi107 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995451 | 2022-08-27 04:02:58 | 2022-08-27 04:30:30 | 2022-08-27 04:45:24 | 0:14:54 | 0:07:05 | 0:07:49 | smithi | main | centos | 8.stream | rados/singleton/{all/osd-backfill mon_election/connectivity msgr-failures/none msgr/async objectstore/bluestore-hybrid rados supported-random-distro$/{centos_8}} | 1 | |
Failure Reason:
Command failed on smithi196 with status 1: 'sudo yum -y install ceph-test' |
pass | 6995452 | 2022-08-27 04:03:00 | 2022-08-27 04:32:31 | 2022-08-27 04:51:23 | 0:18:52 | 0:13:16 | 0:05:36 | smithi | main | centos | 8.stream | rados/cephadm/osds/{0-distro/centos_8.stream_container_tools_crun 0-nvme-loop 1-start 2-ops/repave-all} | 2 | |
fail | 6995453 | 2022-08-27 04:03:01 | 2022-08-27 04:32:31 | 2022-08-27 04:51:03 | 0:18:32 | 0:07:11 | 0:11:21 | smithi | main | ubuntu | 20.04 | rados/basic/{ceph clusters/{fixed-2 openstack} mon_election/connectivity msgr-failures/few msgr/async-v1only objectstore/filestore-xfs rados supported-random-distro$/{ubuntu_latest} tasks/rados_api_tests} | 2 | |
Failure Reason:
Command failed on smithi037 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
pass | 6995454 | 2022-08-27 04:03:02 | 2022-08-27 04:33:02 | 2022-08-27 04:50:19 | 0:17:17 | 0:10:00 | 0:07:17 | smithi | main | rhel | 8.6 | rados/cephadm/workunits/{0-distro/rhel_8.6_container_tools_rhel8 agent/on mon_election/classic task/test_cephadm_repos} | 1 | |
fail | 6995455 | 2022-08-27 04:03:03 | 2022-08-27 04:33:22 | 2022-08-27 04:46:04 | 0:12:42 | 0:07:00 | 0:05:42 | smithi | main | centos | 8.stream | rados/singleton/{all/osd-recovery-incomplete mon_election/classic msgr-failures/few msgr/async-v1only objectstore/bluestore-low-osd-mem-target rados supported-random-distro$/{centos_8}} | 1 | |
Failure Reason:
Command failed on smithi130 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995456 | 2022-08-27 04:03:04 | 2022-08-27 04:33:23 | 2022-08-27 04:48:16 | 0:14:53 | 0:09:07 | 0:05:46 | smithi | main | rhel | 8.6 | rados/singleton-nomsgr/{all/export-after-evict mon_election/connectivity rados supported-random-distro$/{rhel_8}} | 1 | |
Failure Reason:
Command failed on smithi061 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995457 | 2022-08-27 04:03:05 | 2022-08-27 04:33:23 | 2022-08-27 05:52:17 | 1:18:54 | 1:09:57 | 0:08:57 | smithi | main | ubuntu | 20.04 | rados/standalone/{supported-random-distro$/{ubuntu_latest} workloads/osd-backfill} | 1 | |
Failure Reason:
SSH connection to smithi017 was lost: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=4ce1c79d5c812c812f8b11a662e5f9f6e351ea9b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/osd-backfill/osd-backfill-space.sh' |
fail | 6995458 | 2022-08-27 04:03:07 | 2022-08-27 04:33:23 | 2022-08-27 04:48:39 | 0:15:16 | 0:07:13 | 0:08:03 | smithi | main | centos | 8.stream | rados/valgrind-leaks/{1-start 2-inject-leak/osd centos_latest} | 1 | |
Failure Reason:
Command failed on smithi039 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995459 | 2022-08-27 04:03:08 | 2022-08-27 04:35:04 | 2022-08-27 04:53:12 | 0:18:08 | 0:07:07 | 0:11:01 | smithi | main | ubuntu | 20.04 | rados/monthrash/{ceph clusters/9-mons mon_election/classic msgr-failures/few msgr/async-v1only objectstore/filestore-xfs rados supported-random-distro$/{ubuntu_latest} thrashers/one workloads/rados_api_tests} | 2 | |
Failure Reason:
Command failed on smithi157 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995460 | 2022-08-27 04:03:09 | 2022-08-27 04:35:14 | 2022-08-27 04:53:08 | 0:17:54 | 0:07:41 | 0:10:13 | smithi | main | ubuntu | 20.04 | rados/thrash/{0-size-min-size-overrides/3-size-2-min-size 1-pg-log-overrides/short_pg_log 2-recovery-overrides/{more-partial-recovery} 3-scrub-overrides/{max-simultaneous-scrubs-2} backoff/peering_and_degraded ceph clusters/{fixed-2 openstack} crc-failures/default d-balancer/on mon_election/connectivity msgr-failures/fastclose msgr/async-v2only objectstore/bluestore-low-osd-mem-target rados supported-random-distro$/{ubuntu_latest} thrashers/default thrashosds-health workloads/redirect_promote_tests} | 2 | |
Failure Reason:
Command failed on smithi036 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
pass | 6995461 | 2022-08-27 04:03:10 | 2022-08-27 04:35:55 | 2022-08-27 05:02:32 | 0:26:37 | 0:19:24 | 0:07:13 | smithi | main | rhel | 8.6 | rados/cephadm/smoke/{0-distro/rhel_8.6_container_tools_3.0 0-nvme-loop agent/off fixed-2 mon_election/classic start} | 2 | |
fail | 6995462 | 2022-08-27 04:03:12 | 2022-08-27 04:37:25 | 2022-08-27 04:55:41 | 0:18:16 | 0:07:22 | 0:10:54 | smithi | main | ubuntu | 20.04 | rados/thrash-erasure-code/{ceph clusters/{fixed-2 openstack} fast/fast mon_election/connectivity msgr-failures/few objectstore/bluestore-comp-zstd rados recovery-overrides/{more-async-partial-recovery} supported-random-distro$/{ubuntu_latest} thrashers/pggrow thrashosds-health workloads/ec-small-objects-balanced} | 2 | |
Failure Reason:
Command failed on smithi097 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995463 | 2022-08-27 04:03:13 | 2022-08-27 04:38:16 | 2022-08-27 04:54:17 | 0:16:01 | 0:08:47 | 0:07:14 | smithi | main | rhel | 8.6 | rados/singleton/{all/osd-recovery mon_election/connectivity msgr-failures/many msgr/async-v2only objectstore/bluestore-stupid rados supported-random-distro$/{rhel_8}} | 1 | |
Failure Reason:
Command failed on smithi139 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995464 | 2022-08-27 04:03:14 | 2022-08-27 04:39:26 | 2022-08-27 04:54:16 | 0:14:50 | 0:08:42 | 0:06:08 | smithi | main | rhel | 8.6 | rados/singleton-nomsgr/{all/full-tiering mon_election/classic rados supported-random-distro$/{rhel_8}} | 1 | |
Failure Reason:
Command failed on smithi125 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995465 | 2022-08-27 04:03:15 | 2022-08-27 04:39:27 | 2022-08-27 04:57:10 | 0:17:43 | 0:09:09 | 0:08:34 | smithi | main | rhel | 8.6 | rados/thrash/{0-size-min-size-overrides/2-size-2-min-size 1-pg-log-overrides/normal_pg_log 2-recovery-overrides/{more-partial-recovery} 3-scrub-overrides/{max-simultaneous-scrubs-3} backoff/normal ceph clusters/{fixed-2 openstack} crc-failures/bad_map_crc_failure d-balancer/crush-compat mon_election/classic msgr-failures/few msgr/async objectstore/bluestore-stupid rados supported-random-distro$/{rhel_8} thrashers/mapgap thrashosds-health workloads/redirect_set_object} | 2 | |
Failure Reason:
Command failed on smithi050 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995466 | 2022-08-27 04:03:17 | 2022-08-27 04:41:37 | 2022-08-27 04:58:10 | 0:16:33 | 0:09:01 | 0:07:32 | smithi | main | rhel | 8.6 | rados/thrash-erasure-code-shec/{ceph clusters/{fixed-4 openstack} mon_election/connectivity msgr-failures/osd-dispatch-delay objectstore/bluestore-hybrid rados recovery-overrides/{more-active-recovery} supported-random-distro$/{rhel_8} thrashers/careful thrashosds-health workloads/ec-rados-plugin=shec-k=4-m=3-c=2} | 4 | |
Failure Reason:
Command failed on smithi105 with status 1: 'sudo yum -y install ceph-test' |
pass | 6995467 | 2022-08-27 04:03:18 | 2022-08-27 04:42:58 | 2022-08-27 05:06:57 | 0:23:59 | 0:17:08 | 0:06:51 | smithi | main | rhel | 8.6 | rados/cephadm/osds/{0-distro/rhel_8.6_container_tools_3.0 0-nvme-loop 1-start 2-ops/rm-zap-add} | 2 | |
fail | 6995468 | 2022-08-27 04:03:19 | 2022-08-27 04:43:29 | 2022-08-27 05:01:24 | 0:17:55 | 0:06:48 | 0:11:07 | smithi | main | ubuntu | 20.04 | rados/perf/{ceph mon_election/classic objectstore/bluestore-bitmap openstack scheduler/wpq_default_shards settings/optimized ubuntu_latest workloads/fio_4K_rand_read} | 1 | |
Failure Reason:
Command failed on smithi027 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995469 | 2022-08-27 04:03:20 | 2022-08-27 04:43:29 | 2022-08-27 04:57:20 | 0:13:51 | 0:07:02 | 0:06:49 | smithi | main | centos | 8.stream | rados/verify/{centos_latest ceph clusters/{fixed-2 openstack} d-thrash/none mon_election/connectivity msgr-failures/few msgr/async-v2only objectstore/bluestore-hybrid rados tasks/rados_api_tests validater/valgrind} | 2 | |
Failure Reason:
Command failed on smithi163 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995470 | 2022-08-27 04:03:21 | 2022-08-27 04:43:49 | 2022-08-27 04:57:04 | 0:13:15 | 0:07:05 | 0:06:10 | smithi | main | centos | 8.stream | rados/mgr/{clusters/{2-node-mgr} debug/mgr mgr_ttl_cache/enable mon_election/classic random-objectstore$/{bluestore-hybrid} supported-random-distro$/{centos_8} tasks/insights} | 2 | |
Failure Reason:
Command failed on smithi088 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995471 | 2022-08-27 04:03:23 | 2022-08-27 04:43:50 | 2022-08-27 04:58:11 | 0:14:21 | 0:06:58 | 0:07:23 | smithi | main | centos | 8.stream | rados/singleton/{all/peer mon_election/classic msgr-failures/none msgr/async objectstore/filestore-xfs rados supported-random-distro$/{centos_8}} | 1 | |
Failure Reason:
Command failed on smithi196 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995472 | 2022-08-27 04:03:24 | 2022-08-27 04:45:30 | 2022-08-27 05:01:23 | 0:15:53 | 0:06:17 | 0:09:36 | smithi | main | ubuntu | 20.04 | rados/rook/smoke/{0-distro/ubuntu_20.04 0-kubeadm 0-nvme-loop 1-rook 2-workload/radosbench cluster/1-node k8s/1.21 net/flannel rook/master} | 1 | |
Failure Reason:
Command failed on smithi130 with status 1: 'kubectl create -f rook/deploy/examples/crds.yaml -f rook/deploy/examples/common.yaml -f operator.yaml' |
fail | 6995473 | 2022-08-27 04:03:25 | 2022-08-27 04:46:11 | 2022-08-27 05:05:04 | 0:18:53 | 0:06:59 | 0:11:54 | smithi | main | ubuntu | 20.04 | rados/singleton-nomsgr/{all/health-warnings mon_election/connectivity rados supported-random-distro$/{ubuntu_latest}} | 1 | |
Failure Reason:
Command failed on smithi179 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995474 | 2022-08-27 04:03:26 | 2022-08-27 04:47:41 | 2022-08-27 05:08:17 | 0:20:36 | 0:08:08 | 0:12:28 | smithi | main | ubuntu | 20.04 | rados/thrash-erasure-code-big/{ceph cluster/{12-osds openstack} mon_election/classic msgr-failures/osd-delay objectstore/bluestore-comp-zstd rados recovery-overrides/{more-async-partial-recovery} supported-random-distro$/{ubuntu_latest} thrashers/default thrashosds-health workloads/ec-rados-plugin=jerasure-k=4-m=2} | 3 | |
Failure Reason:
Command failed on smithi039 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995475 | 2022-08-27 04:03:28 | 2022-08-27 04:50:22 | 2022-08-27 05:09:13 | 0:18:51 | 0:07:29 | 0:11:22 | smithi | main | ubuntu | 20.04 | rados/thrash-erasure-code-isa/{arch/x86_64 ceph clusters/{fixed-2 openstack} mon_election/classic msgr-failures/osd-delay objectstore/bluestore-comp-zstd rados recovery-overrides/{more-async-partial-recovery} supported-random-distro$/{ubuntu_latest} thrashers/default thrashosds-health workloads/ec-rados-plugin=isa-k=2-m=1} | 2 | |
Failure Reason:
Command failed on smithi037 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995476 | 2022-08-27 04:03:29 | 2022-08-27 04:51:13 | 2022-08-27 05:51:09 | 0:59:56 | 0:50:29 | 0:09:27 | smithi | main | ubuntu | 20.04 | rados/objectstore/{backends/objectstore-bluestore-a supported-random-distro$/{ubuntu_latest}} | 1 | |
Failure Reason:
SSH connection to smithi143 was lost: 'sudo TESTDIR=/home/ubuntu/cephtest bash -c \'mkdir $TESTDIR/archive/ostest && cd $TESTDIR/archive/ostest && ulimit -Sn 16384 && CEPH_ARGS="--no-log-to-stderr --log-file $TESTDIR/archive/ceph_test_objectstore.log --debug-bluestore 20" ceph_test_objectstore --gtest_filter=*/2:-*SyntheticMatrixC* --gtest_catch_exceptions=0\'' |
fail | 6995477 | 2022-08-27 04:03:30 | 2022-08-27 05:10:53 | 764 | smithi | main | rhel | 8.6 | rados/basic/{ceph clusters/{fixed-2 openstack} mon_election/classic msgr-failures/many msgr/async-v2only objectstore/bluestore-bitmap rados supported-random-distro$/{rhel_8} tasks/rados_cls_all} | 2 | ||||
Failure Reason:
Command failed on smithi044 with status 1: 'sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse ceph-volume librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd sqlite-devel sqlite-devel sqlite-devel sqlite-devel' |
fail | 6995478 | 2022-08-27 04:03:31 | 2022-08-27 04:52:14 | 2022-08-27 05:05:04 | 0:12:50 | 0:07:08 | 0:05:42 | smithi | main | centos | 8.stream | rados/cephadm/workunits/{0-distro/ubuntu_20.04 agent/off mon_election/connectivity task/test_iscsi_pids_limit/{centos_8.stream_container_tools test_iscsi_pids_limit}} | 1 | |
Failure Reason:
Command failed on smithi137 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995479 | 2022-08-27 04:03:33 | 2022-08-27 04:52:14 | 2022-08-27 05:10:45 | 0:18:31 | 0:11:26 | 0:07:05 | smithi | main | rhel | 8.6 | rados/thrash/{0-size-min-size-overrides/3-size-2-min-size 1-pg-log-overrides/short_pg_log 2-recovery-overrides/{more-partial-recovery} 3-scrub-overrides/{default} backoff/peering ceph clusters/{fixed-2 openstack} crc-failures/default d-balancer/on mon_election/connectivity msgr-failures/osd-delay msgr/async-v1only objectstore/filestore-xfs rados supported-random-distro$/{rhel_8} thrashers/morepggrow thrashosds-health workloads/set-chunks-read} | 2 | |
Failure Reason:
Command failed on smithi157 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995480 | 2022-08-27 04:03:34 | 2022-08-27 04:53:14 | 2022-08-27 05:11:57 | 0:18:43 | 0:12:17 | 0:06:26 | smithi | main | rhel | 8.6 | rados/thrash-erasure-code-overwrites/{bluestore-bitmap ceph clusters/{fixed-2 openstack} fast/normal mon_election/connectivity msgr-failures/few rados recovery-overrides/{more-partial-recovery} supported-random-distro$/{rhel_8} thrashers/default thrashosds-health workloads/ec-pool-snaps-few-objects-overwrites} | 2 | |
Failure Reason:
Command failed on smithi201 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995481 | 2022-08-27 04:03:35 | 2022-08-27 04:53:15 | 2022-08-27 05:10:37 | 0:17:22 | 0:07:16 | 0:10:06 | smithi | main | centos | 8.stream | rados/multimon/{clusters/21 mon_election/connectivity msgr-failures/many msgr/async no_pools objectstore/bluestore-comp-zstd rados supported-random-distro$/{centos_8} tasks/mon_recovery} | 3 | |
Failure Reason:
Command failed on smithi139 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995482 | 2022-08-27 04:03:36 | 2022-08-27 04:55:46 | 2022-08-27 05:14:52 | 0:19:06 | 0:07:42 | 0:11:24 | smithi | main | ubuntu | 20.04 | rados/singleton/{all/pg-autoscaler-progress-off mon_election/connectivity msgr-failures/few msgr/async-v1only objectstore/bluestore-bitmap rados supported-random-distro$/{ubuntu_latest}} | 2 | |
Failure Reason:
Command failed on smithi088 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
pass | 6995483 | 2022-08-27 04:03:38 | 2022-08-27 04:57:06 | 2022-08-27 05:22:42 | 0:25:36 | 0:19:24 | 0:06:12 | smithi | main | rhel | 8.6 | rados/cephadm/smoke/{0-distro/rhel_8.6_container_tools_rhel8 0-nvme-loop agent/on fixed-2 mon_election/connectivity start} | 2 | |
fail | 6995484 | 2022-08-27 04:03:39 | 2022-08-27 04:57:17 | 2022-08-27 05:12:57 | 0:15:40 | 0:07:19 | 0:08:21 | smithi | main | centos | 8.stream | rados/dashboard/{0-single-container-host debug/mgr mon_election/classic random-objectstore$/{bluestore-comp-zstd} tasks/e2e} | 2 | |
Failure Reason:
Command failed on smithi107 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995485 | 2022-08-27 04:03:40 | 2022-08-27 04:57:27 | 2022-08-27 05:14:20 | 0:16:53 | 0:09:31 | 0:07:22 | smithi | main | rhel | 8.6 | rados/singleton-nomsgr/{all/large-omap-object-warnings mon_election/classic rados supported-random-distro$/{rhel_8}} | 1 | |
Failure Reason:
Command failed on smithi125 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995486 | 2022-08-27 04:03:41 | 2022-08-27 04:57:27 | 2022-08-27 05:15:59 | 0:18:32 | 0:07:23 | 0:11:09 | smithi | main | ubuntu | 20.04 | rados/monthrash/{ceph clusters/3-mons mon_election/connectivity msgr-failures/mon-delay msgr/async-v2only objectstore/bluestore-bitmap rados supported-random-distro$/{ubuntu_latest} thrashers/sync-many workloads/rados_mon_osdmap_prune} | 2 | |
Failure Reason:
Command failed on smithi178 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995487 | 2022-08-27 04:03:42 | 2022-08-27 04:58:18 | 2022-08-27 05:15:25 | 0:17:07 | 0:09:23 | 0:07:44 | smithi | main | rhel | 8.6 | rados/thrash/{0-size-min-size-overrides/2-size-2-min-size 1-pg-log-overrides/normal_pg_log 2-recovery-overrides/{more-partial-recovery} 3-scrub-overrides/{max-simultaneous-scrubs-2} backoff/peering_and_degraded ceph clusters/{fixed-2 openstack} crc-failures/bad_map_crc_failure d-balancer/crush-compat mon_election/classic msgr-failures/osd-dispatch-delay msgr/async-v2only objectstore/bluestore-bitmap rados supported-random-distro$/{rhel_8} thrashers/none thrashosds-health workloads/small-objects-balanced} | 2 | |
Failure Reason:
Command failed on smithi146 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995488 | 2022-08-27 04:03:44 | 2022-08-27 04:58:18 | 2022-08-27 05:15:53 | 0:17:35 | 0:07:21 | 0:10:14 | smithi | main | centos | 8.stream | rados/thrash-old-clients/{0-distro$/{centos_8.stream_container_tools} 0-size-min-size-overrides/2-size-2-min-size 1-install/nautilus-v1only backoff/peering_and_degraded ceph clusters/{openstack three-plus-one} d-balancer/on mon_election/classic msgr-failures/fastclose rados thrashers/morepggrow thrashosds-health workloads/cache-snaps} | 3 | |
Failure Reason:
Command failed on smithi105 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995489 | 2022-08-27 04:03:45 | 2022-08-27 05:00:59 | 2022-08-27 05:18:46 | 0:17:47 | 0:10:26 | 0:07:21 | smithi | main | rhel | 8.6 | rados/singleton/{all/pg-autoscaler mon_election/classic msgr-failures/many msgr/async-v2only objectstore/bluestore-comp-lz4 rados supported-random-distro$/{rhel_8}} | 1 | |
Failure Reason:
Command failed on smithi027 with status 1: 'sudo yum -y install ceph-test' |
pass | 6995490 | 2022-08-27 04:03:46 | 2022-08-27 05:01:30 | 2022-08-27 05:29:56 | 0:28:26 | 0:19:50 | 0:08:36 | smithi | main | rhel | 8.6 | rados/cephadm/osds/{0-distro/rhel_8.6_container_tools_rhel8 0-nvme-loop 1-start 2-ops/rm-zap-flag} | 2 | |
fail | 6995491 | 2022-08-27 04:03:47 | 2022-08-27 05:02:40 | 2022-08-27 05:17:46 | 0:15:06 | 0:06:45 | 0:08:21 | smithi | main | ubuntu | 20.04 | rados/perf/{ceph mon_election/connectivity objectstore/bluestore-comp openstack scheduler/dmclock_1Shard_16Threads settings/optimized ubuntu_latest workloads/fio_4K_rand_rw} | 1 | |
Failure Reason:
Command failed on smithi130 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995492 | 2022-08-27 04:03:48 | 2022-08-27 05:02:41 | 2022-08-27 05:18:11 | 0:15:30 | 0:07:02 | 0:08:28 | smithi | main | centos | 8.stream | rados/singleton-nomsgr/{all/lazy_omap_stats_output mon_election/connectivity rados supported-random-distro$/{centos_8}} | 1 | |
Failure Reason:
Command failed on smithi179 with status 1: 'sudo yum -y install ceph-test' |
dead | 6995493 | 2022-08-27 04:03:49 | 2022-08-27 05:05:11 | 2022-08-27 17:13:26 | 12:08:15 | smithi | main | ubuntu | 20.04 | rados/standalone/{supported-random-distro$/{ubuntu_latest} workloads/osd} | 1 | |||
Failure Reason:
hit max job timeout |
dead | 6995494 | 2022-08-27 04:03:51 | 2022-08-27 05:05:12 | 2022-08-27 05:23:32 | 0:18:20 | 0:11:07 | 0:07:13 | smithi | main | rhel | 8.6 | rados/thrash-erasure-code/{ceph clusters/{fixed-2 openstack} fast/normal mon_election/classic msgr-failures/osd-delay objectstore/bluestore-hybrid rados recovery-overrides/{default} supported-random-distro$/{rhel_8} thrashers/careful thrashosds-health workloads/ec-small-objects-fast-read} | 2 | |
Failure Reason:
{'smithi070.front.sepia.ceph.com': {'censored': "the output has been hidden due to the fact that 'no_log: true' was specified for this result", 'changed': True}} |
dead | 6995495 | 2022-08-27 04:03:52 | 2022-08-27 05:06:32 | 2022-08-27 05:21:11 | 0:14:39 | 0:05:52 | 0:08:47 | smithi | main | rhel | 8.6 | rados/cephadm/smoke-singlehost/{0-random-distro$/{rhel_8.6_container_tools_3.0} 1-start 2-services/rgw 3-final} | 1 | |
Failure Reason:
{'smithi049.front.sepia.ceph.com': {'censored': "the output has been hidden due to the fact that 'no_log: true' was specified for this result", 'changed': True}} |
fail | 6995496 | 2022-08-27 04:03:53 | 2022-08-27 05:06:33 | 2022-08-27 05:26:00 | 0:19:27 | 0:11:47 | 0:07:40 | smithi | main | rhel | 8.6 | rados/singleton/{all/pg-removal-interruption mon_election/connectivity msgr-failures/none msgr/async objectstore/bluestore-comp-snappy rados supported-random-distro$/{rhel_8}} | 1 | |
Failure Reason:
Command failed on smithi085 with status 1: 'sudo yum -y install ceph-test' |
dead | 6995497 | 2022-08-27 04:03:54 | 2022-08-27 05:07:03 | 2022-08-27 05:29:53 | 0:22:50 | 0:14:42 | 0:08:08 | smithi | main | rhel | 8.6 | rados/thrash/{0-size-min-size-overrides/3-size-2-min-size 1-pg-log-overrides/short_pg_log 2-recovery-overrides/{more-partial-recovery} 3-scrub-overrides/{max-simultaneous-scrubs-2} backoff/normal ceph clusters/{fixed-2 openstack} crc-failures/default d-balancer/on mon_election/connectivity msgr-failures/fastclose msgr/async objectstore/bluestore-comp-lz4 rados supported-random-distro$/{rhel_8} thrashers/pggrow thrashosds-health workloads/small-objects-localized} | 2 | |
Failure Reason:
{'smithi039.front.sepia.ceph.com': {'attempts': 12, 'censored': "the output has been hidden due to the fact that 'no_log: true' was specified for this result", 'changed': True}, 'smithi152.front.sepia.ceph.com': {'attempts': 12, 'censored': "the output has been hidden due to the fact that 'no_log: true' was specified for this result", 'changed': True}} |
fail | 6995498 | 2022-08-27 04:03:55 | 2022-08-27 05:08:24 | 2022-08-27 05:29:26 | 0:21:02 | 0:08:24 | 0:12:38 | smithi | main | ubuntu | 20.04 | rados/thrash-erasure-code-shec/{ceph clusters/{fixed-4 openstack} mon_election/classic msgr-failures/fastclose objectstore/bluestore-low-osd-mem-target rados recovery-overrides/{default} supported-random-distro$/{ubuntu_latest} thrashers/default thrashosds-health workloads/ec-rados-plugin=shec-k=4-m=3-c=2} | 4 | |
Failure Reason:
Command failed on smithi037 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
dead | 6995499 | 2022-08-27 04:03:56 | 2022-08-27 05:29:32 | 797 | smithi | main | rhel | 8.6 | rados/basic/{ceph clusters/{fixed-2 openstack} mon_election/connectivity msgr-failures/few msgr/async objectstore/bluestore-comp-lz4 rados supported-random-distro$/{rhel_8} tasks/rados_python} | 2 | ||||
Failure Reason:
{'smithi139.front.sepia.ceph.com': {'attempts': 12, 'censored': "the output has been hidden due to the fact that 'no_log: true' was specified for this result", 'changed': True}, 'smithi104.front.sepia.ceph.com': {'attempts': 12, 'censored': "the output has been hidden due to the fact that 'no_log: true' was specified for this result", 'changed': True}} |
fail | 6995500 | 2022-08-27 04:03:58 | 2022-08-27 05:10:45 | 2022-08-27 05:23:32 | 0:12:47 | 0:06:56 | 0:05:51 | smithi | main | centos | 8.stream | rados/singleton-nomsgr/{all/librados_hello_world mon_election/classic rados supported-random-distro$/{centos_8}} | 1 | |
Failure Reason:
Command failed on smithi097 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995501 | 2022-08-27 04:03:59 | 2022-08-27 05:10:45 | 2022-08-27 05:23:29 | 0:12:44 | 0:07:15 | 0:05:29 | smithi | main | centos | 8.stream | rados/verify/{centos_latest ceph clusters/{fixed-2 openstack} d-thrash/default/{default thrashosds-health} mon_election/classic msgr-failures/few msgr/async objectstore/bluestore-low-osd-mem-target rados tasks/rados_cls_all validater/lockdep} | 2 | |
Failure Reason:
Command failed on smithi003 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995502 | 2022-08-27 04:04:00 | 2022-08-27 05:10:56 | 2022-08-27 05:23:47 | 0:12:51 | 0:07:07 | 0:05:44 | smithi | main | centos | 8.stream | rados/cephadm/workunits/{0-distro/centos_8.stream_container_tools agent/on mon_election/classic task/test_nfs} | 1 | |
Failure Reason:
Command failed on smithi189 with status 1: 'sudo yum -y install ceph-test' |
dead | 6995503 | 2022-08-27 04:04:01 | 2022-08-27 05:10:56 | 2022-08-27 05:30:29 | 0:19:33 | 0:13:20 | 0:06:13 | smithi | main | rhel | 8.6 | rados/objectstore/{backends/objectstore-bluestore-b supported-random-distro$/{rhel_8}} | 1 | |
Failure Reason:
{'smithi157.front.sepia.ceph.com': {'attempts': 12, 'censored': "the output has been hidden due to the fact that 'no_log: true' was specified for this result", 'changed': True}} |
fail | 6995504 | 2022-08-27 04:04:02 | 2022-08-27 05:10:56 | 2022-08-27 05:29:11 | 0:18:15 | 0:06:48 | 0:11:27 | smithi | main | ubuntu | 20.04 | rados/singleton/{all/radostool mon_election/classic msgr-failures/few msgr/async-v1only objectstore/bluestore-comp-zlib rados supported-random-distro$/{ubuntu_latest}} | 1 | |
Failure Reason:
Command failed on smithi201 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995505 | 2022-08-27 04:04:03 | 2022-08-27 05:12:07 | 2022-08-27 05:26:40 | 0:14:33 | 0:07:06 | 0:07:27 | smithi | main | centos | 8.stream | rados/mgr/{clusters/{2-node-mgr} debug/mgr mgr_ttl_cache/disable mon_election/connectivity random-objectstore$/{bluestore-comp-zlib} supported-random-distro$/{centos_8} tasks/module_selftest} | 2 | |
Failure Reason:
Command failed on smithi107 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995506 | 2022-08-27 04:04:05 | 2022-08-27 05:13:07 | 2022-08-27 05:32:41 | 0:19:34 | 0:07:55 | 0:11:39 | smithi | main | ubuntu | 20.04 | rados/thrash-erasure-code-big/{ceph cluster/{12-osds openstack} mon_election/connectivity msgr-failures/osd-dispatch-delay objectstore/bluestore-hybrid rados recovery-overrides/{more-active-recovery} supported-random-distro$/{ubuntu_latest} thrashers/fastread thrashosds-health workloads/ec-rados-plugin=lrc-k=4-m=2-l=3} | 3 | |
Failure Reason:
Command failed on smithi088 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995507 | 2022-08-27 04:04:06 | 2022-08-27 05:14:58 | 2022-08-27 05:28:33 | 0:13:35 | 0:07:08 | 0:06:27 | smithi | main | centos | 8.stream | rados/thrash-erasure-code-isa/{arch/x86_64 ceph clusters/{fixed-2 openstack} mon_election/connectivity msgr-failures/osd-dispatch-delay objectstore/bluestore-hybrid rados recovery-overrides/{more-active-recovery} supported-random-distro$/{centos_8} thrashers/mapgap thrashosds-health workloads/ec-rados-plugin=isa-k=2-m=1} | 2 | |
Failure Reason:
Command failed on smithi138 with status 1: 'sudo yum -y install ceph-test' |
dead | 6995508 | 2022-08-27 04:04:07 | 2022-08-27 05:15:28 | 2022-08-27 05:32:53 | 0:17:25 | 0:10:15 | 0:07:10 | smithi | main | rhel | 8.6 | rados/thrash/{0-size-min-size-overrides/2-size-2-min-size 1-pg-log-overrides/normal_pg_log 2-recovery-overrides/{more-async-recovery} 3-scrub-overrides/{default} backoff/peering ceph clusters/{fixed-2 openstack} crc-failures/bad_map_crc_failure d-balancer/crush-compat mon_election/classic msgr-failures/few msgr/async-v1only objectstore/bluestore-comp-snappy rados supported-random-distro$/{rhel_8} thrashers/careful thrashosds-health workloads/small-objects} | 2 | |
Failure Reason:
{'smithi105.front.sepia.ceph.com': {'attempts': 12, 'censored': "the output has been hidden due to the fact that 'no_log: true' was specified for this result", 'changed': True}, 'smithi079.front.sepia.ceph.com': {'attempts': 12, 'censored': "the output has been hidden due to the fact that 'no_log: true' was specified for this result", 'changed': True}} |
fail | 6995509 | 2022-08-27 04:04:08 | 2022-08-27 05:15:59 | 2022-08-27 05:31:10 | 0:15:11 | 0:06:39 | 0:08:32 | smithi | main | ubuntu | 20.04 | rados/singleton-nomsgr/{all/msgr mon_election/connectivity rados supported-random-distro$/{ubuntu_latest}} | 1 | |
Failure Reason:
Command crashed: 'sudo TESTDIR=/home/ubuntu/cephtest bash -c ceph_test_msgr' |
pass | 6995510 | 2022-08-27 04:04:09 | 2022-08-27 05:15:59 | 2022-08-27 05:33:51 | 0:17:52 | 0:07:30 | 0:10:22 | smithi | main | ubuntu | 20.04 | rados/multimon/{clusters/3 mon_election/classic msgr-failures/few msgr/async-v1only no_pools objectstore/bluestore-hybrid rados supported-random-distro$/{ubuntu_latest} tasks/mon_clock_no_skews} | 2 | |
pass | 6995511 | 2022-08-27 04:04:11 | 2022-08-27 05:16:10 | 2022-08-27 05:51:01 | 0:34:51 | 0:24:25 | 0:10:26 | smithi | main | ubuntu | 20.04 | rados/cephadm/smoke/{0-distro/ubuntu_20.04 0-nvme-loop agent/off fixed-2 mon_election/classic start} | 2 | |
fail | 6995512 | 2022-08-27 04:04:12 | 2022-08-27 05:17:50 | 2022-08-27 05:36:48 | 0:18:58 | 0:07:09 | 0:11:49 | smithi | main | ubuntu | 20.04 | rados/singleton/{all/random-eio mon_election/connectivity msgr-failures/many msgr/async-v2only objectstore/bluestore-comp-zstd rados supported-random-distro$/{ubuntu_latest}} | 2 | |
Failure Reason:
Command failed on smithi027 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995513 | 2022-08-27 04:04:13 | 2022-08-27 05:18:51 | 2022-08-27 05:40:16 | 0:21:25 | 0:07:03 | 0:14:22 | smithi | main | ubuntu | 20.04 | rados/perf/{ceph mon_election/classic objectstore/bluestore-low-osd-mem-target openstack scheduler/dmclock_default_shards settings/optimized ubuntu_latest workloads/fio_4M_rand_read} | 1 | |
Failure Reason:
Command failed on smithi049 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
pass | 6995514 | 2022-08-27 04:04:14 | 2022-08-27 05:21:22 | 2022-08-27 05:51:08 | 0:29:46 | 0:17:21 | 0:12:25 | smithi | main | ubuntu | 20.04 | rados/cephadm/osds/{0-distro/ubuntu_20.04 0-nvme-loop 1-start 2-ops/rm-zap-wait} | 2 | |
fail | 6995515 | 2022-08-27 04:04:15 | 2022-08-27 05:22:52 | 2022-08-27 05:40:46 | 0:17:54 | 0:07:04 | 0:10:50 | smithi | main | ubuntu | 20.04 | rados/singleton-bluestore/{all/cephtool mon_election/classic msgr-failures/few msgr/async-v2only objectstore/bluestore-comp-lz4 rados supported-random-distro$/{ubuntu_latest}} | 1 | |
Failure Reason:
Command failed on smithi190 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995516 | 2022-08-27 04:04:16 | 2022-08-27 05:23:33 | 2022-08-27 05:40:50 | 0:17:17 | 0:07:07 | 0:10:10 | smithi | main | ubuntu | 20.04 | rados/singleton-nomsgr/{all/multi-backfill-reject mon_election/classic rados supported-random-distro$/{ubuntu_latest}} | 2 | |
Failure Reason:
Command failed on smithi070 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995517 | 2022-08-27 04:04:18 | 2022-08-27 05:23:33 | 2022-08-27 05:38:02 | 0:14:29 | 0:07:19 | 0:07:10 | smithi | main | centos | 8.stream | rados/thrash/{0-size-min-size-overrides/3-size-2-min-size 1-pg-log-overrides/short_pg_log 2-recovery-overrides/{more-async-partial-recovery} 3-scrub-overrides/{max-simultaneous-scrubs-2} backoff/peering_and_degraded ceph clusters/{fixed-2 openstack} crc-failures/default d-balancer/on mon_election/connectivity msgr-failures/osd-delay msgr/async-v2only objectstore/bluestore-comp-zlib rados supported-random-distro$/{centos_8} thrashers/default thrashosds-health workloads/snaps-few-objects-balanced} | 2 | |
Failure Reason:
Command failed on smithi044 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995518 | 2022-08-27 04:04:19 | 2022-08-27 05:23:33 | 2022-08-27 05:43:24 | 0:19:51 | 0:07:44 | 0:12:07 | smithi | main | ubuntu | 20.04 | rados/monthrash/{ceph clusters/9-mons mon_election/classic msgr-failures/few msgr/async objectstore/bluestore-comp-lz4 rados supported-random-distro$/{ubuntu_latest} thrashers/sync workloads/rados_mon_workunits} | 2 | |
Failure Reason:
Command failed on smithi085 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995519 | 2022-08-27 04:04:20 | 2022-08-27 05:26:04 | 2022-08-27 05:40:17 | 0:14:13 | 0:07:10 | 0:07:03 | smithi | main | centos | 8.stream | rados/singleton/{all/rebuild-mondb mon_election/classic msgr-failures/none msgr/async objectstore/bluestore-hybrid rados supported-random-distro$/{centos_8}} | 1 | |
Failure Reason:
Command failed on smithi163 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995520 | 2022-08-27 04:04:21 | 2022-08-27 05:26:45 | 2022-08-27 05:41:44 | 0:14:59 | 0:07:17 | 0:07:42 | smithi | main | centos | 8.stream | rados/thrash-erasure-code-overwrites/{bluestore-bitmap ceph clusters/{fixed-2 openstack} fast/fast mon_election/classic msgr-failures/osd-delay rados recovery-overrides/{more-async-partial-recovery} supported-random-distro$/{centos_8} thrashers/fastread thrashosds-health workloads/ec-small-objects-fast-read-overwrites} | 2 | |
Failure Reason:
Command failed on smithi107 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995521 | 2022-08-27 04:04:22 | 2022-08-27 05:26:45 | 2022-08-27 05:39:40 | 0:12:55 | 0:07:15 | 0:05:40 | smithi | main | centos | 8.stream | rados/cephadm/workunits/{0-distro/centos_8.stream_container_tools_crun agent/off mon_election/connectivity task/test_orch_cli} | 1 | |
Failure Reason:
Command failed on smithi081 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995522 | 2022-08-27 04:04:24 | 2022-08-27 05:26:46 | 2022-08-27 05:46:11 | 0:19:25 | 0:07:47 | 0:11:38 | smithi | main | ubuntu | 20.04 | rados/basic/{ceph clusters/{fixed-2 openstack} mon_election/classic msgr-failures/many msgr/async-v1only objectstore/bluestore-comp-snappy rados supported-random-distro$/{ubuntu_latest} tasks/rados_stress_watch} | 2 | |
Failure Reason:
Command failed on smithi138 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995523 | 2022-08-27 04:04:25 | 2022-08-27 05:28:36 | 2022-08-27 05:45:00 | 0:16:24 | 0:09:11 | 0:07:13 | smithi | main | rhel | 8.6 | rados/thrash-erasure-code/{ceph clusters/{fixed-2 openstack} fast/fast mon_election/connectivity msgr-failures/osd-dispatch-delay objectstore/bluestore-low-osd-mem-target rados recovery-overrides/{default} supported-random-distro$/{rhel_8} thrashers/default thrashosds-health workloads/ec-small-objects-many-deletes} | 2 | |
Failure Reason:
Command failed on smithi080 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995524 | 2022-08-27 04:04:26 | 2022-08-27 05:29:27 | 2022-08-27 05:46:42 | 0:17:15 | 0:07:12 | 0:10:03 | smithi | main | ubuntu | 20.04 | rados/singleton-nomsgr/{all/osd_stale_reads mon_election/connectivity rados supported-random-distro$/{ubuntu_latest}} | 1 | |
Failure Reason:
Command failed on smithi061 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995525 | 2022-08-27 04:04:27 | 2022-08-27 05:29:27 | 2022-08-27 05:43:36 | 0:14:09 | 0:09:16 | 0:04:53 | smithi | main | rhel | 8.6 | rados/standalone/{supported-random-distro$/{rhel_8} workloads/scrub} | 1 | |
Failure Reason:
Command failed on smithi037 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995526 | 2022-08-27 04:04:28 | 2022-08-27 05:29:27 | 2022-08-27 05:46:37 | 0:17:10 | 0:07:33 | 0:09:37 | smithi | main | ubuntu | 20.04 | rados/singleton/{all/recovery-preemption mon_election/connectivity msgr-failures/few msgr/async-v1only objectstore/bluestore-low-osd-mem-target rados supported-random-distro$/{ubuntu_latest}} | 1 | |
Failure Reason:
Command failed on smithi201 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995527 | 2022-08-27 04:04:29 | 2022-08-27 05:29:28 | 2022-08-27 05:46:57 | 0:17:29 | 0:07:51 | 0:09:38 | smithi | main | ubuntu | 20.04 | rados/thrash/{0-size-min-size-overrides/2-size-2-min-size 1-pg-log-overrides/normal_pg_log 2-recovery-overrides/{more-partial-recovery} 3-scrub-overrides/{max-simultaneous-scrubs-3} backoff/normal ceph clusters/{fixed-2 openstack} crc-failures/bad_map_crc_failure d-balancer/crush-compat mon_election/classic msgr-failures/osd-dispatch-delay msgr/async objectstore/bluestore-comp-zstd rados supported-random-distro$/{ubuntu_latest} thrashers/mapgap thrashosds-health workloads/snaps-few-objects-localized} | 2 | |
Failure Reason:
Command failed on smithi104 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995528 | 2022-08-27 04:04:31 | 2022-08-27 05:29:38 | 2022-08-27 05:45:37 | 0:15:59 | 0:09:17 | 0:06:42 | smithi | main | rhel | 8.6 | rados/thrash-erasure-code-shec/{ceph clusters/{fixed-4 openstack} mon_election/connectivity msgr-failures/few objectstore/bluestore-stupid rados recovery-overrides/{more-active-recovery} supported-random-distro$/{rhel_8} thrashers/careful thrashosds-health workloads/ec-rados-plugin=shec-k=4-m=3-c=2} | 4 | |
Failure Reason:
Command failed on smithi152 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995529 | 2022-08-27 04:04:32 | 2022-08-27 05:29:58 | 2022-08-27 05:52:49 | 0:22:51 | 0:13:29 | 0:09:22 | smithi | main | centos | 8.stream | rados/cephadm/smoke/{0-distro/centos_8.stream_container_tools 0-nvme-loop agent/on fixed-2 mon_election/connectivity start} | 2 | |
Failure Reason:
SSH connection to smithi057 was lost: 'sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 41100c5e-25cb-11ed-8431-001a4aab830c --force' |
fail | 6995530 | 2022-08-27 04:04:33 | 2022-08-27 05:31:19 | 2022-08-27 05:47:44 | 0:16:25 | 0:07:26 | 0:08:59 | smithi | main | centos | 8.stream | rados/verify/{centos_latest ceph clusters/{fixed-2 openstack} d-thrash/none mon_election/connectivity msgr-failures/few msgr/async-v1only objectstore/bluestore-stupid rados tasks/mon_recovery validater/valgrind} | 2 | |
Failure Reason:
Command failed on smithi114 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995531 | 2022-08-27 04:04:34 | 2022-08-27 05:32:50 | 2022-08-27 05:46:09 | 0:13:19 | 0:07:19 | 0:06:00 | smithi | main | centos | 8.stream | rados/objectstore/{backends/objectstore-filestore-memstore supported-random-distro$/{centos_8}} | 1 | |
Failure Reason:
Command failed on smithi088 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995532 | 2022-08-27 04:04:35 | 2022-08-27 05:32:50 | 2022-08-27 05:49:27 | 0:16:37 | 0:07:28 | 0:09:09 | smithi | main | centos | 8.stream | rados/thrash-old-clients/{0-distro$/{centos_8.stream_container_tools} 0-size-min-size-overrides/3-size-2-min-size 1-install/nautilus-v2only backoff/normal ceph clusters/{openstack three-plus-one} d-balancer/crush-compat mon_election/connectivity msgr-failures/few rados thrashers/none thrashosds-health workloads/radosbench} | 3 | |
Failure Reason:
Command failed on smithi105 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995533 | 2022-08-27 04:04:36 | 2022-08-27 05:34:01 | 2022-08-27 05:51:21 | 0:17:20 | 0:07:02 | 0:10:18 | smithi | main | ubuntu | 20.04 | rados/perf/{ceph mon_election/connectivity objectstore/bluestore-stupid openstack scheduler/wpq_default_shards settings/optimized ubuntu_latest workloads/fio_4M_rand_rw} | 1 | |
Failure Reason:
Command failed on smithi079 with status 1: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0' |
fail | 6995534 | 2022-08-27 04:04:37 | 2022-08-27 05:34:01 | 2022-08-27 05:52:19 | 0:18:18 | 0:07:18 | 0:11:00 | smithi | main | centos | 8.stream | rados/singleton-nomsgr/{all/pool-access mon_election/classic rados supported-random-distro$/{centos_8}} | 1 | |
Failure Reason:
Command failed on smithi027 with status 1: 'sudo yum -y install ceph-test' |
fail | 6995535 | 2022-08-27 04:04:39 | 2022-08-27 05:36:52 | 2022-08-27 05:55:20 | 0:18:28 | 0:07:41 | 0:10:47 | smithi | main | ubuntu | 20.04 | rados/singleton/{all/resolve_stuck_peering mon_election/classic msgr-failures/many msgr/async-v2only objectstore/bluestore-stupid rados supported-random-distro$/{ubuntu_latest}} | 2 | |
Failure Reason:
{'smithi174.front.sepia.ceph.com': {'_ansible_no_log': False, 'msg': "Failed to connect to the host via ssh: Warning: Permanently added 'smithi174.front.sepia.ceph.com,172.21.15.174' (ECDSA) to the list of known hosts.\r\nubuntu@smithi174.front.sepia.ceph.com: Permission denied (publickey,password,keyboard-interactive)."}, 'smithi191.front.sepia.ceph.com': {'_ansible_no_log': False, 'msg': "Failed to connect to the host via ssh: Warning: Permanently added 'smithi191.front.sepia.ceph.com,172.21.15.191' (ECDSA) to the list of known hosts.\r\nubuntu@smithi191.front.sepia.ceph.com: Permission denied (publickey,password,keyboard-interactive)."}} |
fail | 6995536 | 2022-08-27 04:04:40 | 2022-08-27 05:37:52 | 2022-08-27 05:54:42 | 0:16:50 | 0:08:29 | 0:08:21 | smithi | main | rhel | 8.6 | rados/cephadm/workunits/{0-distro/rhel_8.6_container_tools_3.0 agent/on mon_election/classic task/test_orch_cli_mon} | 5 | |
Failure Reason:
{'smithi044.front.sepia.ceph.com': {'_ansible_no_log': False, 'msg': "Failed to connect to the host via ssh: Warning: Permanently added 'smithi044.front.sepia.ceph.com,172.21.15.44' (ECDSA) to the list of known hosts.\r\nubuntu@smithi044.front.sepia.ceph.com: Permission denied (publickey,password,keyboard-interactive)."}, 'smithi003.front.sepia.ceph.com': {'_ansible_no_log': False, 'msg': "Failed to connect to the host via ssh: Warning: Permanently added 'smithi003.front.sepia.ceph.com,172.21.15.3' (ECDSA) to the list of known hosts.\r\nubuntu@smithi003.front.sepia.ceph.com: Permission denied (publickey,password,keyboard-interactive)."}, 'smithi096.front.sepia.ceph.com': {'_ansible_no_log': False, 'msg': "Failed to connect to the host via ssh: Warning: Permanently added 'smithi096.front.sepia.ceph.com,172.21.15.96' (ECDSA) to the list of known hosts.\r\nubuntu@smithi096.front.sepia.ceph.com: Permission denied (publickey,password,keyboard-interactive)."}, 'smithi081.front.sepia.ceph.com': {'_ansible_no_log': False, 'msg': "Failed to connect to the host via ssh: Warning: Permanently added 'smithi081.front.sepia.ceph.com,172.21.15.81' (ECDSA) to the list of known hosts.\r\nubuntu@smithi081.front.sepia.ceph.com: Permission denied (publickey,password,keyboard-interactive)."}, 'smithi179.front.sepia.ceph.com': {'_ansible_no_log': False, 'msg': "Failed to connect to the host via ssh: Warning: Permanently added 'smithi179.front.sepia.ceph.com,172.21.15.179' (ECDSA) to the list of known hosts.\r\nubuntu@smithi179.front.sepia.ceph.com: Permission denied (publickey,password,keyboard-interactive)."}} |
dead | 6995537 | 2022-08-27 04:04:41 | 2022-08-27 05:39:43 | 2022-08-27 06:01:40 | 0:21:57 | 0:14:49 | 0:07:08 | smithi | main | rhel | 8.6 | rados/thrash-erasure-code-big/{ceph cluster/{12-osds openstack} mon_election/classic msgr-failures/fastclose objectstore/bluestore-low-osd-mem-target rados recovery-overrides/{more-async-partial-recovery} supported-random-distro$/{rhel_8} thrashers/mapgap thrashosds-health workloads/ec-rados-plugin=jerasure-k=4-m=2} | 3 | |
Failure Reason:
{'smithi070.front.sepia.ceph.com': {'_ansible_no_log': False, 'changed': False, 'failures': ['No package qemu-kvm-block-rbd available.'], 'invocation': {'module_args': {'allow_downgrade': False, 'allowerasing': False, 'autoremove': False, 'bugfix': False, 'conf_file': None, 'disable_excludes': None, 'disable_gpg_check': False, 'disable_plugin': [], 'disablerepo': [], 'download_dir': None, 'download_only': False, 'enable_plugin': [], 'enablerepo': [], 'exclude': [], 'install_repoquery': True, 'install_weak_deps': True, 'installroot': '/', 'list': None, 'lock_timeout': 30, 'name': ['dnf-utils', 'git-all', 'sysstat', 'libedit', 'boost-thread', 'xfsprogs', 'gdisk', 'parted', 'libgcrypt', 'fuse-libs', 'openssl', 'libuuid', 'attr', 'ant', 'lsof', 'gettext', 'bc', 'xfsdump', 'blktrace', 'usbredir', 'podman', 'redhat-lsb', 'firewalld', 'wget', 'libev-devel', 'valgrind', 'nfs-utils', 'ncurses-devel', 'gcc', 'git', 'make', 'python3-nose', 'python3-virtualenv', 'genisoimage', 'qemu-img', 'qemu-kvm-core', 'qemu-kvm-block-rbd', 'libacl-devel', 'autoconf', 'gdb', 'iozone', 'lvm2'], 'releasever': None, 'security': False, 'skip_broken': False, 'state': 'present', 'update_cache': False, 'update_only': False, 'validate_certs': True}}, 'msg': 'Failed to install some of the specified packages', 'rc': 1, 'results': []}, 'smithi190.front.sepia.ceph.com': {'_ansible_no_log': False, 'changed': False, 'failures': ['No package qemu-kvm-block-rbd available.'], 'invocation': {'module_args': {'allow_downgrade': False, 'allowerasing': False, 'autoremove': False, 'bugfix': False, 'conf_file': None, 'disable_excludes': None, 'disable_gpg_check': False, 'disable_plugin': [], 'disablerepo': [], 'download_dir': None, 'download_only': False, 'enable_plugin': [], 'enablerepo': [], 'exclude': [], 'install_repoquery': True, 'install_weak_deps': True, 'installroot': '/', 'list': None, 'lock_timeout': 30, 'name': ['dnf-utils', 'git-all', 'sysstat', 'libedit', 'boost-thread', 'xfsprogs', 'gdisk', 'parted', 'libgcrypt', 'fuse-libs', 'openssl', 'libuuid', 'attr', 'ant', 'lsof', 'gettext', 'bc', 'xfsdump', 'blktrace', 'usbredir', 'podman', 'redhat-lsb', 'firewalld', 'wget', 'libev-devel', 'valgrind', 'nfs-utils', 'ncurses-devel', 'gcc', 'git', 'make', 'python3-nose', 'python3-virtualenv', 'genisoimage', 'qemu-img', 'qemu-kvm-core', 'qemu-kvm-block-rbd', 'libacl-devel', 'autoconf', 'gdb', 'iozone', 'lvm2'], 'releasever': None, 'security': False, 'skip_broken': False, 'state': 'present', 'update_cache': False, 'update_only': False, 'validate_certs': True}}, 'msg': 'Failed to install some of the specified packages', 'rc': 1, 'results': []}} |
||||||||||||||
fail | 6995538 | 2022-08-27 04:04:42 | 2022-08-27 05:40:54 | 2022-08-27 05:59:25 | 0:18:31 | 0:09:48 | 0:08:43 | smithi | main | centos | 8.stream | rados/thrash-erasure-code-isa/{arch/x86_64 ceph clusters/{fixed-2 openstack} mon_election/classic msgr-failures/fastclose objectstore/bluestore-low-osd-mem-target rados recovery-overrides/{more-active-recovery} supported-random-distro$/{centos_8} thrashers/morepggrow thrashosds-health workloads/ec-rados-plugin=isa-k=2-m=1} | 2 | |
Failure Reason:
{'smithi163.front.sepia.ceph.com': {'_ansible_no_log': False, 'msg': "Failed to connect to the host via ssh: Warning: Permanently added 'smithi163.front.sepia.ceph.com,172.21.15.163' (ECDSA) to the list of known hosts.\r\nubuntu@smithi163.front.sepia.ceph.com: Permission denied (publickey,password,keyboard-interactive)."}, 'smithi049.front.sepia.ceph.com': {'changed': False, 'msg': 'All items completed', 'results': [{'_ansible_item_label': 'zcerza', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'zcerza', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi049', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'zcerza', 'name': 'zcerza', 'state': 'absent'}, {'_ansible_item_label': 'aschoen', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'aschoen', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi049', 'ssh_key_file': None, 'ssh_key_passphrase': None, 
'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'aschoen', 'name': 'aschoen', 'state': 'absent'}, {'_ansible_item_label': 'andrew', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'andrew', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi049', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'andrew', 'name': 'andrew', 'state': 'absent'}, {'_ansible_item_label': 'sweil', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'sweil', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi049', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'sweil', 'name': 'sweil', 'state': 'absent'}, {'_ansible_item_label': 'brad', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 
'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'brad', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi049', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'brad', 'name': 'brad', 'state': 'absent'}, {'_ansible_item_label': 'kefu', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'kefu', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi049', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'kefu', 'name': 'kefu', 'state': 'absent'}, {'_ansible_item_label': 'shylesh', 'ansible_loop_var': 'item', 'item': 'shylesh', 'msg': 'Failed to connect to the host via ssh: ssh: connect to host smithi049.front.sepia.ceph.com port 22: No route to host', 'unreachable': True}, {'_ansible_item_label': 'gmeno', 'ansible_loop_var': 'item', 'item': 'gmeno', 'msg': 'Failed to connect to the host via ssh: ssh: connect to host 
smithi049.front.sepia.ceph.com port 22: No route to host', 'unreachable': True}, {'_ansible_item_label': 'alfredodeza', 'ansible_loop_var': 'item', 'item': 'alfredodeza', 'msg': "Failed to connect to the host via ssh: Warning: Permanently added 'smithi049.front.sepia.ceph.com,172.21.15.49' (ECDSA) to the list of known hosts.\r\nubuntu@smithi049.front.sepia.ceph.com: Permission denied (publickey,password,keyboard-interactive).", 'unreachable': True}, {'_ansible_item_label': 'vumrao', 'ansible_loop_var': 'item', 'item': 'vumrao', 'msg': "Failed to connect to the host via ssh: Warning: Permanently added 'smithi049.front.sepia.ceph.com,172.21.15.49' (ECDSA) to the list of known hosts.\r\nubuntu@smithi049.front.sepia.ceph.com: Permission denied (publickey,password,keyboard-interactive).", 'unreachable': True}, {'_ansible_item_label': 'trhoden', 'ansible_loop_var': 'item', 'item': 'trhoden', 'msg': 'Failed to connect to the host via ssh: ssh: connect to host smithi049.front.sepia.ceph.com port 22: No route to host', 'unreachable': True}, {'_ansible_item_label': 'nishtha', 'ansible_loop_var': 'item', 'item': 'nishtha', 'msg': 'Failed to connect to the host via ssh: ssh: connect to host smithi049.front.sepia.ceph.com port 22: No route to host', 'unreachable': True}, {'_ansible_item_label': 'yguang', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'yguang', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 
'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'yguang', 'name': 'yguang', 'state': 'absent'}, {'_ansible_item_label': 'sdieffen', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'sdieffen', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'sdieffen', 'name': 'sdieffen', 'state': 'absent'}, {'_ansible_item_label': 'brian', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'brian', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'brian', 'name': 'brian', 'state': 'absent'}, {'_ansible_item_label': 'pmcgarry', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': 
False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'pmcgarry', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'pmcgarry', 'name': 'pmcgarry', 'state': 'absent'}, {'_ansible_item_label': 'karnan', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'karnan', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'karnan', 'name': 'karnan', 'state': 'absent'}, {'_ansible_item_label': 'ryneli', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': 
None, 'login_class': None, 'move_home': False, 'name': 'ryneli', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'ryneli', 'name': 'ryneli', 'state': 'absent'}, {'_ansible_item_label': 'dlambrig', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'dlambrig', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'dlambrig', 'name': 'dlambrig', 'state': 'absent'}, {'_ansible_item_label': 'icolle', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'icolle', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 
'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'icolle', 'name': 'icolle', 'state': 'absent'}, {'_ansible_item_label': 'soumya', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'soumya', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'soumya', 'name': 'soumya', 'state': 'absent'}, {'_ansible_item_label': 'jspray', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'jspray', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'jspray', 'name': 'jspray', 'state': 'absent'}, 
{'_ansible_item_label': 'erwan', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'erwan', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'erwan', 'name': 'erwan', 'state': 'absent'}, {'_ansible_item_label': 'jj', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'jj', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'jj', 'name': 'jj', 'state': 'absent'}, {'_ansible_item_label': 'amarangone', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 
'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'amarangone', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'amarangone', 'name': 'amarangone', 'state': 'absent'}, {'_ansible_item_label': 'oprypin', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'oprypin', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'oprypin', 'name': 'oprypin', 'state': 'absent'}, {'_ansible_item_label': 'adamyanova', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'adamyanova', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 
'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'adamyanova', 'name': 'adamyanova', 'state': 'absent'}, {'_ansible_item_label': 'sbillah', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'sbillah', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'sbillah', 'name': 'sbillah', 'state': 'absent'}, {'_ansible_item_label': 'onyb', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'onyb', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': 
False, 'uid': None, 'update_password': 'always'}}, 'item': 'onyb', 'name': 'onyb', 'state': 'absent'}, {'_ansible_item_label': 'jwilliamson', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'jwilliamson', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'jwilliamson', 'name': 'jwilliamson', 'state': 'absent'}, {'_ansible_item_label': 'kmroz', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'kmroz', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'kmroz', 'name': 'kmroz', 'state': 'absent'}, {'_ansible_item_label': 'shehbazj', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': 
{'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'shehbazj', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'shehbazj', 'name': 'shehbazj', 'state': 'absent'}, {'_ansible_item_label': 'abhishekvrshny', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'abhishekvrshny', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi101', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'abhishekvrshny', 'name': 'abhishekvrshny', 'state': 'absent'}, {'_ansible_item_label': 'asheplyakov', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': 
None, 'login_class': None, 'move_home': False, 'name': 'asheplyakov', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi049.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'asheplyakov', 'name': 'asheplyakov', 'state': 'absent'}, {'_ansible_item_label': 'liupan', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'liupan', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi049.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'liupan', 'name': 'liupan', 'state': 'absent'}, {'_ansible_item_label': 'adeza', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'adeza', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 
'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi049.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'adeza', 'name': 'adeza', 'state': 'absent'}, [identical results ('changed': False, 'failed': False, 'state': 'absent', same module_args) repeated for each of: pranith, dorinda, zyan, jdillaman, davidz, wusui, nwatkins, sidharthanup, varsha, hmunjulu, jlopez, dfuller, vasu, swagner, dpivonka, nlevine, tbrekke, taco, louis, amarango, oobe, rturk, fche, jbainbri, kdhananj]]}} |
||||||||||||||
fail | 6995539 | 2022-08-27 04:04:43 | 2022-08-27 05:40:54 | 2022-08-27 05:57:01 | 0:16:07 | 0:06:39 | 0:09:28 | smithi | main | ubuntu | 20.04 | rados/thrash/{0-size-min-size-overrides/3-size-2-min-size 1-pg-log-overrides/short_pg_log 2-recovery-overrides/{default} 3-scrub-overrides/{max-simultaneous-scrubs-2} backoff/peering ceph clusters/{fixed-2 openstack} crc-failures/default d-balancer/on mon_election/connectivity msgr-failures/fastclose msgr/async-v1only objectstore/bluestore-hybrid rados supported-random-distro$/{ubuntu_latest} thrashers/morepggrow thrashosds-health workloads/snaps-few-objects} | 2 | |
Failure Reason:
{'smithi188.front.sepia.ceph.com': {'_ansible_no_log': False, 'msg': 'Failed to connect to the host via ssh: ssh: connect to host smithi188.front.sepia.ceph.com port 22: No route to host'}, 'smithi107.front.sepia.ceph.com': {'_ansible_no_log': False, 'msg': 'Failed to set execute bit on remote files (rc: -13, err: )'}} |
||||||||||||||
fail | 6995540 | 2022-08-27 04:04:44 | 2022-08-27 05:41:45 | 2022-08-27 05:58:40 | 0:16:55 | 0:09:44 | 0:07:11 | smithi | main | centos | 8.stream | rados/mgr/{clusters/{2-node-mgr} debug/mgr mgr_ttl_cache/enable mon_election/classic random-objectstore$/{bluestore-comp-zstd} supported-random-distro$/{centos_8} tasks/progress} | 2 | |
Failure Reason:
{'smithi032.front.sepia.ceph.com': {'_ansible_no_log': False, 'msg': "failed to transfer file to /home/teuthworker/.ansible/tmp/ansible-local-25222148lq_5/tmpa6bcbbjm /home/ubuntu/.ansible/tmp/ansible-tmp-1661579450.3441963-15216-45921812265481/AnsiballZ_stat.py:\n\ndd: failed to open '/home/ubuntu/.ansible/tmp/ansible-tmp-1661579450.3441963-15216-45921812265481/AnsiballZ_stat.py': No such file or directory\n"}, 'smithi192.front.sepia.ceph.com': {'changed': False, 'msg': 'Data could not be sent to remote host "smithi192.front.sepia.ceph.com". Make sure this host can be reached over ssh: ssh: connect to host smithi192.front.sepia.ceph.com port 22: Connection refused\r\n', 'unreachable': True}} |
||||||||||||||
fail | 6995541 | 2022-08-27 04:04:46 | 2022-08-27 05:41:55 | 2022-08-27 05:56:21 | 0:14:26 | 0:07:21 | 0:07:05 | smithi | main | rhel | 8.6 | rados/multimon/{clusters/6 mon_election/connectivity msgr-failures/many msgr/async-v2only no_pools objectstore/bluestore-low-osd-mem-target rados supported-random-distro$/{rhel_8} tasks/mon_clock_with_skews} | 2 | |
Failure Reason:
{'smithi189.front.sepia.ceph.com': {'_ansible_no_log': False, 'msg': "Failed to connect to the host via ssh: Warning: Permanently added 'smithi189.front.sepia.ceph.com,172.21.15.189' (ECDSA) to the list of known hosts.\r\nubuntu@smithi189.front.sepia.ceph.com: Permission denied (publickey,password,keyboard-interactive)."}, 'smithi085.front.sepia.ceph.com': {'_ansible_no_log': False, 'msg': "failed to transfer file to /home/teuthworker/.ansible/tmp/ansible-local-215837i9z1kni/tmpatfaudn7 /home/ubuntu/.ansible/tmp/ansible-tmp-1661579450.0133066-6828-69788993104262/AnsiballZ_authorized_key.py:\n\ndd: failed to open '/home/ubuntu/.ansible/tmp/ansible-tmp-1661579450.0133066-6828-69788993104262/AnsiballZ_authorized_key.py': No such file or directory\n"}} |
||||||||||||||
dead | 6995542 | 2022-08-27 04:04:47 | 2022-08-27 05:43:26 | 2022-08-27 05:52:39 | 0:09:13 | smithi | main | centos | 8.stream | rados/cephadm/osds/{0-distro/centos_8.stream_container_tools 0-nvme-loop 1-start 2-ops/rmdir-reactivate} | 2 | |||
Failure Reason:
SSH connection to smithi066 was lost: 'sudo yum install -y kernel' |
||||||||||||||
dead | 6995543 | 2022-08-27 04:04:48 | 2022-08-27 05:45:06 | 2022-08-27 05:54:33 | 0:09:27 | smithi | main | ubuntu | 20.04 | rados/singleton-nomsgr/{all/recovery-unfound-found mon_election/connectivity rados supported-random-distro$/{ubuntu_latest}} | 1 | |||
Failure Reason:
SSH connection to smithi037 was lost: 'sudo DEBIAN_FRONTEND=noninteractive apt-get -y install linux-image-generic' |
||||||||||||||
dead | 6995544 | 2022-08-27 04:04:49 | 2022-08-27 05:45:07 | 2022-08-27 05:52:42 | 0:07:35 | smithi | main | centos | 8.stream | rados/singleton/{all/test-crash mon_election/connectivity msgr-failures/none msgr/async objectstore/filestore-xfs rados supported-random-distro$/{centos_8}} | 1 | |||
Failure Reason:
SSH connection to smithi115 was lost: 'sudo yum install -y kernel' |
||||||||||||||
fail | 6995545 | 2022-08-27 04:04:50 | 2022-08-27 05:45:47 | 2022-08-27 05:52:18 | 0:06:31 | smithi | main | centos | 8.stream | rados/basic/{ceph clusters/{fixed-2 openstack} mon_election/connectivity msgr-failures/few msgr/async-v2only objectstore/bluestore-comp-zlib rados supported-random-distro$/{centos_8} tasks/rados_striper} | 2 | |||
Failure Reason:
Cannot connect to remote host smithi008 |
||||||||||||||
fail | 6995546 | 2022-08-27 04:04:51 | 2022-08-27 05:45:48 | 2022-08-27 05:58:54 | 0:13:06 | smithi | main | rhel | 8.6 | rados/thrash/{0-size-min-size-overrides/2-size-2-min-size 1-pg-log-overrides/normal_pg_log 2-recovery-overrides/{more-async-recovery} 3-scrub-overrides/{max-simultaneous-scrubs-2} backoff/peering_and_degraded ceph clusters/{fixed-2 openstack} crc-failures/bad_map_crc_failure d-balancer/crush-compat mon_election/classic msgr-failures/few msgr/async-v2only objectstore/bluestore-low-osd-mem-target rados supported-random-distro$/{rhel_8} thrashers/none thrashosds-health workloads/write_fadvise_dontneed} | 2 | |||
Failure Reason:
machine smithi152.front.sepia.ceph.com is locked by scheduled_teuthology@teuthology, not scheduled_kchai@teuthology |
||||||||||||||
dead | 6995547 | 2022-08-27 04:04:53 | 2022-08-27 05:45:48 | 2022-08-27 05:50:49 | 0:05:01 | smithi | main | centos | 8.stream | rados/cephadm/smoke/{0-distro/centos_8.stream_container_tools_crun 0-nvme-loop agent/off fixed-2 mon_election/classic start} | 2 | |||
Failure Reason:
Error reimaging machines: SSH connection to smithi039 was lost: "while [ ! -e '/.cephlab_net_configured' ]; do sleep 5; done" |
||||||||||||||
fail | 6995548 | 2022-08-27 04:04:54 | 2022-08-27 05:45:48 | 2022-08-27 05:58:46 | 0:12:58 | smithi | main | ubuntu | 20.04 | rados/monthrash/{ceph clusters/3-mons mon_election/connectivity msgr-failures/mon-delay msgr/async-v1only objectstore/bluestore-comp-snappy rados supported-random-distro$/{ubuntu_latest} thrashers/force-sync-many workloads/snaps-few-objects} | 2 | |||
Failure Reason:
machine smithi138.front.sepia.ceph.com is locked by scheduled_teuthology@teuthology, not scheduled_kchai@teuthology |
||||||||||||||
fail | 6995549 | 2022-08-27 04:04:55 | 2022-08-27 05:46:19 | 2022-08-27 05:58:56 | 0:12:37 | smithi | main | ubuntu | 20.04 | rados/rook/smoke/{0-distro/ubuntu_20.04 0-kubeadm 0-nvme-loop 1-rook 2-workload/none cluster/3-node k8s/1.21 net/host rook/1.7.2} | 3 | |||
Failure Reason:
machine smithi088.front.sepia.ceph.com is locked by scheduled_teuthology@teuthology, not scheduled_kchai@teuthology |
||||||||||||||
fail | 6995550 | 2022-08-27 04:04:56 | 2022-08-27 05:46:49 | 2022-08-27 05:57:47 | 0:10:58 | smithi | main | ubuntu | 20.04 | rados/singleton/{all/test-noautoscale-flag mon_election/classic msgr-failures/few msgr/async-v1only objectstore/bluestore-bitmap rados supported-random-distro$/{ubuntu_latest}} | 1 | |||
Failure Reason:
machine smithi139.front.sepia.ceph.com is locked by scheduled_teuthology@teuthology, not scheduled_kchai@teuthology |
||||||||||||||
fail | 6995551 | 2022-08-27 04:04:57 | 2022-08-27 05:47:00 | 2022-08-27 05:57:34 | 0:10:34 | smithi | main | centos | 8.stream | rados/singleton-nomsgr/{all/version-number-sanity mon_election/classic rados supported-random-distro$/{centos_8}} | 1 | |||
Failure Reason:
machine smithi104.front.sepia.ceph.com is locked by scheduled_teuthology@teuthology, not scheduled_kchai@teuthology |
||||||||||||||
fail | 6995552 | 2022-08-27 04:04:58 | 2022-08-27 05:47:00 | 2022-08-27 05:58:46 | 0:11:46 | smithi | main | ubuntu | 20.04 | rados/perf/{ceph mon_election/classic objectstore/bluestore-basic-min-osd-mem-target openstack scheduler/dmclock_1Shard_16Threads settings/optimized ubuntu_latest workloads/fio_4M_rand_write} | 1 | |||
Failure Reason:
machine smithi125.front.sepia.ceph.com is locked by scheduled_teuthology@teuthology, not scheduled_kchai@teuthology |
||||||||||||||
fail | 6995553 | 2022-08-27 04:05:00 | 2022-08-27 05:47:51 | 2022-08-27 05:59:06 | 0:11:15 | smithi | main | rhel | 8.6 | rados/thrash-erasure-code/{ceph clusters/{fixed-2 openstack} fast/normal mon_election/classic msgr-failures/fastclose objectstore/bluestore-stupid rados recovery-overrides/{more-async-recovery} supported-random-distro$/{rhel_8} thrashers/fastread thrashosds-health workloads/ec-small-objects} | 2 | |||
Failure Reason:
machine smithi178.front.sepia.ceph.com is locked by scheduled_teuthology@teuthology, not scheduled_kchai@teuthology |
||||||||||||||
fail | 6995554 | 2022-08-27 04:05:01 | 2022-08-27 05:49:31 | 2022-08-27 06:05:37 | 0:16:06 | 0:07:57 | 0:08:09 | smithi | main | rhel | 8.6 | rados/cephadm/workunits/{0-distro/rhel_8.6_container_tools_rhel8 agent/off mon_election/connectivity task/test_adoption} | 1 | |
Failure Reason:
{'smithi105.front.sepia.ceph.com': {'_ansible_no_log': False, 'msg': "Failed to connect to the host via ssh: Warning: Permanently added 'smithi105.front.sepia.ceph.com,172.21.15.105' (ECDSA) to the list of known hosts.\r\nubuntu@smithi105.front.sepia.ceph.com: Permission denied (publickey,password,keyboard-interactive)."}} |
||||||||||||||
fail | 6995555 | 2022-08-27 04:05:02 | 2022-08-27 05:49:32 | 2022-08-27 06:01:42 | 0:12:10 | 0:04:43 | 0:07:27 | smithi | main | centos | 8.stream | rados/thrash/{0-size-min-size-overrides/3-size-2-min-size 1-pg-log-overrides/short_pg_log 2-recovery-overrides/{more-async-recovery} 3-scrub-overrides/{max-simultaneous-scrubs-3} backoff/normal ceph clusters/{fixed-2 openstack} crc-failures/default d-balancer/on mon_election/connectivity msgr-failures/osd-delay msgr/async objectstore/bluestore-stupid rados supported-random-distro$/{centos_8} thrashers/pggrow thrashosds-health workloads/admin_socket_objecter_requests} | 2 | |
Failure Reason:
{'smithi061.front.sepia.ceph.com': {'_ansible_no_log': False, 'msg': 'Failed to connect to the host via ssh: ssh: connect to host smithi061.front.sepia.ceph.com port 22: No route to host'}, 'smithi105.front.sepia.ceph.com': {'_ansible_no_log': False, 'changed': False, 'invocation': {'module_args': {'ignore_selinux_state': False, 'name': 'nagios_run_sudo', 'persistent': True, 'state': True}}, 'msg': 'Failed to manage policy for boolean nagios_run_sudo: [Errno 11] Resource temporarily unavailable'}} |
||||||||||||||
unknown | 6995568 | 2022-08-27 04:05:18 | 2022-08-27 04:05:18 | smithi | main | ubuntu | 20.04 | rados/thrash-erasure-code-big/{ceph cluster/{12-osds openstack} mon_election/connectivity msgr-failures/few objectstore/bluestore-stupid rados recovery-overrides/{more-async-recovery} supported-random-distro$/{ubuntu_latest} thrashers/morepggrow thrashosds-health workloads/ec-rados-plugin=lrc-k=4-m=2-l=3} | — |