Status  Job ID  Links  Posted  Started  Updated  Runtime  Duration  In Waiting  Machine  Teuthology Branch  OS Type  OS Version  Description  Nodes
pass 4609289 2019-12-16 22:09:11 2019-12-16 22:10:19 2019-12-16 22:50:19 0:40:00 0:29:01 0:10:59 smithi master centos 8.0 rados/basic/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{centos_8.yaml} tasks/rados_workunit_loadgen_mostlyread.yaml} 2
pass 4609290 2019-12-16 22:09:12 2019-12-16 22:10:20 2019-12-16 22:50:19 0:39:59 0:28:14 0:11:45 smithi master centos 8.0 rados/thrash-erasure-code/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/fast.yaml msgr-failures/few.yaml objectstore/bluestore-comp.yaml rados.yaml recovery-overrides/{more-active-recovery.yaml} supported-random-distro$/{centos_8.yaml} thrashers/minsize_recovery.yaml thrashosds-health.yaml workloads/ec-small-objects-fast-read.yaml} 2
pass 4609291 2019-12-16 22:09:13 2019-12-16 22:10:21 2019-12-16 22:32:20 0:21:59 0:10:43 0:11:16 smithi master rhel 8.0 rados/mgr/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/bluestore-comp.yaml supported-random-distro$/{rhel_8.yaml} tasks/workunits.yaml} 2
pass 4609292 2019-12-16 22:09:15 2019-12-16 22:10:22 2019-12-16 22:30:21 0:19:59 0:11:44 0:08:15 smithi master ubuntu 18.04 rados/thrash/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-pg-log-overrides/normal_pg_log.yaml 2-recovery-overrides/{more-async-partial-recovery.yaml} backoff/normal.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/off.yaml msgr-failures/osd-delay.yaml msgr/async.yaml objectstore/filestore-xfs.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml} thrashers/none.yaml thrashosds-health.yaml workloads/redirect.yaml} 2
pass 4609293 2019-12-16 22:09:16 2019-12-16 22:12:09 2019-12-16 22:30:08 0:17:59 0:10:51 0:07:08 smithi master rhel 8.0 rados/singleton/{all/pg-removal-interruption.yaml msgr-failures/few.yaml msgr/async.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 1
fail 4609294 2019-12-16 22:09:17 2019-12-16 22:12:09 2019-12-16 22:32:08 0:19:59 0:10:31 0:09:28 smithi master centos 8.0 rados/perf/{ceph.yaml objectstore/bluestore-stupid.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{centos_8.yaml} workloads/radosbench_4K_rand_read.yaml} 1
Failure Reason:

Command failed on smithi157 with status 1: 'sudo yum -y install python3-yaml python3-lxml librbd-devel pdsh collectl'

fail 4609295 2019-12-16 22:09:18 2019-12-16 22:12:09 2019-12-16 23:02:09 0:50:00 0:04:26 0:45:34 smithi master centos 8.0 rados/thrash-old-clients/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-install/mimic.yaml backoff/normal.yaml ceph.yaml clusters/{openstack.yaml three-plus-one.yaml} d-balancer/crush-compat.yaml distro$/{centos_latest.yaml} msgr-failures/few.yaml rados.yaml thrashers/default.yaml thrashosds-health.yaml workloads/rbd_cls.yaml} 4
Failure Reason:

Failed to fetch package version from https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=default&distros=centos%2F8%2Fx86_64&ref=mimic
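The "Failed to fetch package version" failures (here and for the nautilus-based thrash-old-clients jobs below) all query the same shaman search endpoint with a CentOS 8 distro filter. As a minimal sketch of how that URL is composed (no network access; the function name and defaults are illustrative, not teuthology's actual helper), the query string in the log can be reproduced with the standard library:

```python
from urllib.parse import urlencode

def shaman_search_url(ref: str, distro: str = "centos/8/x86_64",
                      project: str = "ceph", flavor: str = "default") -> str:
    """Build a shaman package-search URL like the one in the failure reason.

    Note: urlencode percent-encodes the slashes in the distro string,
    which is why the log shows distros=centos%2F8%2Fx86_64.
    """
    base = "https://shaman.ceph.com/api/search/"
    params = urlencode({"status": "ready", "project": project,
                        "flavor": flavor, "distros": distro, "ref": ref})
    return f"{base}?{params}"

# Reconstructs the URL from this job's failure reason.
print(shaman_search_url("mimic"))
```

The recurring pattern in this run is that only `ref=mimic` and `ref=nautilus` queries fail, consistent with no ready builds of those older branches for the centos/8 distro filter.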

fail 4609296 2019-12-16 22:09:19 2019-12-16 22:12:21 2019-12-16 23:08:20 0:55:59 0:45:27 0:10:32 smithi master centos 8.0 rados/dashboard/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/bluestore-stupid.yaml supported-random-distro$/{centos_8.yaml} tasks/dashboard.yaml} 2
Failure Reason:

Test failure: test_all (tasks.mgr.dashboard.test_rgw.RgwBucketTest)

pass 4609297 2019-12-16 22:09:20 2019-12-16 22:12:21 2019-12-16 22:32:20 0:19:59 0:10:06 0:09:53 smithi master centos 8.0 rados/objectstore/{backends/keyvaluedb.yaml supported-random-distro$/{centos_8.yaml}} 1
pass 4609298 2019-12-16 22:09:21 2019-12-16 22:12:25 2019-12-16 22:36:24 0:23:59 0:17:10 0:06:49 smithi master rhel 8.0 rados/singleton-nomsgr/{all/msgr.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 1
pass 4609299 2019-12-16 22:09:22 2019-12-16 23:11:27 2019-12-16 23:47:26 0:35:59 0:28:24 0:07:35 smithi master ubuntu 18.04 rados/singleton/{all/radostool.yaml msgr-failures/many.yaml msgr/async-v1only.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
pass 4609300 2019-12-16 22:09:23 2019-12-16 23:12:14 2019-12-16 23:34:13 0:21:59 0:14:39 0:07:20 smithi master rhel 8.0 rados/thrash/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-pg-log-overrides/short_pg_log.yaml 2-recovery-overrides/{more-partial-recovery.yaml} backoff/peering.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/upmap.yaml msgr-failures/fastclose.yaml msgr/async-v1only.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} thrashers/pggrow.yaml thrashosds-health.yaml workloads/redirect_promote_tests.yaml} 2
fail 4609301 2019-12-16 22:09:24 2019-12-16 23:13:57 2019-12-16 23:33:56 0:19:59 0:13:24 0:06:35 smithi master rhel 8.0 rados/basic/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/many.yaml msgr/async.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} tasks/readwrite.yaml} 2
Failure Reason:

SELinux denials found on ubuntu@smithi137.front.sepia.ceph.com: ['type=AVC msg=audit(1576538955.813:6532): avc: denied { map } for pid=28482 comm="rhsmcertd-worke" path="/var/lib/rpm/__db.001" dev="sda1" ino=262271 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576538955.889:6534): avc: denied { write } for pid=28482 comm="rhsmcertd-worke" name="dnf" dev="sda1" ino=60792 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1', 'type=AVC msg=audit(1576538955.812:6529): avc: denied { open } for pid=28482 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576538956.001:6536): avc: denied { setattr } for pid=28482 comm="rhsmcertd-worke" name="de4244cf84e6507539211d9a8fd50f353114f46361f6f5ac6a537392f8d8282d-primary.xml.gz" dev="sda1" ino=262180 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576538955.889:6534): avc: denied { add_name } for pid=28482 comm="rhsmcertd-worke" name="metadata_lock.pid" scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1', 'type=AVC msg=audit(1576538955.812:6531): avc: denied { getattr } for pid=28482 comm="rhsmcertd-worke" path="/var/lib/rpm/__db.001" dev="sda1" ino=262271 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576538955.812:6529): avc: denied { read write } for pid=28482 comm="rhsmcertd-worke" name=".dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576538956.242:6538): avc: denied { 
read } for pid=28482 comm="rhsmcertd-worke" name="satellite-5-client.module" dev="sda1" ino=57237 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:root_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576538956.242:6538): avc: denied { open } for pid=28482 comm="rhsmcertd-worke" path="/etc/dnf/modules.d/satellite-5-client.module" dev="sda1" ino=57237 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:root_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576538955.888:6533): avc: denied { open } for pid=28482 comm="rhsmcertd-worke" path="/var/log/hawkey.log" dev="sda1" ino=60817 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:var_log_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576538956.158:6537): avc: denied { remove_name } for pid=28482 comm="rhsmcertd-worke" name="metadata_lock.pid" dev="sda1" ino=59646 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1', 'type=AVC msg=audit(1576538955.812:6530): avc: denied { lock } for pid=28482 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576538956.450:6541): avc: denied { map } for pid=28583 comm="setroubleshootd" path="/var/lib/rpm/Name" dev="sda1" ino=262251 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576538955.889:6534): avc: denied { open } for pid=28482 comm="rhsmcertd-worke" path="/var/cache/dnf/metadata_lock.pid" dev="sda1" ino=59646 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576538956.450:6539): avc: denied { open } for pid=28583 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=262250 
scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576538956.158:6537): avc: denied { unlink } for pid=28482 comm="rhsmcertd-worke" name="metadata_lock.pid" dev="sda1" ino=59646 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576538955.889:6534): avc: denied { create } for pid=28482 comm="rhsmcertd-worke" name="metadata_lock.pid" scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576538956.450:6540): avc: denied { lock } for pid=28583 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576538956.450:6539): avc: denied { read } for pid=28583 comm="setroubleshootd" name="Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576538956.001:6535): avc: denied { open } for pid=28482 comm="rhsmcertd-worke" path="/var/cache/dnf/ceph-8a6e408a4cef42be/repodata/repomd.xml" dev="sda1" ino=262158 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1']

pass 4609302 2019-12-16 22:09:26 2019-12-16 23:14:40 2019-12-16 23:48:39 0:33:59 0:23:38 0:10:21 smithi master centos 8.0 rados/singleton/{all/random-eio.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-stupid.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 2
fail 4609303 2019-12-16 22:09:27 2019-12-16 23:14:47 2019-12-17 04:18:50 5:04:03 4:57:41 0:06:22 smithi master rhel 8.0 rados/standalone/{supported-random-distro$/{rhel_8.yaml} workloads/osd.yaml} 1
Failure Reason:

Command failed (workunit test osd/osd-rep-recov-eio.sh) on smithi005 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8a58d0694cce968d9a21937d52c2e48cfe306455 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/osd/osd-rep-recov-eio.sh'

fail 4609304 2019-12-16 22:09:28 2019-12-16 23:16:23 2019-12-16 23:32:21 0:15:58 0:09:08 0:06:50 smithi master rhel 8.0 rados/perf/{ceph.yaml objectstore/bluestore-basic-min-osd-mem-target.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{rhel_8.yaml} workloads/radosbench_4K_seq_read.yaml} 1
Failure Reason:

Command failed on smithi139 with status 1: 'sudo yum -y install python3-yaml python3-lxml librbd-devel pdsh collectl'

pass 4609305 2019-12-16 22:09:29 2019-12-16 23:16:24 2019-12-16 23:46:23 0:29:59 0:07:01 0:22:58 smithi master ubuntu 18.04 rados/multimon/{clusters/3.yaml msgr-failures/few.yaml msgr/async-v1only.yaml no_pools.yaml objectstore/bluestore-stupid.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml} tasks/mon_clock_with_skews.yaml} 2
pass 4609306 2019-12-16 22:09:30 2019-12-16 23:20:12 2019-12-17 00:00:12 0:40:00 0:26:38 0:13:22 smithi master ubuntu 18.04 rados/monthrash/{ceph.yaml clusters/9-mons.yaml msgr-failures/few.yaml msgr/async-v1only.yaml objectstore/filestore-xfs.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml} thrashers/sync-many.yaml workloads/rados_mon_workunits.yaml} 2
fail 4609307 2019-12-16 22:09:31 2019-12-16 23:20:16 2019-12-16 23:52:15 0:31:59 0:24:04 0:07:55 smithi master rhel 8.0 rados/singleton-nomsgr/{all/multi-backfill-reject.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 2
Failure Reason:

SELinux denials found on ubuntu@smithi164.front.sepia.ceph.com: ['type=AVC msg=audit(1576540071.055:7033): avc: denied { read } for pid=30570 comm="rhsmcertd-worke" name="satellite-5-client.module" dev="sda1" ino=57237 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:root_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540070.637:7026): avc: denied { getattr } for pid=30570 comm="rhsmcertd-worke" path="/var/lib/rpm/__db.001" dev="sda1" ino=262271 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540071.240:7036): avc: denied { map } for pid=30624 comm="setroubleshootd" path="/var/lib/rpm/Name" dev="sda1" ino=262251 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540070.972:7032): avc: denied { remove_name } for pid=30570 comm="rhsmcertd-worke" name="metadata_lock.pid" dev="sda1" ino=59780 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1', 'type=AVC msg=audit(1576540070.636:7024): avc: denied { read write } for pid=30570 comm="rhsmcertd-worke" name=".dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540071.240:7034): avc: denied { read } for pid=30624 comm="setroubleshootd" name="Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540070.711:7029): avc: denied { open } for pid=30570 comm="rhsmcertd-worke" path="/var/cache/dnf/metadata_lock.pid" dev="sda1" ino=59780 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540070.711:7029): avc: denied { 
write } for pid=30570 comm="rhsmcertd-worke" name="dnf" dev="sda1" ino=60792 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1', 'type=AVC msg=audit(1576540071.240:7035): avc: denied { lock } for pid=30624 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540071.055:7033): avc: denied { open } for pid=30570 comm="rhsmcertd-worke" path="/etc/dnf/modules.d/satellite-5-client.module" dev="sda1" ino=57237 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:root_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540070.972:7032): avc: denied { unlink } for pid=30570 comm="rhsmcertd-worke" name="metadata_lock.pid" dev="sda1" ino=59780 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540070.711:7029): avc: denied { add_name } for pid=30570 comm="rhsmcertd-worke" name="metadata_lock.pid" scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1', 'type=AVC msg=audit(1576540070.821:7031): avc: denied { setattr } for pid=30570 comm="rhsmcertd-worke" name="de4244cf84e6507539211d9a8fd50f353114f46361f6f5ac6a537392f8d8282d-primary.xml.gz" dev="sda1" ino=262180 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540070.711:7028): avc: denied { open } for pid=30570 comm="rhsmcertd-worke" path="/var/log/hawkey.log" dev="sda1" ino=60817 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:var_log_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540070.711:7029): avc: denied { create } for pid=30570 comm="rhsmcertd-worke" name="metadata_lock.pid" 
scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540070.637:7027): avc: denied { map } for pid=30570 comm="rhsmcertd-worke" path="/var/lib/rpm/__db.001" dev="sda1" ino=262271 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540071.240:7034): avc: denied { open } for pid=30624 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540070.636:7024): avc: denied { open } for pid=30570 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540070.636:7025): avc: denied { lock } for pid=30570 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540070.821:7030): avc: denied { open } for pid=30570 comm="rhsmcertd-worke" path="/var/cache/dnf/ceph-8a6e408a4cef42be/repodata/repomd.xml" dev="sda1" ino=262154 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1']

pass 4609308 2019-12-16 22:09:32 2019-12-16 23:21:30 2019-12-16 23:39:29 0:17:59 0:08:41 0:09:18 smithi master centos 8.0 rados/mgr/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/bluestore-low-osd-mem-target.yaml supported-random-distro$/{centos_8.yaml} tasks/cephadm_orchestrator.yaml} 2
dead 4609309 2019-12-16 22:09:33 2019-12-16 23:21:30 2019-12-17 11:23:58 12:02:28 smithi master rhel 8.0 rados/thrash-erasure-code-overwrites/{bluestore-bitmap.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/normal.yaml msgr-failures/fastclose.yaml rados.yaml recovery-overrides/{more-partial-recovery.yaml} supported-random-distro$/{rhel_8.yaml} thrashers/careful.yaml thrashosds-health.yaml workloads/ec-small-objects-overwrites.yaml} 2
pass 4609310 2019-12-16 22:09:35 2019-12-16 23:21:30 2019-12-16 23:43:29 0:21:59 0:11:57 0:10:02 smithi master centos 8.0 rados/thrash-erasure-code-shec/{ceph.yaml clusters/{fixed-4.yaml openstack.yaml} msgr-failures/osd-delay.yaml objectstore/bluestore-avl.yaml rados.yaml recovery-overrides/{more-partial-recovery.yaml} supported-random-distro$/{centos_8.yaml} thrashers/careful.yaml thrashosds-health.yaml workloads/ec-rados-plugin=shec-k=4-m=3-c=2.yaml} 4
pass 4609311 2019-12-16 22:09:37 2019-12-16 23:22:07 2019-12-16 23:52:06 0:29:59 0:22:23 0:07:36 smithi master ubuntu 18.04 rados/thrash-erasure-code-isa/{arch/x86_64.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/few.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml recovery-overrides/{more-async-partial-recovery.yaml} supported-random-distro$/{ubuntu_latest.yaml} thrashers/none.yaml thrashosds-health.yaml workloads/ec-rados-plugin=isa-k=2-m=1.yaml} 2
pass 4609312 2019-12-16 22:09:38 2019-12-16 23:23:30 2019-12-16 23:59:30 0:36:00 0:26:14 0:09:46 smithi master ubuntu 18.04 rados/thrash-erasure-code-big/{ceph.yaml cluster/{12-osds.yaml openstack.yaml} msgr-failures/few.yaml objectstore/bluestore-bitmap.yaml rados.yaml recovery-overrides/{more-async-partial-recovery.yaml} supported-random-distro$/{ubuntu_latest.yaml} thrashers/morepggrow.yaml thrashosds-health.yaml workloads/ec-rados-plugin=jerasure-k=4-m=2.yaml} 3
pass 4609313 2019-12-16 22:09:39 2019-12-16 23:23:31 2019-12-16 23:55:30 0:31:59 0:24:03 0:07:56 smithi master rados/verify/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-thrash/default/{default.yaml thrashosds-health.yaml} msgr-failures/few.yaml msgr/async-v1only.yaml objectstore/bluestore-stupid.yaml rados.yaml tasks/rados_api_tests.yaml validater/lockdep.yaml} 2
pass 4609314 2019-12-16 22:09:40 2019-12-16 23:23:31 2019-12-16 23:43:30 0:19:59 0:11:10 0:08:49 smithi master ubuntu 18.04 rados/singleton/{all/rebuild-mondb.yaml msgr-failures/many.yaml msgr/async.yaml objectstore/filestore-xfs.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
pass 4609315 2019-12-16 22:09:41 2019-12-16 23:23:31 2019-12-16 23:47:30 0:23:59 0:16:42 0:07:17 smithi master rhel 8.0 rados/thrash/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-pg-log-overrides/normal_pg_log.yaml 2-recovery-overrides/{more-async-partial-recovery.yaml} backoff/peering_and_degraded.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/crush-compat.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} thrashers/careful.yaml thrashosds-health.yaml workloads/redirect_set_object.yaml} 2
fail 4609316 2019-12-16 22:09:42 2019-12-16 23:23:31 2019-12-16 23:45:30 0:21:59 0:04:26 0:17:33 smithi master centos 8.0 rados/thrash-old-clients/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-install/nautilus-v1only.yaml backoff/peering.yaml ceph.yaml clusters/{openstack.yaml three-plus-one.yaml} d-balancer/off.yaml distro$/{centos_latest.yaml} msgr-failures/osd-delay.yaml rados.yaml thrashers/mapgap.yaml thrashosds-health.yaml workloads/snaps-few-objects.yaml} 4
Failure Reason:

Failed to fetch package version from https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=default&distros=centos%2F8%2Fx86_64&ref=nautilus

fail 4609317 2019-12-16 22:09:43 2019-12-16 23:24:11 2019-12-16 23:38:10 0:13:59 0:06:27 0:07:32 smithi master ubuntu 18.04 rados/cephadm/{fixed-2.yaml mode/root.yaml msgr/async-v1only.yaml start.yaml supported-random-distro$/{ubuntu_latest.yaml} tasks/rados_python.yaml} 2
Failure Reason:

empty file

fail 4609318 2019-12-16 22:09:45 2019-12-16 23:24:14 2019-12-17 00:10:14 0:46:00 0:36:10 0:09:50 smithi master centos 8.0 rados/singleton-bluestore/{all/cephtool.yaml msgr-failures/many.yaml msgr/async-v1only.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 1
Failure Reason:

Command failed (workunit test cephtool/test.sh) on smithi069 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8a58d0694cce968d9a21937d52c2e48cfe306455 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh'

pass 4609319 2019-12-16 22:09:46 2019-12-16 23:24:24 2019-12-16 23:54:23 0:29:59 0:20:36 0:09:23 smithi master centos 8.0 rados/thrash-erasure-code/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/normal.yaml msgr-failures/osd-delay.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml recovery-overrides/{more-async-partial-recovery.yaml} supported-random-distro$/{centos_8.yaml} thrashers/morepggrow.yaml thrashosds-health.yaml workloads/ec-small-objects-many-deletes.yaml} 2
pass 4609320 2019-12-16 22:09:47 2019-12-16 23:25:15 2019-12-16 23:49:14 0:23:59 0:16:23 0:07:36 smithi master centos 8.0 rados/singleton/{all/recovery-preemption.yaml msgr-failures/few.yaml msgr/async-v1only.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 1
fail 4609321 2019-12-16 22:09:49 2019-12-16 23:26:24 2019-12-16 23:44:23 0:17:59 0:08:00 0:09:59 smithi master centos 8.0 rados/perf/{ceph.yaml objectstore/bluestore-bitmap.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{centos_8.yaml} workloads/radosbench_4M_rand_read.yaml} 1
Failure Reason:

Command failed on smithi109 with status 1: 'sudo yum -y install python3-yaml python3-lxml librbd-devel pdsh collectl'

pass 4609322 2019-12-16 22:09:52 2019-12-16 23:26:24 2019-12-16 23:52:23 0:25:59 0:16:09 0:09:50 smithi master centos 8.0 rados/basic/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/few.yaml msgr/async-v1only.yaml objectstore/bluestore-stupid.yaml rados.yaml supported-random-distro$/{centos_8.yaml} tasks/repair_test.yaml} 2
pass 4609323 2019-12-16 22:09:53 2019-12-16 23:26:43 2019-12-16 23:46:42 0:19:59 0:12:16 0:07:43 smithi master centos 8.0 rados/thrash/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-pg-log-overrides/short_pg_log.yaml 2-recovery-overrides/{more-async-recovery.yaml} backoff/normal.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/off.yaml msgr-failures/osd-delay.yaml msgr/async.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{centos_8.yaml} thrashers/default.yaml thrashosds-health.yaml workloads/set-chunk-promote-flush.yaml} 2
pass 4609324 2019-12-16 22:09:54 2019-12-16 23:27:14 2019-12-16 23:59:14 0:32:00 0:25:49 0:06:11 smithi master rhel 8.0 rados/objectstore/{backends/objectcacher-stress.yaml supported-random-distro$/{rhel_8.yaml}} 1
pass 4609325 2019-12-16 22:09:55 2019-12-16 23:27:14 2019-12-16 23:51:13 0:23:59 0:17:33 0:06:26 smithi master ubuntu 18.04 rados/singleton-nomsgr/{all/osd_stale_reads.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
pass 4609326 2019-12-16 22:09:56 2019-12-16 23:27:14 2019-12-16 23:43:13 0:15:59 0:08:25 0:07:34 smithi master ubuntu 18.04 rados/singleton/{all/resolve_stuck_peering.yaml msgr-failures/many.yaml msgr/async-v2only.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 2
fail 4609327 2019-12-16 22:09:58 2019-12-16 23:27:16 2019-12-17 00:01:15 0:33:59 0:22:11 0:11:48 smithi master ubuntu 18.04 rados/singleton/{all/test-crash.yaml msgr-failures/few.yaml msgr/async.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
Failure Reason:

Command failed (workunit test rados/test_crash.sh) on smithi153 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8a58d0694cce968d9a21937d52c2e48cfe306455 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rados/test_crash.sh'

pass 4609328 2019-12-16 22:09:59 2019-12-16 23:27:16 2019-12-16 23:51:16 0:24:00 0:11:34 0:12:26 smithi master rhel 8.0 rados/mgr/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/bluestore-stupid.yaml supported-random-distro$/{rhel_8.yaml} tasks/crash.yaml} 2
pass 4609329 2019-12-16 22:10:00 2019-12-16 23:27:54 2019-12-16 23:51:54 0:24:00 0:15:24 0:08:36 smithi master centos 8.0 rados/thrash/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-pg-log-overrides/normal_pg_log.yaml 2-recovery-overrides/{more-async-recovery.yaml} backoff/peering.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/upmap.yaml msgr-failures/fastclose.yaml msgr/async-v1only.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml supported-random-distro$/{centos_8.yaml} thrashers/mapgap.yaml thrashosds-health.yaml workloads/set-chunks-read.yaml} 2
fail 4609330 2019-12-16 22:10:01 2019-12-16 23:28:12 2019-12-16 23:48:11 0:19:59 0:09:16 0:10:43 smithi master rhel 8.0 rados/perf/{ceph.yaml objectstore/bluestore-comp.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{rhel_8.yaml} workloads/radosbench_4M_seq_read.yaml} 1
Failure Reason:

Command failed on smithi145 with status 1: 'sudo yum -y install python3-yaml python3-lxml librbd-devel pdsh collectl'

fail 4609331 2019-12-16 22:10:02 2019-12-16 23:28:14 2019-12-16 23:48:13 0:19:59 0:04:16 0:15:43 smithi master centos 8.0 rados/thrash-old-clients/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-install/nautilus-v2only.yaml backoff/peering_and_degraded.yaml ceph.yaml clusters/{openstack.yaml three-plus-one.yaml} d-balancer/crush-compat.yaml distro$/{centos_latest.yaml} msgr-failures/fastclose.yaml rados.yaml thrashers/morepggrow.yaml thrashosds-health.yaml workloads/test_rbd_api.yaml} 4
Failure Reason:

Failed to fetch package version from https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=default&distros=centos%2F8%2Fx86_64&ref=nautilus

fail 4609332 2019-12-16 22:10:03 2019-12-16 23:28:21 2019-12-17 00:16:21 0:48:00 0:38:41 0:09:19 smithi master rhel 8.0 rados/dashboard/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/filestore-xfs.yaml supported-random-distro$/{rhel_8.yaml} tasks/dashboard.yaml} 2
Failure Reason:

Test failure: test_all (tasks.mgr.dashboard.test_rgw.RgwBucketTest)

fail 4609333 2019-12-16 22:10:05 2019-12-16 23:28:42 2019-12-16 23:50:41 0:21:59 0:09:24 0:12:35 smithi master centos 8.0 rados/singleton-nomsgr/{all/pool-access.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 1
Failure Reason:

Command failed (workunit test rados/test_pool_access.sh) on smithi067 with status 127: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8a58d0694cce968d9a21937d52c2e48cfe306455 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rados/test_pool_access.sh'

pass 4609334 2019-12-16 22:10:06 2019-12-16 23:28:42 2019-12-16 23:54:41 0:25:59 0:15:59 0:10:00 smithi master ubuntu 18.04 rados/singleton/{all/test_envlibrados_for_rocksdb.yaml msgr-failures/many.yaml msgr/async-v1only.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
fail 4609335 2019-12-16 22:10:07 2019-12-16 23:30:32 2019-12-16 23:48:31 0:17:59 0:11:55 0:06:04 smithi master rhel 8.0 rados/basic/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/many.yaml msgr/async-v2only.yaml objectstore/filestore-xfs.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} tasks/rgw_snaps.yaml} 2
Failure Reason:

Command failed on smithi150 with status 1: 'cd /home/ubuntu/cephtest/s3-tests && ./bootstrap'

fail 4609336 2019-12-16 22:10:08 2019-12-16 23:33:12 2019-12-17 00:37:12 1:04:00 0:55:38 0:08:22 smithi master centos 8.0 rados/standalone/{supported-random-distro$/{centos_8.yaml} workloads/scrub.yaml} 1
Failure Reason:

Command failed (workunit test scrub/osd-scrub-repair.sh) on smithi200 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8a58d0694cce968d9a21937d52c2e48cfe306455 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh'

pass 4609337 2019-12-16 22:10:09 2019-12-16 23:33:14 2019-12-16 23:53:13 0:19:59 0:12:25 0:07:34 smithi master ubuntu 18.04 rados/multimon/{clusters/6.yaml msgr-failures/many.yaml msgr/async-v2only.yaml no_pools.yaml objectstore/filestore-xfs.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml} tasks/mon_recovery.yaml} 2
pass 4609338 2019-12-16 22:10:11 2019-12-16 23:34:08 2019-12-17 00:12:08 0:38:00 0:29:27 0:08:33 smithi master centos 8.0 rados/monthrash/{ceph.yaml clusters/3-mons.yaml msgr-failures/mon-delay.yaml msgr/async-v2only.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{centos_8.yaml} thrashers/sync.yaml workloads/snaps-few-objects.yaml} 2
dead 4609339 2019-12-16 22:10:12 2019-12-16 23:34:08 2019-12-17 11:36:37 12:02:29 11:55:23 0:07:06 smithi master rhel 8.0 rados/thrash-erasure-code-overwrites/{bluestore-bitmap.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/fast.yaml msgr-failures/few.yaml rados.yaml recovery-overrides/{default.yaml} supported-random-distro$/{rhel_8.yaml} thrashers/default.yaml thrashosds-health.yaml workloads/ec-snaps-few-objects-overwrites.yaml} 2
Failure Reason:

psutil.NoSuchProcess: process no longer exists (pid=17844)

pass 4609340 2019-12-16 22:10:13 2019-12-16 23:34:13 2019-12-17 00:06:14 0:32:01 0:11:46 0:20:15 smithi master ubuntu 18.04 rados/thrash-erasure-code-shec/{ceph.yaml clusters/{fixed-4.yaml openstack.yaml} msgr-failures/fastclose.yaml objectstore/bluestore-bitmap.yaml rados.yaml recovery-overrides/{default.yaml} supported-random-distro$/{ubuntu_latest.yaml} thrashers/default.yaml thrashosds-health.yaml workloads/ec-rados-plugin=shec-k=4-m=3-c=2.yaml} 4
pass 4609341 2019-12-16 22:10:14 2019-12-16 23:34:13 2019-12-17 00:06:14 0:32:01 0:25:21 0:06:40 smithi master ubuntu 18.04 rados/thrash-erasure-code-isa/{arch/x86_64.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/osd-delay.yaml objectstore/bluestore-stupid.yaml rados.yaml recovery-overrides/{more-active-recovery.yaml} supported-random-distro$/{ubuntu_latest.yaml} thrashers/pggrow.yaml thrashosds-health.yaml workloads/ec-rados-plugin=isa-k=2-m=1.yaml} 2
pass 4609342 2019-12-16 22:10:15 2019-12-16 23:34:14 2019-12-17 00:02:14 0:28:00 0:13:53 0:14:07 smithi master rhel 8.0 rados/thrash-erasure-code-big/{ceph.yaml cluster/{12-osds.yaml openstack.yaml} msgr-failures/osd-delay.yaml objectstore/bluestore-comp.yaml rados.yaml recovery-overrides/{more-async-recovery.yaml} supported-random-distro$/{rhel_8.yaml} thrashers/pggrow.yaml thrashosds-health.yaml workloads/ec-rados-plugin=lrc-k=4-m=2-l=3.yaml} 3
fail 4609343 2019-12-16 22:10:16 2019-12-16 23:34:19 2019-12-16 23:56:18 0:21:59 0:10:27 0:11:32 smithi master centos rados/verify/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-thrash/none.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/filestore-xfs.yaml rados.yaml tasks/rados_cls_all.yaml validater/valgrind.yaml} 2
Failure Reason:

Command failed on smithi084 with status 1: 'sudo yum -y install ceph'

pass 4609344 2019-12-16 22:10:18 2019-12-16 23:34:26 2019-12-17 00:54:27 1:20:01 1:11:33 0:08:28 smithi master centos 8.0 rados/singleton/{all/thrash-backfill-full.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-stupid.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 2
pass 4609345 2019-12-16 22:10:19 2019-12-16 23:34:47 2019-12-17 00:04:46 0:29:59 0:20:53 0:09:06 smithi master centos 8.0 rados/thrash/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-pg-log-overrides/short_pg_log.yaml 2-recovery-overrides/{more-active-recovery.yaml} backoff/peering_and_degraded.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/crush-compat.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-stupid.yaml rados.yaml supported-random-distro$/{centos_8.yaml} thrashers/morepggrow.yaml thrashosds-health.yaml workloads/small-objects.yaml} 2
pass 4609346 2019-12-16 22:10:20 2019-12-16 23:35:38 2019-12-17 00:03:37 0:27:59 0:20:21 0:07:38 smithi master ubuntu 18.04 rados/thrash-erasure-code/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/fast.yaml msgr-failures/fastclose.yaml objectstore/bluestore-stupid.yaml rados.yaml recovery-overrides/{default.yaml} supported-random-distro$/{ubuntu_latest.yaml} thrashers/pggrow.yaml thrashosds-health.yaml workloads/ec-small-objects.yaml} 2
fail 4609347 2019-12-16 22:10:21 2019-12-16 23:35:38 2019-12-16 23:49:37 0:13:59 0:07:10 0:06:49 smithi master ubuntu 18.04 rados/perf/{ceph.yaml objectstore/bluestore-low-osd-mem-target.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{ubuntu_latest.yaml} workloads/radosbench_4M_write.yaml} 1
Failure Reason:

Command failed on smithi026 with status 1: '/home/ubuntu/cephtest/cbt/cbt.py -a /home/ubuntu/cephtest/archive/cbt /home/ubuntu/cephtest/archive/cbt/cbt_config.yaml'

pass 4609348 2019-12-16 22:10:22 2019-12-16 23:36:09 2019-12-17 03:54:13 4:18:04 4:06:32 0:11:32 smithi master ubuntu 18.04 rados/objectstore/{backends/objectstore.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
pass 4609349 2019-12-16 22:10:24 2019-12-16 23:36:13 2019-12-17 00:10:14 0:34:01 0:27:58 0:06:03 smithi master ubuntu 18.04 rados/singleton-nomsgr/{all/recovery-unfound-found.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
dead 4609350 2019-12-16 22:10:25 2019-12-16 23:36:13 2019-12-17 11:38:41 12:02:28 smithi master rhel 8.0 rados/singleton/{all/thrash-eio.yaml msgr-failures/many.yaml msgr/async.yaml objectstore/filestore-xfs.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 2
pass 4609351 2019-12-16 22:10:26 2019-12-16 23:36:15 2019-12-16 23:58:14 0:21:59 0:12:12 0:09:47 smithi master centos 8.0 rados/mgr/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/filestore-xfs.yaml supported-random-distro$/{centos_8.yaml} tasks/failover.yaml} 2
pass 4609352 2019-12-16 22:10:27 2019-12-16 23:36:20 2019-12-17 00:10:19 0:33:59 0:22:48 0:11:11 smithi master centos 8.0 rados/thrash/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-pg-log-overrides/normal_pg_log.yaml 2-recovery-overrides/{more-partial-recovery.yaml} backoff/normal.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/off.yaml msgr-failures/osd-delay.yaml msgr/async.yaml objectstore/filestore-xfs.yaml rados.yaml supported-random-distro$/{centos_8.yaml} thrashers/none.yaml thrashosds-health.yaml workloads/snaps-few-objects.yaml} 2
fail 4609353 2019-12-16 22:10:28 2019-12-16 23:36:20 2019-12-16 23:52:19 0:15:59 0:04:15 0:11:44 smithi master centos 8.0 rados/thrash-old-clients/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-install/nautilus.yaml backoff/normal.yaml ceph.yaml clusters/{openstack.yaml three-plus-one.yaml} d-balancer/off.yaml distro$/{centos_latest.yaml} msgr-failures/few.yaml rados.yaml thrashers/none.yaml thrashosds-health.yaml workloads/cache-snaps.yaml} 4
Failure Reason:

Failed to fetch package version from https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=default&distros=centos%2F8%2Fx86_64&ref=nautilus

pass 4609354 2019-12-16 22:10:29 2019-12-16 23:36:27 2019-12-17 00:04:26 0:27:59 0:19:50 0:08:09 smithi master ubuntu 18.04 rados/singleton/{all/thrash-rados/{thrash-rados.yaml thrashosds-health.yaml} msgr-failures/few.yaml msgr/async-v1only.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 2
pass 4609355 2019-12-16 22:10:30 2019-12-16 23:37:17 2019-12-16 23:55:16 0:17:59 0:10:48 0:07:11 smithi master ubuntu 18.04 rados/basic/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/few.yaml msgr/async.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml} tasks/scrub_test.yaml} 2
pass 4609356 2019-12-16 22:10:32 2019-12-16 23:38:05 2019-12-17 00:06:04 0:27:59 0:20:43 0:07:16 smithi master ubuntu 18.04 rados/perf/{ceph.yaml objectstore/bluestore-stupid.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{ubuntu_latest.yaml} workloads/radosbench_omap_write.yaml} 1
pass 4609357 2019-12-16 22:10:33 2019-12-16 23:38:31 2019-12-16 23:54:30 0:15:59 0:08:50 0:07:09 smithi master ubuntu 18.04 rados/singleton-nomsgr/{all/version-number-sanity.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
pass 4609358 2019-12-16 22:10:34 2019-12-16 23:38:31 2019-12-17 00:12:30 0:33:59 0:20:07 0:13:52 smithi master centos 8.0 rados/singleton/{all/thrash_cache_writeback_proxy_none.yaml msgr-failures/many.yaml msgr/async-v2only.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 2
pass 4609359 2019-12-16 22:10:35 2019-12-16 23:39:17 2019-12-17 00:05:17 0:26:00 0:15:41 0:10:19 smithi master rhel 8.0 rados/thrash/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-pg-log-overrides/short_pg_log.yaml 2-recovery-overrides/{more-partial-recovery.yaml} backoff/peering.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/upmap.yaml msgr-failures/fastclose.yaml msgr/async-v1only.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} thrashers/pggrow.yaml thrashosds-health.yaml workloads/write_fadvise_dontneed.yaml} 2
pass 4609360 2019-12-16 22:10:36 2019-12-16 23:39:49 2019-12-16 23:59:48 0:19:59 0:08:29 0:11:30 smithi master centos 8.0 rados/singleton/{all/watch-notify-same-primary.yaml msgr-failures/few.yaml msgr/async.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 1
fail 4609361 2019-12-16 22:10:37 2019-12-16 23:40:17 2019-12-16 23:58:16 0:17:59 0:07:56 0:10:03 smithi master centos 8.0 rados/perf/{ceph.yaml objectstore/bluestore-basic-min-osd-mem-target.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{centos_8.yaml} workloads/sample_fio.yaml} 1
Failure Reason:

Command failed on smithi102 with status 1: 'sudo yum -y install python3-yaml python3-lxml librbd-devel pdsh collectl'

fail 4609362 2019-12-16 22:10:38 2019-12-16 23:41:13 2019-12-17 03:17:15 3:36:02 3:27:12 0:08:50 smithi master rhel 8.0 rados/basic/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/many.yaml msgr/async.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} tasks/rados_api_tests.yaml} 2
Failure Reason:

Command failed (workunit test rados/test_pool_quota.sh) on smithi181 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8a58d0694cce968d9a21937d52c2e48cfe306455 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rados/test_pool_quota.sh'

fail 4609363 2019-12-16 22:10:39 2019-12-16 23:41:16 2019-12-16 23:57:15 0:15:59 0:08:35 0:07:24 smithi master rhel 8.0 rados/cephadm/{fixed-2.yaml mode/packaged.yaml msgr/async-v2only.yaml start.yaml supported-random-distro$/{rhel_8.yaml} tasks/rados_api_tests.yaml} 2
Failure Reason:

Command failed on smithi125 with status 1: 'sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 82d47aae-205f-11ea-8271-001a4aab830c --force'

fail 4609364 2019-12-16 22:10:40 2019-12-16 23:43:37 2019-12-17 00:29:36 0:45:59 0:39:05 0:06:54 smithi master rhel 8.0 rados/dashboard/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/bluestore-avl.yaml supported-random-distro$/{rhel_8.yaml} tasks/dashboard.yaml} 2
Failure Reason:

Test failure: test_all (tasks.mgr.dashboard.test_rgw.RgwBucketTest)

pass 4609365 2019-12-16 22:10:42 2019-12-16 23:43:37 2019-12-17 00:03:36 0:19:59 0:13:39 0:06:20 smithi master rhel 8.0 rados/mgr/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/bluestore-avl.yaml supported-random-distro$/{rhel_8.yaml} tasks/insights.yaml} 2
pass 4609366 2019-12-16 22:10:43 2019-12-16 23:43:37 2019-12-17 00:05:36 0:21:59 0:14:34 0:07:25 smithi master ubuntu 18.04 rados/monthrash/{ceph.yaml clusters/9-mons.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml} thrashers/force-sync-many.yaml workloads/pool-create-delete.yaml} 2
pass 4609367 2019-12-16 22:10:44 2019-12-16 23:43:37 2019-12-17 00:17:36 0:33:59 0:24:03 0:09:56 smithi master centos 8.0 rados/multimon/{clusters/21.yaml msgr-failures/few.yaml msgr/async.yaml no_pools.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{centos_8.yaml} tasks/mon_recovery.yaml} 3
pass 4609368 2019-12-16 22:10:45 2019-12-16 23:44:24 2019-12-16 23:58:23 0:13:59 0:08:11 0:05:48 smithi master ubuntu 18.04 rados/objectstore/{backends/alloc-hint.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
fail 4609369 2019-12-16 22:10:46 2019-12-16 23:45:50 2019-12-17 00:01:49 0:15:59 0:09:00 0:06:59 smithi master ubuntu 18.04 rados/rest/{mgr-restful.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
Failure Reason:

Command failed (workunit test rest/test-restful.sh) on smithi149 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.a/client.a/tmp && cd -- /home/ubuntu/cephtest/mnt.a/client.a/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8a58d0694cce968d9a21937d52c2e48cfe306455 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="a" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.a CEPH_ROOT=/home/ubuntu/cephtest/clone.client.a adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.a/qa/workunits/rest/test-restful.sh'

pass 4609370 2019-12-16 22:10:47 2019-12-16 23:45:58 2019-12-17 00:01:57 0:15:59 0:09:46 0:06:13 smithi master rhel 8.0 rados/singleton/{all/admin-socket.yaml msgr-failures/many.yaml msgr/async-v1only.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 1
fail 4609371 2019-12-16 22:10:48 2019-12-16 23:46:13 2019-12-17 00:28:14 0:42:01 0:33:22 0:08:39 smithi master centos 8.0 rados/singleton-bluestore/{all/cephtool.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 1
Failure Reason:

Command failed (workunit test cephtool/test.sh) on smithi073 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8a58d0694cce968d9a21937d52c2e48cfe306455 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh'

fail 4609372 2019-12-16 22:10:50 2019-12-16 23:46:24 2019-12-17 00:04:23 0:17:59 0:10:25 0:07:34 smithi master centos rados/singleton-flat/valgrind-leaks.yaml 1
Failure Reason:

Command failed on smithi013 with status 1: 'sudo yum -y install ceph'

fail 4609373 2019-12-16 22:10:51 2019-12-16 23:46:43 2019-12-17 00:06:42 0:19:59 0:13:05 0:06:54 smithi master ubuntu 18.04 rados/singleton-nomsgr/{all/admin_socket_output.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
Failure Reason:

"2019-12-16T23:57:14.307820+0000 mon.a (mon.0) 181 : cluster [WRN] Health check failed: 1 filesystem is degraded (FS_DEGRADED)" in cluster log

pass 4609374 2019-12-16 22:10:52 2019-12-16 23:47:08 2019-12-17 00:05:07 0:17:59 0:10:13 0:07:46 smithi master ubuntu 18.04 rados/standalone/{supported-random-distro$/{ubuntu_latest.yaml} workloads/crush.yaml} 1
fail 4609375 2019-12-16 22:10:53 2019-12-16 23:47:28 2019-12-17 00:11:27 0:23:59 0:17:12 0:06:47 smithi master rhel 8.0 rados/thrash/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-pg-log-overrides/normal_pg_log.yaml 2-recovery-overrides/{more-active-recovery.yaml} backoff/peering_and_degraded.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/upmap.yaml msgr-failures/fastclose.yaml msgr/async-v1only.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} thrashers/careful.yaml thrashosds-health.yaml workloads/admin_socket_objecter_requests.yaml} 2
Failure Reason:

Command failed on smithi037 with status 127: '/home/ubuntu/cephtest/admin_socket_client.0/objecter_requests'

pass 4609376 2019-12-16 22:10:54 2019-12-16 23:47:32 2019-12-17 00:21:31 0:33:59 0:25:21 0:08:38 smithi master centos 8.0 rados/thrash-erasure-code/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/normal.yaml msgr-failures/few.yaml objectstore/bluestore-stupid.yaml rados.yaml recovery-overrides/{more-async-partial-recovery.yaml} supported-random-distro$/{centos_8.yaml} thrashers/careful.yaml thrashosds-health.yaml workloads/ec-rados-plugin=clay-k=4-m=2.yaml} 2
fail 4609377 2019-12-16 22:10:55 2019-12-16 23:48:29 2019-12-17 08:46:39 8:58:10 8:50:41 0:07:29 smithi master rhel 8.0 rados/thrash-erasure-code-big/{ceph.yaml cluster/{12-osds.yaml openstack.yaml} msgr-failures/fastclose.yaml objectstore/bluestore-comp.yaml rados.yaml recovery-overrides/{more-async-partial-recovery.yaml} supported-random-distro$/{rhel_8.yaml} thrashers/careful.yaml thrashosds-health.yaml workloads/ec-rados-plugin=jerasure-k=4-m=2.yaml} 3
Failure Reason:

Command failed on smithi026 with status 124: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json'

dead 4609378 2019-12-16 22:10:56 2019-12-16 23:48:30 2019-12-17 11:51:00 12:02:30 smithi master centos 8.0 rados/thrash-erasure-code-isa/{arch/x86_64.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/fastclose.yaml objectstore/bluestore-stupid.yaml rados.yaml recovery-overrides/{more-partial-recovery.yaml} supported-random-distro$/{centos_8.yaml} thrashers/careful.yaml thrashosds-health.yaml workloads/ec-rados-plugin=isa-k=2-m=1.yaml} 2
fail 4609379 2019-12-16 22:10:58 2019-12-16 23:48:30 2019-12-17 03:46:33 3:58:03 3:48:29 0:09:34 smithi master centos 8.0 rados/thrash-erasure-code-overwrites/{bluestore-bitmap.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/normal.yaml msgr-failures/osd-delay.yaml rados.yaml recovery-overrides/{more-async-recovery.yaml} supported-random-distro$/{centos_8.yaml} thrashers/fastread.yaml thrashosds-health.yaml workloads/ec-pool-snaps-few-objects-overwrites.yaml} 2
Failure Reason:

Command failed on smithi081 with status 124: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json'

pass 4609380 2019-12-16 22:10:59 2019-12-16 23:48:30 2019-12-17 00:08:29 0:19:59 0:11:57 0:08:02 smithi master ubuntu 18.04 rados/thrash-erasure-code-shec/{ceph.yaml clusters/{fixed-4.yaml openstack.yaml} msgr-failures/few.yaml objectstore/bluestore-comp.yaml rados.yaml recovery-overrides/{more-partial-recovery.yaml} supported-random-distro$/{ubuntu_latest.yaml} thrashers/careful.yaml thrashosds-health.yaml workloads/ec-rados-plugin=shec-k=4-m=3-c=2.yaml} 4
fail 4609381 2019-12-16 22:11:00 2019-12-16 23:48:32 2019-12-17 00:06:31 0:17:59 0:04:18 0:13:41 smithi master centos 8.0 rados/thrash-old-clients/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-install/hammer.yaml backoff/peering.yaml ceph.yaml clusters/{openstack.yaml three-plus-one.yaml} d-balancer/crush-compat.yaml distro$/{centos_latest.yaml} msgr-failures/few.yaml rados.yaml thrashers/careful.yaml thrashosds-health.yaml workloads/cache-snaps.yaml} 4
Failure Reason:

Failed to fetch package version from https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=default&distros=centos%2F8%2Fx86_64&ref=hammer

fail 4609382 2019-12-16 22:11:01 2019-12-16 23:48:41 2019-12-17 01:00:41 1:12:00 1:02:02 0:09:58 smithi master ubuntu 18.04 rados/upgrade/mimic-x-singleton/{0-cluster/{openstack.yaml start.yaml} 1-install/mimic.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 5-workload/{radosbench.yaml rbd_api.yaml} 6-finish-upgrade.yaml 7-nautilus.yaml 8-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} bluestore-bitmap.yaml supported-random-distro$/{ubuntu_latest.yaml} thrashosds-health.yaml ubuntu_latest.yaml} 4
Failure Reason:

Command failed (workunit test rbd/test_librbd_python.sh) on smithi098 with status 127: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8a58d0694cce968d9a21937d52c2e48cfe306455 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rbd/test_librbd_python.sh'

pass 4609383 2019-12-16 22:11:02 2019-12-16 23:49:15 2019-12-17 00:09:15 0:20:00 0:12:28 0:07:32 smithi master rados/verify/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-thrash/default/{default.yaml thrashosds-health.yaml} msgr-failures/few.yaml msgr/async.yaml objectstore/bluestore-avl.yaml rados.yaml tasks/rados_cls_all.yaml validater/lockdep.yaml} 2
pass 4609384 2019-12-16 22:11:03 2019-12-16 23:49:59 2019-12-17 00:05:58 0:15:59 0:08:44 0:07:15 smithi master ubuntu 18.04 rados/singleton/{all/deduptool.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-stupid.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
fail 4609385 2019-12-16 22:11:05 2019-12-16 23:50:43 2019-12-17 00:06:42 0:15:59 0:09:19 0:06:40 smithi master rhel 8.0 rados/perf/{ceph.yaml objectstore/bluestore-bitmap.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{rhel_8.yaml} workloads/sample_radosbench.yaml} 1
Failure Reason:

Command failed on smithi152 with status 1: 'sudo yum -y install python3-yaml python3-lxml librbd-devel pdsh collectl'

pass 4609386 2019-12-16 22:11:06 2019-12-16 23:51:34 2019-12-17 00:15:34 0:24:00 0:16:37 0:07:23 smithi master ubuntu 18.04 rados/thrash/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-pg-log-overrides/short_pg_log.yaml 2-recovery-overrides/{more-partial-recovery.yaml} backoff/normal.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/crush-compat.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml} thrashers/default.yaml thrashosds-health.yaml workloads/cache-agent-big.yaml} 2
pass 4609387 2019-12-16 22:11:07 2019-12-16 23:51:35 2019-12-17 00:13:34 0:21:59 0:15:23 0:06:36 smithi master rhel 8.0 rados/singleton/{all/divergent_priors.yaml msgr-failures/many.yaml msgr/async.yaml objectstore/filestore-xfs.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 1
pass 4609388 2019-12-16 22:11:08 2019-12-16 23:51:56 2019-12-17 00:11:55 0:19:59 0:09:53 0:10:06 smithi master centos 8.0 rados/singleton-nomsgr/{all/balancer.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 1
pass 4609389 2019-12-16 22:11:09 2019-12-16 23:52:08 2019-12-17 00:12:07 0:19:59 0:12:17 0:07:42 smithi master centos 8.0 rados/basic/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/few.yaml msgr/async-v1only.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{centos_8.yaml} tasks/rados_cls_all.yaml} 2
fail 4609390 2019-12-16 22:11:11 2019-12-16 23:52:17 2019-12-17 00:10:16 0:17:59 0:10:00 0:07:59 smithi master centos 8.0 rados/singleton/{all/divergent_priors2.yaml msgr-failures/few.yaml msgr/async-v1only.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 1
Failure Reason:

SELinux denials found on ubuntu@smithi175.front.sepia.ceph.com: ['type=AVC msg=audit(1576540858.851:3645): avc: denied { lock } for pid=15188 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=657 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540857.848:3621): avc: denied { open } for pid=15150 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=657 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540857.848:3621): avc: denied { read } for pid=15150 comm="setroubleshootd" name="Packages" dev="sda1" ino=657 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540858.851:3646): avc: denied { map } for pid=15188 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=658 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540857.848:3623): avc: denied { map } for pid=15150 comm="setroubleshootd" path="/var/lib/rpm/Name" dev="sda1" ino=658 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540857.848:3622): avc: denied { lock } for pid=15150 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=657 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540858.850:3644): avc: denied { open } for pid=15188 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=657 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576540858.850:3644): avc: denied { read } for pid=15188 comm="rpm" name="Packages" dev="sda1" ino=657 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1']

fail 4609391 2019-12-16 22:11:12 2019-12-16 23:52:20 2019-12-17 00:12:20 0:20:00 0:04:46 0:15:14 smithi master centos 8.0 rados/thrash-old-clients/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-install/jewel-v1only.yaml backoff/peering_and_degraded.yaml ceph.yaml clusters/{openstack.yaml three-plus-one.yaml} d-balancer/off.yaml distro$/{centos_latest.yaml} msgr-failures/osd-delay.yaml rados.yaml thrashers/default.yaml thrashosds-health.yaml workloads/radosbench.yaml} 4
Failure Reason:

Failed to fetch package version from https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=default&distros=centos%2F8%2Fx86_64&ref=jewel

fail 4609392 2019-12-16 22:11:13 2019-12-16 23:52:24 2019-12-17 00:08:24 0:16:00 0:08:04 0:07:56 smithi master centos 8.0 rados/perf/{ceph.yaml objectstore/bluestore-bitmap.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{centos_8.yaml} workloads/cosbench_64K_read_write.yaml} 1
Failure Reason:

Command failed on smithi040 with status 1: 'sudo yum -y install python3-yaml python3-lxml librbd-devel pdsh collectl'

fail 4609393 2019-12-16 22:11:14 2019-12-16 23:53:36 2019-12-17 00:17:34 0:23:58 0:16:09 0:07:49 smithi master rhel 8.0 rados/thrash/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-pg-log-overrides/normal_pg_log.yaml 2-recovery-overrides/{more-active-recovery.yaml} backoff/peering.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/off.yaml msgr-failures/osd-delay.yaml msgr/async.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} thrashers/mapgap.yaml thrashosds-health.yaml workloads/cache-agent-small.yaml} 2
Failure Reason:

SELinux denials found on ubuntu@smithi167.front.sepia.ceph.com: ['type=AVC msg=audit(1576541352.163:8219): avc: denied { setattr } for pid=32874 comm="rhsmcertd-worke" name="de4244cf84e6507539211d9a8fd50f353114f46361f6f5ac6a537392f8d8282d-primary.xml.gz" dev="sda1" ino=262180 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576541351.948:8204): avc: denied { read write } for pid=32874 comm="rhsmcertd-worke" name=".dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576541351.948:8205): avc: denied { lock } for pid=32874 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576541352.036:8212): avc: denied { create } for pid=32874 comm="rhsmcertd-worke" name="metadata_lock.pid" scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576541352.036:8212): avc: denied { write } for pid=32874 comm="rhsmcertd-worke" name="dnf" dev="sda1" ino=60792 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1', 'type=AVC msg=audit(1576541352.163:8218): avc: denied { open } for pid=32874 comm="rhsmcertd-worke" path="/var/cache/dnf/ceph-8a6e408a4cef42be/repodata/repomd.xml" dev="sda1" ino=262154 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576541352.036:8212): avc: denied { add_name } for pid=32874 comm="rhsmcertd-worke" name="metadata_lock.pid" scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1', 'type=AVC msg=audit(1576541352.980:8250): avc: denied { lock } for pid=33134 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576541352.313:8228): avc: denied { unlink } for pid=32874 comm="rhsmcertd-worke" name="metadata_lock.pid" dev="sda1" ino=59779 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576541352.395:8233): avc: denied { open } for pid=32874 comm="rhsmcertd-worke" path="/etc/dnf/modules.d/satellite-5-client.module" dev="sda1" ino=57237 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:root_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576541351.948:8207): avc: denied { map } for pid=32874 comm="rhsmcertd-worke" path="/var/lib/rpm/__db.001" dev="sda1" ino=262271 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576541352.036:8212): avc: denied { open } for pid=32874 comm="rhsmcertd-worke" path="/var/cache/dnf/metadata_lock.pid" dev="sda1" ino=59779 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576541351.948:8206): avc: denied { getattr } for pid=32874 comm="rhsmcertd-worke" path="/var/lib/rpm/__db.001" dev="sda1" ino=262271 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576541352.036:8211): avc: denied { open } for pid=32874 comm="rhsmcertd-worke" path="/var/log/hawkey.log" dev="sda1" ino=60817 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:var_log_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576541352.980:8251): avc: denied { map } for pid=33134 comm="setroubleshootd" path="/var/lib/rpm/Name" dev="sda1" ino=262251 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576541352.313:8228): avc: denied { remove_name } for pid=32874 comm="rhsmcertd-worke" name="metadata_lock.pid" dev="sda1" ino=59779 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1', 'type=AVC msg=audit(1576541352.980:8249): avc: denied { read } for pid=33134 comm="setroubleshootd" name="Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576541351.948:8204): avc: denied { open } for pid=32874 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576541352.395:8233): avc: denied { read } for pid=32874 comm="rhsmcertd-worke" name="satellite-5-client.module" dev="sda1" ino=57237 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:root_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1576541352.980:8249): avc: denied { open } for pid=33134 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1']

pass 4609394 2019-12-16 22:11:15 2019-12-16 23:54:25 2019-12-17 00:26:24 0:31:59 0:25:35 0:06:24 smithi master ubuntu 18.04 rados/mgr/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/bluestore-bitmap.yaml supported-random-distro$/{ubuntu_latest.yaml} tasks/module_selftest.yaml} 2
pass 4609395 2019-12-16 22:11:17 2019-12-16 23:54:31 2019-12-17 00:10:30 0:15:59 0:08:49 0:07:10 smithi master ubuntu 18.04 rados/singleton/{all/dump-stuck.yaml msgr-failures/many.yaml msgr/async-v2only.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
pass 4609396 2019-12-16 22:11:18 2019-12-16 23:55:01 2019-12-17 00:13:03 0:18:02 0:12:25 0:05:37 smithi master ubuntu 18.04 rados/objectstore/{backends/ceph_objectstore_tool.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
pass 4609397 2019-12-16 22:11:19 2019-12-16 23:55:01 2019-12-17 00:09:03 0:14:02 0:08:22 0:05:40 smithi master ubuntu 18.04 rados/singleton-nomsgr/{all/cache-fs-trunc.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
pass 4609398 2019-12-16 22:11:20 2019-12-16 23:55:19 2019-12-17 00:27:20 0:32:01 0:23:41 0:08:20 smithi master ubuntu 18.04 rados/thrash-erasure-code/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/fast.yaml msgr-failures/osd-delay.yaml objectstore/filestore-xfs.yaml rados.yaml recovery-overrides/{more-async-partial-recovery.yaml} supported-random-distro$/{ubuntu_latest.yaml} thrashers/default.yaml thrashosds-health.yaml workloads/ec-rados-plugin=jerasure-k=2-m=1.yaml} 2
pass 4609399 2019-12-16 22:11:21 2019-12-16 23:55:32 2019-12-17 00:25:31 0:29:59 0:21:44 0:08:15 smithi master ubuntu 18.04 rados/thrash/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-pg-log-overrides/short_pg_log.yaml 2-recovery-overrides/{more-async-partial-recovery.yaml} backoff/peering_and_degraded.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/upmap.yaml msgr-failures/fastclose.yaml msgr/async-v1only.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml} thrashers/morepggrow.yaml thrashosds-health.yaml workloads/cache-pool-snaps-readproxy.yaml} 2
pass 4609400 2019-12-16 22:11:22 2019-12-16 23:56:46 2019-12-17 00:26:45 0:29:59 0:23:48 0:06:11 smithi master ubuntu 18.04 rados/singleton/{all/ec-lost-unfound.yaml msgr-failures/few.yaml msgr/async.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
pass 4609401 2019-12-16 22:11:23 2019-12-16 23:57:20 2019-12-17 00:19:17 0:21:57 0:11:36 0:10:21 smithi master centos 8.0 rados/thrash-erasure-code-big/{ceph.yaml cluster/{12-osds.yaml openstack.yaml} msgr-failures/few.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml recovery-overrides/{more-active-recovery.yaml} supported-random-distro$/{centos_8.yaml} thrashers/default.yaml thrashosds-health.yaml workloads/ec-rados-plugin=lrc-k=4-m=2-l=3.yaml} 3
fail 4609402 2019-12-16 22:11:24 2019-12-16 23:57:20 2019-12-17 00:17:17 0:19:57 0:10:36 0:09:21 smithi master centos rados/verify/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-thrash/none.yaml msgr-failures/few.yaml msgr/async-v1only.yaml objectstore/bluestore-bitmap.yaml rados.yaml tasks/mon_recovery.yaml validater/valgrind.yaml} 2
Failure Reason:

Command failed on smithi022 with status 1: 'sudo yum -y install ceph'

dead 4609403 2019-12-16 22:11:25 2019-12-16 23:58:54 2019-12-17 12:01:19 12:02:25 smithi master rhel 8.0 rados/thrash-erasure-code-isa/{arch/x86_64.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/few.yaml objectstore/filestore-xfs.yaml rados.yaml recovery-overrides/{default.yaml} supported-random-distro$/{rhel_8.yaml} thrashers/default.yaml thrashosds-health.yaml workloads/ec-rados-plugin=isa-k=2-m=1.yaml} 2
pass 4609404 2019-12-16 22:11:26 2019-12-16 23:58:54 2019-12-17 00:28:51 0:29:57 0:22:33 0:07:24 smithi master rhel 8.0 rados/thrash-erasure-code-overwrites/{bluestore-bitmap.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/fast.yaml msgr-failures/fastclose.yaml rados.yaml recovery-overrides/{more-active-recovery.yaml} supported-random-distro$/{rhel_8.yaml} thrashers/minsize_recovery.yaml thrashosds-health.yaml workloads/ec-small-objects-fast-read-overwrites.yaml} 2
pass 4609405 2019-12-16 22:11:27 2019-12-16 23:58:53 2019-12-17 00:20:52 0:21:59 0:13:50 0:08:09 smithi master rhel 8.0 rados/thrash-erasure-code-shec/{ceph.yaml clusters/{fixed-4.yaml openstack.yaml} msgr-failures/osd-delay.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml recovery-overrides/{more-active-recovery.yaml} supported-random-distro$/{rhel_8.yaml} thrashers/default.yaml thrashosds-health.yaml workloads/ec-rados-plugin=shec-k=4-m=3-c=2.yaml} 4
pass 4609406 2019-12-16 22:11:28 2019-12-16 23:58:54 2019-12-17 00:16:51 0:17:57 0:09:33 0:08:24 smithi master centos 8.0 rados/monthrash/{ceph.yaml clusters/3-mons.yaml msgr-failures/mon-delay.yaml msgr/async.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{centos_8.yaml} thrashers/many.yaml workloads/rados_5925.yaml} 2
pass 4609407 2019-12-16 22:11:30 2019-12-16 23:59:17 2019-12-17 00:17:16 0:17:59 0:08:19 0:09:40 smithi master centos 8.0 rados/multimon/{clusters/3.yaml msgr-failures/many.yaml msgr/async-v1only.yaml no_pools.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{centos_8.yaml} tasks/mon_clock_no_skews.yaml} 2
fail 4609408 2019-12-16 22:11:31 2019-12-16 23:59:34 2019-12-17 00:17:32 0:17:58 0:07:53 0:10:05 smithi master centos 8.0 rados/perf/{ceph.yaml objectstore/bluestore-comp.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{centos_8.yaml} workloads/cosbench_64K_write.yaml} 1
Failure Reason:

Command failed on smithi184 with status 1: 'sudo yum -y install python3-yaml python3-lxml librbd-devel pdsh collectl'

pass 4609409 2019-12-16 22:11:32 2019-12-16 23:59:51 2019-12-17 00:39:50 0:39:59 0:31:57 0:08:02 smithi master ubuntu 18.04 rados/standalone/{supported-random-distro$/{ubuntu_latest.yaml} workloads/erasure-code.yaml} 1
pass 4609410 2019-12-16 22:11:33 2019-12-17 00:00:40 2019-12-17 00:24:39 0:23:59 0:16:47 0:07:12 smithi master rhel 8.0 rados/basic/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/many.yaml msgr/async-v2only.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} tasks/rados_python.yaml} 2
pass 4609411 2019-12-16 22:11:34 2019-12-17 00:01:19 2019-12-17 00:15:17 0:13:58 0:07:31 0:06:27 smithi master ubuntu 18.04 rados/singleton/{all/erasure-code-nonregression.yaml msgr-failures/many.yaml msgr/async-v1only.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
fail 4609412 2019-12-16 22:11:35 2019-12-17 00:02:12 2019-12-17 00:56:14 0:54:02 0:46:27 0:07:35 smithi master ubuntu 18.04 rados/dashboard/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/bluestore-bitmap.yaml supported-random-distro$/{ubuntu_latest.yaml} tasks/dashboard.yaml} 2
Failure Reason:

Test failure: test_create_with_default_expiration_date (tasks.mgr.dashboard.test_user.UserTest)

pass 4609413 2019-12-16 22:11:36 2019-12-17 00:02:12 2019-12-17 00:18:14 0:16:02 0:08:39 0:07:23 smithi master ubuntu 18.04 rados/singleton-nomsgr/{all/ceph-kvstore-tool.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
fail 4609414 2019-12-16 22:11:37 2019-12-17 00:02:12 2019-12-17 00:18:14 0:16:02 0:04:26 0:11:36 smithi master centos 8.0 rados/thrash-old-clients/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-install/jewel.yaml backoff/normal.yaml ceph.yaml clusters/{openstack.yaml three-plus-one.yaml} d-balancer/crush-compat.yaml distro$/{centos_latest.yaml} msgr-failures/fastclose.yaml rados.yaml thrashers/mapgap.yaml thrashosds-health.yaml workloads/rbd_cls.yaml} 4
Failure Reason:

Failed to fetch package version from https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=default&distros=centos%2F8%2Fx86_64&ref=jewel

pass 4609415 2019-12-16 22:11:38 2019-12-17 00:02:16 2019-12-17 00:28:15 0:25:59 0:18:04 0:07:55 smithi master ubuntu 18.04 rados/thrash/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-pg-log-overrides/normal_pg_log.yaml 2-recovery-overrides/{more-async-partial-recovery.yaml} backoff/normal.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/crush-compat.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-stupid.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml} thrashers/none.yaml thrashosds-health.yaml workloads/cache-pool-snaps.yaml} 2
fail 4609416 2019-12-16 22:11:39 2019-12-17 00:03:39 2019-12-17 00:21:38 0:17:59 0:09:47 0:08:12 smithi master ubuntu 18.04 rados/mgr/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/bluestore-comp.yaml supported-random-distro$/{ubuntu_latest.yaml} tasks/orchestrator_cli.yaml} 2
Failure Reason:

Test failure: test_mgr_update (tasks.mgr.test_orchestrator_cli.TestOrchestratorCli)

pass 4609417 2019-12-16 22:11:40 2019-12-17 00:03:39 2019-12-17 00:37:38 0:33:59 0:27:35 0:06:24 smithi master rhel 8.0 rados/singleton/{all/lost-unfound-delete.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-stupid.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 1
fail 4609418 2019-12-16 22:11:42 2019-12-17 00:03:50 2019-12-17 00:19:49 0:15:59 0:09:31 0:06:28 smithi master rhel 8.0 rados/perf/{ceph.yaml objectstore/bluestore-low-osd-mem-target.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{rhel_8.yaml} workloads/fio_4K_rand_read.yaml} 1
Failure Reason:

Command failed on smithi201 with status 1: 'sudo yum -y install python3-yaml python3-lxml librbd-devel pdsh collectl'

pass 4609419 2019-12-16 22:11:43 2019-12-17 00:04:27 2019-12-17 00:36:29 0:32:02 0:22:49 0:09:13 smithi master centos 8.0 rados/singleton/{all/lost-unfound.yaml msgr-failures/many.yaml msgr/async.yaml objectstore/filestore-xfs.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 1
pass 4609420 2019-12-16 22:11:44 2019-12-17 00:04:28 2019-12-17 00:22:27 0:17:59 0:08:14 0:09:45 smithi master centos 8.0 rados/objectstore/{backends/filejournal.yaml supported-random-distro$/{centos_8.yaml}} 1
fail 4609421 2019-12-16 22:11:45 2019-12-17 00:05:05 2019-12-17 00:21:05 0:16:00 0:09:23 0:06:37 smithi master rhel 8.0 rados/singleton-nomsgr/{all/ceph-post-file.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 1
Failure Reason:

SELinux denials found on ubuntu@smithi089.front.sepia.ceph.com: ['type=AVC msg=audit(1576541432.635:403): avc: denied { lock } for pid=7443 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=61046 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541432.635:402): avc: denied { open } for pid=7443 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=61046 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541432.635:402): avc: denied { read } for pid=7443 comm="setroubleshootd" name="Packages" dev="sda1" ino=61046 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541432.635:404): avc: denied { map } for pid=7443 comm="setroubleshootd" path="/var/lib/rpm/Name" dev="sda1" ino=61070 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1']

fail 4609422 2019-12-16 22:11:46 2019-12-17 00:05:08 2019-12-17 00:35:08 0:30:00 0:22:46 0:07:14 smithi master rhel 8.0 rados/thrash/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-pg-log-overrides/short_pg_log.yaml 2-recovery-overrides/{more-partial-recovery.yaml} backoff/peering.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/off.yaml msgr-failures/osd-delay.yaml msgr/async.yaml objectstore/filestore-xfs.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} thrashers/pggrow.yaml thrashosds-health.yaml workloads/cache-snaps.yaml} 2
Failure Reason:

SELinux denials found on ubuntu@smithi096.front.sepia.ceph.com: ['type=AVC msg=audit(1576541592.669:3872): avc: denied { read } for pid=16184 comm="setroubleshootd" name="Packages" dev="sda1" ino=61046 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541592.152:3833): avc: denied { write } for pid=16024 comm="rhsmcertd-worke" name="dnf" dev="sda1" ino=60792 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1',
'type=AVC msg=audit(1576541592.419:3852): avc: denied { unlink } for pid=16024 comm="rhsmcertd-worke" name="metadata_lock.pid" dev="sda1" ino=58126 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541592.669:3874): avc: denied { map } for pid=16184 comm="setroubleshootd" path="/var/lib/rpm/Name" dev="sda1" ino=61070 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541592.263:3842): avc: denied { open } for pid=16024 comm="rhsmcertd-worke" path="/var/cache/dnf/epel-fafd94c310c51e1e/metalink.xml" dev="sda1" ino=262189 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541592.669:3872): avc: denied { open } for pid=16184 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=61046 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541592.148:3831): avc: denied { map } for pid=16024 comm="rhsmcertd-worke" path="/var/lib/rpm/Name" dev="sda1" ino=61070 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541592.669:3873): avc: denied { lock } for pid=16184 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=61046 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541592.152:3832): avc: denied { open } for pid=16024 comm="rhsmcertd-worke" path="/var/log/hawkey.log" dev="sda1" ino=60817 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:var_log_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541592.152:3833): avc: denied { create } for pid=16024 comm="rhsmcertd-worke" name="metadata_lock.pid" scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541592.152:3833): avc: denied { open } for pid=16024 comm="rhsmcertd-worke" path="/var/cache/dnf/metadata_lock.pid" dev="sda1" ino=58126 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541592.517:3859): avc: denied { open } for pid=16024 comm="rhsmcertd-worke" path="/etc/dnf/modules.d/satellite-5-client.module" dev="sda1" ino=57237 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:root_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541592.288:3843): avc: denied { setattr } for pid=16024 comm="rhsmcertd-worke" name="6e2fe611f78ac434c2918bac1eec468dbd24c9b4cdb65bf6a744d10f764f3284-primary.xml.gz" dev="sda1" ino=262159 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541592.152:3833): avc: denied { add_name } for pid=16024 comm="rhsmcertd-worke" name="metadata_lock.pid" scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1',
'type=AVC msg=audit(1576541592.084:3828): avc: denied { read write } for pid=16024 comm="rhsmcertd-worke" name=".dbenv.lock" dev="sda1" ino=61154 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541592.517:3859): avc: denied { read } for pid=16024 comm="rhsmcertd-worke" name="satellite-5-client.module" dev="sda1" ino=57237 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:root_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541592.419:3852): avc: denied { remove_name } for pid=16024 comm="rhsmcertd-worke" name="metadata_lock.pid" dev="sda1" ino=58126 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1',
'type=AVC msg=audit(1576541592.084:3830): avc: denied { getattr } for pid=16024 comm="rhsmcertd-worke" path="/var/lib/rpm/Packages" dev="sda1" ino=61046 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541592.084:3829): avc: denied { lock } for pid=16024 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=61154 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576541592.084:3828): avc: denied { open } for pid=16024 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=61154 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1']

pass 4609423 2019-12-16 22:11:47 2019-12-17 00:05:18 2019-12-17 00:27:20 0:22:02 0:16:22 0:05:40 smithi master rhel 8.0 rados/basic/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/few.yaml msgr/async.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} tasks/rados_stress_watch.yaml} 2
fail 4609424 2019-12-16 22:11:48 2019-12-17 00:05:38 2019-12-17 00:21:37 0:15:59 0:07:43 0:08:16 smithi master centos 8.0 rados/perf/{ceph.yaml objectstore/bluestore-stupid.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{centos_8.yaml} workloads/fio_4K_rand_rw.yaml} 1
Failure Reason:

Command failed on smithi192 with status 1: 'sudo yum -y install python3-yaml python3-lxml librbd-devel pdsh collectl'

pass 4609425 2019-12-16 22:11:49 2019-12-17 00:06:01 2019-12-17 00:21:59 0:15:58 0:08:09 0:07:49 smithi master centos 8.0 rados/singleton/{all/max-pg-per-osd.from-mon.yaml msgr-failures/few.yaml msgr/async-v1only.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 1
pass 4609426 2019-12-16 22:11:51 2019-12-17 00:06:29 2019-12-17 00:38:27 0:31:58 0:24:43 0:07:15 smithi master ubuntu 18.04 rados/thrash-erasure-code/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/normal.yaml msgr-failures/fastclose.yaml objectstore/bluestore-avl.yaml rados.yaml recovery-overrides/{more-active-recovery.yaml} supported-random-distro$/{ubuntu_latest.yaml} thrashers/fastread.yaml thrashosds-health.yaml workloads/ec-rados-plugin=jerasure-k=3-m=1.yaml} 2
fail 4609427 2019-12-16 22:11:52 2019-12-17 00:06:29 2019-12-17 00:22:27 0:15:58 0:08:43 0:07:15 smithi master rhel 8.0 rados/cephadm/{fixed-2.yaml mode/root.yaml msgr/async.yaml start.yaml supported-random-distro$/{rhel_8.yaml} tasks/rados_python.yaml} 2
Failure Reason:

Command failed on smithi086 with status 1: 'sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid d9dd6646-2062-11ea-8271-001a4aab830c --force'

fail 4609428 2019-12-16 22:11:53 2019-12-17 00:06:29 2019-12-17 00:40:28 0:33:59 0:27:14 0:06:45 smithi master rhel 8.0 rados/singleton-bluestore/{all/cephtool.yaml msgr-failures/many.yaml msgr/async.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 1
Failure Reason:

Command failed (workunit test cephtool/test.sh) on smithi161 with status 127: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8a58d0694cce968d9a21937d52c2e48cfe306455 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh'

fail 4609429 2019-12-16 22:11:54 2019-12-17 00:06:33 2019-12-17 00:20:32 0:13:59 0:04:21 0:09:38 smithi master centos 8.0 rados/thrash-old-clients/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-install/luminous-v1only.yaml backoff/peering.yaml ceph.yaml clusters/{openstack.yaml three-plus-one.yaml} d-balancer/off.yaml distro$/{centos_latest.yaml} msgr-failures/few.yaml rados.yaml thrashers/morepggrow.yaml thrashosds-health.yaml workloads/snaps-few-objects.yaml} 4
Failure Reason:

Failed to fetch package version from https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=default&distros=centos%2F8%2Fx86_64&ref=luminous

pass 4609430 2019-12-16 22:11:55 2019-12-17 00:06:47 2019-12-17 00:26:45 0:19:58 0:11:43 0:08:15 smithi master centos 8.0 rados/thrash/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-pg-log-overrides/normal_pg_log.yaml 2-recovery-overrides/{default.yaml} backoff/peering_and_degraded.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/upmap.yaml msgr-failures/fastclose.yaml msgr/async-v1only.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{centos_8.yaml} thrashers/careful.yaml thrashosds-health.yaml workloads/cache.yaml} 2
pass 4609431 2019-12-16 22:11:56 2019-12-17 00:06:46 2019-12-17 00:24:45 0:17:59 0:11:20 0:06:39 smithi master ubuntu 18.04 rados/singleton/{all/max-pg-per-osd.from-primary.yaml msgr-failures/many.yaml msgr/async-v2only.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
pass 4609432 2019-12-16 22:11:57 2019-12-17 00:08:03 2019-12-17 00:40:00 0:31:57 0:23:40 0:08:17 smithi master ubuntu 18.04 rados/thrash-erasure-code-big/{ceph.yaml cluster/{12-osds.yaml openstack.yaml} msgr-failures/osd-delay.yaml objectstore/bluestore-stupid.yaml rados.yaml recovery-overrides/{more-partial-recovery.yaml} supported-random-distro$/{ubuntu_latest.yaml} thrashers/fastread.yaml thrashosds-health.yaml workloads/ec-rados-plugin=jerasure-k=4-m=2.yaml} 3
pass 4609433 2019-12-16 22:11:59 2019-12-17 00:08:29 2019-12-17 00:36:26 0:27:57 0:20:12 0:07:45 smithi master rados/verify/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-thrash/default/{default.yaml thrashosds-health.yaml} msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-comp.yaml rados.yaml tasks/rados_api_tests.yaml validater/lockdep.yaml} 2
dead 4609434 2019-12-16 22:12:00 2019-12-17 00:08:33 2019-12-17 12:10:55 12:02:22 smithi master rhel 8.0 rados/thrash-erasure-code-isa/{arch/x86_64.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/osd-delay.yaml objectstore/bluestore-avl.yaml rados.yaml recovery-overrides/{more-async-partial-recovery.yaml} supported-random-distro$/{rhel_8.yaml} thrashers/mapgap.yaml thrashosds-health.yaml workloads/ec-rados-plugin=isa-k=2-m=1.yaml} 2
pass 4609435 2019-12-16 22:12:01 2019-12-17 00:08:41 2019-12-17 00:34:36 0:25:55 0:15:22 0:10:33 smithi master centos 8.0 rados/mgr/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/bluestore-low-osd-mem-target.yaml supported-random-distro$/{centos_8.yaml} tasks/progress.yaml} 2
pass 4609436 2019-12-16 22:12:02 2019-12-17 00:09:31 2019-12-17 00:37:30 0:27:59 0:22:11 0:05:48 smithi master ubuntu 18.04 rados/thrash-erasure-code-overwrites/{bluestore-bitmap.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/normal.yaml msgr-failures/few.yaml rados.yaml recovery-overrides/{more-async-recovery.yaml} supported-random-distro$/{ubuntu_latest.yaml} thrashers/morepggrow.yaml thrashosds-health.yaml workloads/ec-small-objects-overwrites.yaml} 2
pass 4609437 2019-12-16 22:12:04 2019-12-17 00:09:31 2019-12-17 00:29:30 0:19:59 0:12:06 0:07:53 smithi master ubuntu 18.04 rados/thrash-erasure-code-shec/{ceph.yaml clusters/{fixed-4.yaml openstack.yaml} msgr-failures/fastclose.yaml objectstore/bluestore-stupid.yaml rados.yaml recovery-overrides/{more-async-recovery.yaml} supported-random-distro$/{ubuntu_latest.yaml} thrashers/careful.yaml thrashosds-health.yaml workloads/ec-rados-plugin=shec-k=4-m=3-c=2.yaml} 4
fail 4609438 2019-12-16 22:12:05 2019-12-17 00:09:58 2019-12-17 00:29:56 0:19:58 0:10:09 0:09:49 smithi master centos 8.0 rados/singleton-nomsgr/{all/cephadm.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 1
Failure Reason:

Command failed (workunit test cephadm/test_cephadm.sh) on smithi101 with status 22: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8a58d0694cce968d9a21937d52c2e48cfe306455 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/test_cephadm.sh'

pass 4609439 2019-12-16 22:12:06 2019-12-17 00:10:16 2019-12-17 00:44:15 0:33:59 0:26:56 0:07:03 smithi master ubuntu 18.04 rados/monthrash/{ceph.yaml clusters/9-mons.yaml msgr-failures/few.yaml msgr/async-v1only.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml} thrashers/one.yaml workloads/rados_api_tests.yaml} 2
pass 4609440 2019-12-16 22:12:07 2019-12-17 00:10:16 2019-12-17 00:26:15 0:15:59 0:09:46 0:06:13 smithi master rhel 8.0 rados/multimon/{clusters/6.yaml msgr-failures/few.yaml msgr/async-v2only.yaml no_pools.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} tasks/mon_clock_with_skews.yaml} 2
fail 4609441 2019-12-16 22:12:08 2019-12-17 00:10:17 2019-12-17 00:24:16 0:13:59 0:07:55 0:06:04 smithi master ubuntu 18.04 rados/standalone/{supported-random-distro$/{ubuntu_latest.yaml} workloads/mgr.yaml} 1
Failure Reason:

Command failed (workunit test mgr/balancer.sh) on smithi037 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8a58d0694cce968d9a21937d52c2e48cfe306455 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/mgr/balancer.sh'

fail 4609442 2019-12-16 22:12:10 2019-12-17 00:10:20 2019-12-17 00:24:19 0:13:59 0:07:31 0:06:28 smithi master ubuntu 18.04 rados/perf/{ceph.yaml objectstore/bluestore-basic-min-osd-mem-target.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{ubuntu_latest.yaml} workloads/fio_4M_rand_read.yaml} 1
Failure Reason:

Command failed on smithi094 with status 1: '/home/ubuntu/cephtest/cbt/cbt.py -a /home/ubuntu/cephtest/archive/cbt /home/ubuntu/cephtest/archive/cbt/cbt_config.yaml'

pass 4609443 2019-12-16 22:12:11 2019-12-17 00:10:23 2019-12-17 00:30:21 0:19:58 0:14:25 0:05:33 smithi master rhel 8.0 rados/singleton/{all/max-pg-per-osd.from-replica.yaml msgr-failures/few.yaml msgr/async.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 1
pass 4609444 2019-12-16 22:12:12 2019-12-17 00:10:54 2019-12-17 00:28:53 0:17:59 0:08:43 0:09:16 smithi master centos 8.0 rados/basic/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/many.yaml msgr/async-v1only.yaml objectstore/bluestore-stupid.yaml rados.yaml supported-random-distro$/{centos_8.yaml} tasks/rados_striper.yaml} 2
fail 4609445 2019-12-16 22:12:13 2019-12-17 00:11:29 2019-12-17 00:31:28 0:19:59 0:13:30 0:06:29 smithi master rhel 8.0 rados/thrash/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-pg-log-overrides/short_pg_log.yaml 2-recovery-overrides/{default.yaml} backoff/normal.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/crush-compat.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} thrashers/default.yaml thrashosds-health.yaml workloads/dedup_tier.yaml} 2
Failure Reason:

SELinux denials found on ubuntu@smithi085.front.sepia.ceph.com:
type=AVC msg=audit(1576542459.409:10468): avc: denied { create } for pid=43005 comm="rhsmcertd-worke" name="metadata_lock.pid" scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542459.409:10468): avc: denied { write } for pid=43005 comm="rhsmcertd-worke" name="dnf" dev="sda1" ino=60792 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1
type=AVC msg=audit(1576542459.768:10483): avc: denied { open } for pid=43005 comm="rhsmcertd-worke" path="/etc/dnf/modules.d/satellite-5-client.module" dev="sda1" ino=57237 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:root_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542459.321:10465): avc: denied { getattr } for pid=43005 comm="rhsmcertd-worke" path="/var/lib/rpm/__db.001" dev="sda1" ino=262271 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542459.524:10473): avc: denied { open } for pid=43005 comm="rhsmcertd-worke" path="/var/cache/dnf/ceph-8a6e408a4cef42be/repodata/repomd.xml" dev="sda1" ino=262158 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542459.978:10496): avc: denied { open } for pid=43322 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542460.232:10505): avc: denied { read } for pid=43395 comm="rpm" name="Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542460.232:10505): avc: denied { open } for pid=43395 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542459.321:10466): avc: denied { map } for pid=43005 comm="rhsmcertd-worke" path="/var/lib/rpm/__db.001" dev="sda1" ino=262271 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542459.321:10464): avc: denied { lock } for pid=43005 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542459.409:10467): avc: denied { open } for pid=43005 comm="rhsmcertd-worke" path="/var/log/hawkey.log" dev="sda1" ino=60817 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:var_log_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542459.409:10468): avc: denied { add_name } for pid=43005 comm="rhsmcertd-worke" name="metadata_lock.pid" scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1
type=AVC msg=audit(1576542459.978:10498): avc: denied { map } for pid=43322 comm="setroubleshootd" path="/var/lib/rpm/Name" dev="sda1" ino=262251 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542459.768:10483): avc: denied { read } for pid=43005 comm="rhsmcertd-worke" name="satellite-5-client.module" dev="sda1" ino=57237 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:root_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542459.524:10474): avc: denied { setattr } for pid=43005 comm="rhsmcertd-worke" name="de4244cf84e6507539211d9a8fd50f353114f46361f6f5ac6a537392f8d8282d-primary.xml.gz" dev="sda1" ino=262180 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542459.684:10482): avc: denied { remove_name } for pid=43005 comm="rhsmcertd-worke" name="metadata_lock.pid" dev="sda1" ino=59699 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1
type=AVC msg=audit(1576542459.321:10463): avc: denied { read write } for pid=43005 comm="rhsmcertd-worke" name=".dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542459.409:10468): avc: denied { open } for pid=43005 comm="rhsmcertd-worke" path="/var/cache/dnf/metadata_lock.pid" dev="sda1" ino=59699 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542460.232:10506): avc: denied { lock } for pid=43395 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542459.684:10482): avc: denied { unlink } for pid=43005 comm="rhsmcertd-worke" name="metadata_lock.pid" dev="sda1" ino=59699 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542459.978:10497): avc: denied { lock } for pid=43322 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542459.321:10463): avc: denied { open } for pid=43005 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542459.978:10496): avc: denied { read } for pid=43322 comm="setroubleshootd" name="Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542460.232:10507): avc: denied { map } for pid=43395 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=262251 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
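The rhsmcertd-worke/setroubleshootd denial dumps in this run repeat the same handful of (source context, target context, permission) combinations. A minimal triage sketch that collapses raw AVC records into unique tuples; the regex and the helper name summarize are assumptions based on standard auditd AVC formatting, not part of the teuthology output:

```python
import re

# Matches the fields we care about in a standard auditd AVC record, e.g.:
# type=AVC msg=audit(...): avc: denied { create } for pid=43005 comm="rhsmcertd-worke" ...
AVC_RE = re.compile(
    r'denied\s+\{\s*(?P<perms>[^}]+?)\s*\}.*?'
    r'comm="(?P<comm>[^"]+)".*?'
    r'scontext=(?P<scontext>\S+)\s+'
    r'tcontext=(?P<tcontext>\S+)\s+'
    r'tclass=(?P<tclass>\S+)'
)

def summarize(denials):
    """Reduce raw AVC lines to sorted unique
    (comm, source type, target type, tclass, permission) tuples."""
    seen = set()
    for line in denials:
        m = AVC_RE.search(line)
        if not m:
            continue
        src = m.group("scontext").split(":")[2]  # e.g. rhsmcertd_t
        tgt = m.group("tcontext").split(":")[2]  # e.g. rpm_var_cache_t
        # A record like "denied { read write }" carries several permissions.
        for perm in m.group("perms").split():
            seen.add((m.group("comm"), src, tgt, m.group("tclass"), perm))
    return sorted(seen)
```

Feeding the dump above through summarize() leaves roughly a dozen distinct tuples, which is what would go into an audit2allow-style policy review.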

pass 4609446 2019-12-16 22:12:14 2019-12-17 00:12:16 2019-12-17 00:30:15 0:17:59 0:12:55 0:05:04 smithi master rhel 8.0 rados/singleton/{all/mon-auth-caps.yaml msgr-failures/many.yaml msgr/async-v1only.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 1
pass 4609447 2019-12-16 22:12:16 2019-12-17 00:12:16 2019-12-17 01:06:16 0:54:00 0:46:22 0:07:38 smithi master ubuntu 18.04 rados/dashboard/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/bluestore-comp.yaml supported-random-distro$/{ubuntu_latest.yaml} tasks/dashboard.yaml} 2
pass 4609448 2019-12-16 22:12:17 2019-12-17 00:12:16 2019-12-17 02:36:17 2:24:01 2:17:32 0:06:29 smithi master centos 8.0 rados/objectstore/{backends/filestore-idempotent-aio-journal.yaml supported-random-distro$/{centos_8.yaml}} 1
pass 4609449 2019-12-16 22:12:18 2019-12-17 00:12:21 2019-12-17 00:28:20 0:15:59 0:07:28 0:08:31 smithi master ubuntu 18.04 rados/singleton-nomsgr/{all/export-after-evict.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
fail 4609450 2019-12-16 22:12:20 2019-12-17 00:12:32 2019-12-17 00:30:32 0:18:00 0:04:17 0:13:43 smithi master centos 8.0 rados/thrash-old-clients/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-install/luminous.yaml backoff/peering_and_degraded.yaml ceph.yaml clusters/{openstack.yaml three-plus-one.yaml} d-balancer/crush-compat.yaml distro$/{centos_latest.yaml} msgr-failures/osd-delay.yaml rados.yaml thrashers/none.yaml thrashosds-health.yaml workloads/test_rbd_api.yaml} 4
Failure Reason:

Failed to fetch package version from https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=default&distros=centos%2F8%2Fx86_64&ref=luminous
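This failure (and the identical mimic ones later in the run) means the shaman build service returned no "ready" builds matching centos/8 for the requested ref; luminous and mimic predate CentOS 8 support. A sketch of how such a query URL is assembled, to reproduce the lookup by hand; the helper name shaman_query and its defaults are illustrative, only the parameter names come from the URL in the failure reason:

```python
from urllib.parse import urlencode

SHAMAN_SEARCH = "https://shaman.ceph.com/api/search/"

def shaman_query(ref, distro="centos/8/x86_64", project="ceph", flavor="default"):
    """Build a shaman package-search URL; an empty JSON list in the response
    is what surfaces as 'Failed to fetch package version'."""
    params = {
        "status": "ready",    # only completed, published builds
        "project": project,
        "flavor": flavor,
        "distros": distro,    # '/' is percent-encoded to %2F in the query
        "ref": ref,           # branch name, e.g. luminous or mimic
    }
    return SHAMAN_SEARCH + "?" + urlencode(params)
```

Fetching shaman_query("luminous") and getting back [] confirms there is simply no centos/8 luminous build to install, so thrash-old-clients jobs on centos_latest cannot provision their old-client nodes.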

fail 4609451 2019-12-16 22:12:21 2019-12-17 00:13:04 2019-12-17 00:29:03 0:15:59 0:09:49 0:06:10 smithi master rhel 8.0 rados/perf/{ceph.yaml objectstore/bluestore-bitmap.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{rhel_8.yaml} workloads/fio_4M_rand_rw.yaml} 1
Failure Reason:

Command failed on smithi023 with status 1: 'sudo yum -y install python3-yaml python3-lxml librbd-devel pdsh collectl'

pass 4609452 2019-12-16 22:12:22 2019-12-17 00:14:00 2019-12-17 00:33:59 0:19:59 0:10:09 0:09:50 smithi master centos 8.0 rados/singleton/{all/mon-config-key-caps.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-stupid.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 1
pass 4609453 2019-12-16 22:12:23 2019-12-17 00:14:00 2019-12-17 00:52:00 0:38:00 0:29:48 0:08:12 smithi master ubuntu 18.04 rados/thrash/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-pg-log-overrides/normal_pg_log.yaml 2-recovery-overrides/{more-async-partial-recovery.yaml} backoff/peering.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/off.yaml msgr-failures/osd-delay.yaml msgr/async.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml} thrashers/mapgap.yaml thrashosds-health.yaml workloads/pool-snaps-few-objects.yaml} 2
pass 4609454 2019-12-16 22:12:24 2019-12-17 00:15:39 2019-12-17 00:35:37 0:19:58 0:12:41 0:07:17 smithi master rhel 8.0 rados/mgr/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/bluestore-stupid.yaml supported-random-distro$/{rhel_8.yaml} tasks/prometheus.yaml} 2
dead 4609455 2019-12-16 22:12:26 2019-12-17 00:15:39 2019-12-17 12:18:01 12:02:22 smithi master centos 8.0 rados/thrash-erasure-code/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/fast.yaml msgr-failures/few.yaml objectstore/bluestore-bitmap.yaml rados.yaml recovery-overrides/{more-partial-recovery.yaml} supported-random-distro$/{centos_8.yaml} thrashers/minsize_recovery.yaml thrashosds-health.yaml workloads/ec-radosbench.yaml} 2
pass 4609456 2019-12-16 22:12:27 2019-12-17 00:16:24 2019-12-17 00:54:23 0:37:59 0:32:01 0:05:58 smithi master rhel 8.0 rados/basic/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/filestore-xfs.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} tasks/rados_workunit_loadgen_big.yaml} 2
pass 4609457 2019-12-16 22:12:28 2019-12-17 00:17:03 2019-12-17 00:37:02 0:19:59 0:13:20 0:06:39 smithi master ubuntu 18.04 rados/singleton/{all/mon-config-keys.yaml msgr-failures/many.yaml msgr/async.yaml objectstore/filestore-xfs.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
pass 4609458 2019-12-16 22:12:29 2019-12-17 00:17:18 2019-12-17 00:33:17 0:15:59 0:08:17 0:07:42 smithi master centos 8.0 rados/singleton-nomsgr/{all/full-tiering.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 1
pass 4609459 2019-12-16 22:12:30 2019-12-17 00:17:19 2019-12-17 00:49:20 0:32:01 0:22:56 0:09:05 smithi master centos 8.0 rados/thrash/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-pg-log-overrides/short_pg_log.yaml 2-recovery-overrides/{more-partial-recovery.yaml} backoff/peering_and_degraded.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/upmap.yaml msgr-failures/fastclose.yaml msgr/async-v1only.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml supported-random-distro$/{centos_8.yaml} thrashers/morepggrow.yaml thrashosds-health.yaml workloads/rados_api_tests.yaml} 2
fail 4609460 2019-12-16 22:12:32 2019-12-17 00:17:34 2019-12-17 00:31:33 0:13:59 0:07:35 0:06:24 smithi master ubuntu 18.04 rados/perf/{ceph.yaml objectstore/bluestore-comp.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{ubuntu_latest.yaml} workloads/fio_4M_rand_write.yaml} 1
Failure Reason:

Command failed on smithi033 with status 1: '/home/ubuntu/cephtest/cbt/cbt.py -a /home/ubuntu/cephtest/archive/cbt /home/ubuntu/cephtest/archive/cbt/cbt_config.yaml'

pass 4609461 2019-12-16 22:12:33 2019-12-17 00:17:36 2019-12-17 00:35:35 0:17:59 0:09:46 0:08:13 smithi master centos 8.0 rados/singleton/{all/mon-config.yaml msgr-failures/few.yaml msgr/async-v1only.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 1
fail 4609462 2019-12-16 22:12:34 2019-12-17 00:17:38 2019-12-17 00:37:37 0:19:59 0:12:35 0:07:24 smithi master rhel 8.0 rados/thrash-erasure-code-big/{ceph.yaml cluster/{12-osds.yaml openstack.yaml} msgr-failures/fastclose.yaml objectstore/filestore-xfs.yaml rados.yaml recovery-overrides/{more-async-recovery.yaml} supported-random-distro$/{rhel_8.yaml} thrashers/mapgap.yaml thrashosds-health.yaml workloads/ec-rados-plugin=lrc-k=4-m=2-l=3.yaml} 3
Failure Reason:

SELinux denials found on ubuntu@smithi140.front.sepia.ceph.com:
type=AVC msg=audit(1576542792.848:9075): avc: denied { write } for pid=38523 comm="rhsmcertd-worke" name="dnf" dev="sda1" ino=60792 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1
type=AVC msg=audit(1576542793.191:9079): avc: denied { read } for pid=38523 comm="rhsmcertd-worke" name="satellite-5-client.module" dev="sda1" ino=57237 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:root_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542792.957:9076): avc: denied { open } for pid=38523 comm="rhsmcertd-worke" path="/var/cache/dnf/ceph-8a6e408a4cef42be/repodata/repomd.xml" dev="sda1" ino=262154 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542792.772:9070): avc: denied { read write } for pid=38523 comm="rhsmcertd-worke" name=".dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542792.848:9075): avc: denied { open } for pid=38523 comm="rhsmcertd-worke" path="/var/cache/dnf/metadata_lock.pid" dev="sda1" ino=59783 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542792.773:9071): avc: denied { lock } for pid=38523 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542793.191:9079): avc: denied { open } for pid=38523 comm="rhsmcertd-worke" path="/etc/dnf/modules.d/satellite-5-client.module" dev="sda1" ino=57237 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:root_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542792.848:9075): avc: denied { add_name } for pid=38523 comm="rhsmcertd-worke" name="metadata_lock.pid" scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1
type=AVC msg=audit(1576542793.109:9078): avc: denied { unlink } for pid=38523 comm="rhsmcertd-worke" name="metadata_lock.pid" dev="sda1" ino=59783 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542793.325:9082): avc: denied { map } for pid=38577 comm="setroubleshootd" path="/var/lib/rpm/Name" dev="sda1" ino=262251 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542792.773:9073): avc: denied { map } for pid=38523 comm="rhsmcertd-worke" path="/var/lib/rpm/__db.001" dev="sda1" ino=262271 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542792.848:9075): avc: denied { create } for pid=38523 comm="rhsmcertd-worke" name="metadata_lock.pid" scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542792.773:9072): avc: denied { getattr } for pid=38523 comm="rhsmcertd-worke" path="/var/lib/rpm/__db.001" dev="sda1" ino=262271 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542792.957:9077): avc: denied { setattr } for pid=38523 comm="rhsmcertd-worke" name="de4244cf84e6507539211d9a8fd50f353114f46361f6f5ac6a537392f8d8282d-primary.xml.gz" dev="sda1" ino=262180 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542792.848:9074): avc: denied { open } for pid=38523 comm="rhsmcertd-worke" path="/var/log/hawkey.log" dev="sda1" ino=60817 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:var_log_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542793.324:9080): avc: denied { read } for pid=38577 comm="setroubleshootd" name="Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542793.324:9081): avc: denied { lock } for pid=38577 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542793.109:9078): avc: denied { remove_name } for pid=38523 comm="rhsmcertd-worke" name="metadata_lock.pid" dev="sda1" ino=59783 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1
type=AVC msg=audit(1576542793.324:9080): avc: denied { open } for pid=38577 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576542792.772:9070): avc: denied { open } for pid=38523 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1

fail 4609463 2019-12-16 22:12:35 2019-12-17 00:18:39 2019-12-17 00:40:38 0:21:59 0:10:32 0:11:27 smithi master centos rados/verify/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-thrash/none.yaml msgr-failures/few.yaml msgr/async.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml tasks/rados_cls_all.yaml validater/valgrind.yaml} 2
Failure Reason:

Command failed on smithi116 with status 1: 'sudo yum -y install ceph'

pass 4609464 2019-12-16 22:12:36 2019-12-17 00:18:39 2019-12-17 00:48:39 0:30:00 0:23:33 0:06:27 smithi master ubuntu 18.04 rados/thrash-erasure-code-isa/{arch/x86_64.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/fastclose.yaml objectstore/bluestore-bitmap.yaml rados.yaml recovery-overrides/{more-async-recovery.yaml} supported-random-distro$/{ubuntu_latest.yaml} thrashers/morepggrow.yaml thrashosds-health.yaml workloads/ec-rados-plugin=isa-k=2-m=1.yaml} 2
dead 4609465 2019-12-16 22:12:37 2019-12-17 00:19:23 2019-12-17 12:21:50 12:02:27 smithi master rhel 8.0 rados/thrash-erasure-code-overwrites/{bluestore-bitmap.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/fast.yaml msgr-failures/osd-delay.yaml rados.yaml recovery-overrides/{more-partial-recovery.yaml} supported-random-distro$/{rhel_8.yaml} thrashers/pggrow.yaml thrashosds-health.yaml workloads/ec-snaps-few-objects-overwrites.yaml} 2
pass 4609466 2019-12-16 22:12:38 2019-12-17 00:20:11 2019-12-17 00:38:10 0:17:59 0:11:07 0:06:52 smithi master ubuntu 18.04 rados/thrash-erasure-code-shec/{ceph.yaml clusters/{fixed-4.yaml openstack.yaml} msgr-failures/few.yaml objectstore/filestore-xfs.yaml rados.yaml recovery-overrides/{default.yaml} supported-random-distro$/{ubuntu_latest.yaml} thrashers/default.yaml thrashosds-health.yaml workloads/ec-rados-plugin=shec-k=4-m=3-c=2.yaml} 4
pass 4609467 2019-12-16 22:12:39 2019-12-17 00:20:33 2019-12-17 01:08:33 0:48:00 0:39:45 0:08:15 smithi master centos 8.0 rados/monthrash/{ceph.yaml clusters/3-mons.yaml msgr-failures/mon-delay.yaml msgr/async-v2only.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml supported-random-distro$/{centos_8.yaml} thrashers/sync-many.yaml workloads/rados_mon_osdmap_prune.yaml} 2
fail 4609468 2019-12-16 22:12:40 2019-12-17 00:20:53 2019-12-17 00:34:52 0:13:59 0:04:12 0:09:47 smithi master centos 8.0 rados/thrash-old-clients/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-install/mimic-v1only.yaml backoff/normal.yaml ceph.yaml clusters/{openstack.yaml three-plus-one.yaml} d-balancer/off.yaml distro$/{centos_latest.yaml} msgr-failures/fastclose.yaml rados.yaml thrashers/pggrow.yaml thrashosds-health.yaml workloads/cache-snaps.yaml} 4
Failure Reason:

Failed to fetch package version from https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=default&distros=centos%2F8%2Fx86_64&ref=mimic

pass 4609469 2019-12-16 22:12:42 2019-12-17 00:21:08 2019-12-17 00:45:06 0:23:58 0:14:57 0:09:01 smithi master centos 8.0 rados/multimon/{clusters/9.yaml msgr-failures/many.yaml msgr/async.yaml no_pools.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml supported-random-distro$/{centos_8.yaml} tasks/mon_recovery.yaml} 3
fail 4609470 2019-12-16 22:12:43 2019-12-17 00:21:36 2019-12-17 00:53:34 0:31:58 0:25:07 0:06:51 smithi master rhel 8.0 rados/standalone/{supported-random-distro$/{rhel_8.yaml} workloads/misc.yaml} 1
Failure Reason:

SELinux denials found on ubuntu@smithi091.front.sepia.ceph.com:
type=AVC msg=audit(1576543417.471:5683): avc: denied { open } for pid=41566 comm="rhsmcertd-worke" path="/etc/dnf/modules.d/satellite-5-client.module" dev="sda1" ino=57237 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:root_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576543417.766:5686): avc: denied { map } for pid=41749 comm="setroubleshootd" path="/var/lib/rpm/Name" dev="sda1" ino=262245 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576543417.097:5679): avc: denied { open } for pid=41566 comm="rhsmcertd-worke" path="/var/cache/dnf/metadata_lock.pid" dev="sda1" ino=55621 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576543417.022:5674): avc: denied { read write } for pid=41566 comm="rhsmcertd-worke" name=".dbenv.lock" dev="sda1" ino=262264 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576543417.766:5684): avc: denied { read } for pid=41749 comm="setroubleshootd" name="Packages" dev="sda1" ino=262244 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576543417.097:5679): avc: denied { add_name } for pid=41566 comm="rhsmcertd-worke" name="metadata_lock.pid" scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1
type=AVC msg=audit(1576543417.385:5682): avc: denied { unlink } for pid=41566 comm="rhsmcertd-worke" name="metadata_lock.pid" dev="sda1" ino=55621 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576543417.471:5683): avc: denied { read } for pid=41566 comm="rhsmcertd-worke" name="satellite-5-client.module" dev="sda1" ino=57237 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:root_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576543417.022:5677): avc: denied { map } for pid=41566 comm="rhsmcertd-worke" path="/var/lib/rpm/__db.001" dev="sda1" ino=262265 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576543417.022:5675): avc: denied { lock } for pid=41566 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=262264 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576543417.220:5681): avc: denied { setattr } for pid=41566 comm="rhsmcertd-worke" name="de4244cf84e6507539211d9a8fd50f353114f46361f6f5ac6a537392f8d8282d-primary.xml.gz" dev="sda1" ino=262174 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576543417.097:5679): avc: denied { create } for pid=41566 comm="rhsmcertd-worke" name="metadata_lock.pid" scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576543417.022:5674): avc: denied { open } for pid=41566 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=262264 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576543417.097:5678): avc: denied { open } for pid=41566 comm="rhsmcertd-worke" path="/var/log/hawkey.log" dev="sda1" ino=60817 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:var_log_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576543417.220:5680): avc: denied { open } for pid=41566 comm="rhsmcertd-worke" path="/var/cache/dnf/ceph-8a6e408a4cef42be/repodata/repomd.xml" dev="sda1" ino=262169 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576543417.766:5684): avc: denied { open } for pid=41749 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=262244 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576543417.385:5682): avc: denied { remove_name } for pid=41566 comm="rhsmcertd-worke" name="metadata_lock.pid" dev="sda1" ino=55621 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1
type=AVC msg=audit(1576543417.022:5676): avc: denied { getattr } for pid=41566 comm="rhsmcertd-worke" path="/var/lib/rpm/__db.001" dev="sda1" ino=262265 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1
type=AVC msg=audit(1576543417.097:5679): avc: denied { write } for pid=41566 comm="rhsmcertd-worke" name="dnf" dev="sda1" ino=60792 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1
type=AVC msg=audit(1576543417.766:5685): avc: denied { lock } for pid=41749 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=262244 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1

pass 4609471 2019-12-16 22:12:44 2019-12-17 00:21:39 2019-12-17 00:43:38 0:21:59 0:16:11 0:05:48 smithi master rhel 8.0 rados/singleton/{all/osd-backfill.yaml msgr-failures/many.yaml msgr/async-v2only.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 1
pass 4609472 2019-12-16 22:12:45 2019-12-17 00:21:42 2019-12-17 02:43:43 2:22:01 2:17:12 0:04:49 smithi master rhel 8.0 rados/objectstore/{backends/filestore-idempotent.yaml supported-random-distro$/{rhel_8.yaml}} 1
pass 4609473 2019-12-16 22:12:46 2019-12-17 00:22:01 2019-12-17 00:42:00 0:19:59 0:11:24 0:08:35 smithi master rhel 8.0 rados/singleton-nomsgr/{all/health-warnings.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 1
pass 4609474 2019-12-16 22:12:47 2019-12-17 00:22:29 2019-12-17 01:28:29 1:06:00 0:57:02 0:08:58 smithi master centos 8.0 rados/thrash/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-pg-log-overrides/normal_pg_log.yaml 2-recovery-overrides/{more-active-recovery.yaml} backoff/normal.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/crush-compat.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-stupid.yaml rados.yaml supported-random-distro$/{centos_8.yaml} thrashers/none.yaml thrashosds-health.yaml workloads/radosbench.yaml} 2
pass 4609475 2019-12-16 22:12:48 2019-12-17 00:22:29 2019-12-17 00:50:28 0:27:59 0:20:02 0:07:57 smithi master ubuntu 18.04 rados/basic/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/many.yaml msgr/async.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml} tasks/rados_workunit_loadgen_mix.yaml} 2
fail 4609476 2019-12-16 22:12:49 2019-12-17 00:24:29 2019-12-17 00:40:28 0:15:59 0:09:05 0:06:54 smithi master rhel 8.0 rados/perf/{ceph.yaml objectstore/bluestore-low-osd-mem-target.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{rhel_8.yaml} workloads/radosbench_4K_rand_read.yaml} 1
Failure Reason:

Command failed on smithi075 with status 1: 'sudo yum -y install python3-yaml python3-lxml librbd-devel pdsh collectl'

pass 4609477 2019-12-16 22:12:50 2019-12-17 00:24:29 2019-12-17 00:42:29 0:18:00 0:10:21 0:07:39 smithi master rhel 8.0 rados/mgr/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/filestore-xfs.yaml supported-random-distro$/{rhel_8.yaml} tasks/workunits.yaml} 2
pass 4609478 2019-12-16 22:12:51 2019-12-17 00:24:29 2019-12-17 00:46:29 0:22:00 0:14:56 0:07:04 smithi master centos 8.0 rados/singleton/{all/osd-recovery-incomplete.yaml msgr-failures/few.yaml msgr/async.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 1
pass 4609479 2019-12-16 22:12:52 2019-12-17 00:24:40 2019-12-17 00:48:39 0:23:59 0:15:50 0:08:09 smithi master rhel 8.0 rados/thrash/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-pg-log-overrides/short_pg_log.yaml 2-recovery-overrides/{more-active-recovery.yaml} backoff/peering.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/off.yaml msgr-failures/osd-delay.yaml msgr/async.yaml objectstore/filestore-xfs.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} thrashers/pggrow.yaml thrashosds-health.yaml workloads/redirect.yaml} 2
pass 4609480 2019-12-16 22:12:53 2019-12-17 00:24:46 2019-12-17 00:56:45 0:31:59 0:25:40 0:06:19 smithi master ubuntu 18.04 rados/singleton/{all/osd-recovery.yaml msgr-failures/many.yaml msgr/async-v1only.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
pass 4609481 2019-12-16 22:12:54 2019-12-17 00:25:52 2019-12-17 00:55:51 0:29:59 0:21:30 0:08:29 smithi master centos 8.0 rados/thrash-erasure-code/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/normal.yaml msgr-failures/osd-delay.yaml objectstore/bluestore-comp.yaml rados.yaml recovery-overrides/{more-async-recovery.yaml} supported-random-distro$/{centos_8.yaml} thrashers/morepggrow.yaml thrashosds-health.yaml workloads/ec-small-objects-fast-read.yaml} 2
fail 4609482 2019-12-16 22:12:55 2019-12-17 00:26:16 2019-12-17 00:42:15 0:15:59 0:06:53 0:09:06 smithi master centos 8.0 rados/cephadm/{fixed-2.yaml mode/root.yaml msgr/async-v1only.yaml start.yaml supported-random-distro$/{centos_8.yaml} tasks/rados_api_tests.yaml} 2
Failure Reason:

Command failed on smithi168 with status 1: 'sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid bc57179a-2065-11ea-8271-001a4aab830c --force'

fail 4609483 2019-12-16 22:12:56 2019-12-17 00:26:26 2019-12-17 01:12:25 0:45:59 0:36:58 0:09:01 smithi master centos 8.0 rados/dashboard/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/bluestore-low-osd-mem-target.yaml supported-random-distro$/{centos_8.yaml} tasks/dashboard.yaml} 2
Failure Reason:

Test failure: test_all (tasks.mgr.dashboard.test_rgw.RgwBucketTest)

fail 4609484 2019-12-16 22:12:57 2019-12-17 00:26:46 2019-12-17 01:12:46 0:46:00 0:39:44 0:06:16 smithi master rhel 8.0 rados/singleton-bluestore/{all/cephtool.yaml msgr-failures/many.yaml msgr/async-v1only.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 1
Failure Reason:

Command failed (workunit test cephtool/test.sh) on smithi039 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8a58d0694cce968d9a21937d52c2e48cfe306455 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh'

fail 4609485 2019-12-16 22:12:58 2019-12-17 00:26:46 2019-12-17 00:42:45 0:15:59 0:09:56 0:06:03 smithi master rhel 8.0 rados/singleton-nomsgr/{all/large-omap-object-warnings.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 1
Failure Reason:

Command failed (workunit test rados/test_large_omap_detection.py) on smithi138 with status 127: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8a58d0694cce968d9a21937d52c2e48cfe306455 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rados/test_large_omap_detection.py'

fail 4609486 2019-12-16 22:12:59 2019-12-17 00:27:42 2019-12-17 01:43:43 1:16:01 1:08:26 0:07:35 smithi master ubuntu 18.04 rados/upgrade/nautilus-x-singleton/{0-cluster/{openstack.yaml start.yaml} 1-install/nautilus.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 5-workload/{radosbench.yaml rbd_api.yaml} 6-finish-upgrade.yaml 7-octopus.yaml 8-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} bluestore-bitmap.yaml supported-random-distro$/{centos_8.yaml} thrashosds-health.yaml ubuntu_latest.yaml} 4
Failure Reason:

Command failed (workunit test rbd/test_librbd_python.sh) on smithi113 with status 127: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8a58d0694cce968d9a21937d52c2e48cfe306455 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rbd/test_librbd_python.sh'

fail 4609487 2019-12-16 22:13:00 2019-12-17 00:27:39 2019-12-17 00:41:38 0:13:59 0:04:11 0:09:48 smithi master centos 8.0 rados/thrash-old-clients/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-install/mimic.yaml backoff/peering.yaml ceph.yaml clusters/{openstack.yaml three-plus-one.yaml} d-balancer/crush-compat.yaml distro$/{centos_latest.yaml} msgr-failures/few.yaml rados.yaml thrashers/careful.yaml thrashosds-health.yaml workloads/radosbench.yaml} 4
Failure Reason:

Failed to fetch package version from https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=default&distros=centos%2F8%2Fx86_64&ref=mimic

fail 4609488 2019-12-16 22:13:01 2019-12-17 00:28:16 2019-12-17 00:44:15 0:15:59 0:07:53 0:08:06 smithi master centos 8.0 rados/perf/{ceph.yaml objectstore/bluestore-stupid.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{centos_8.yaml} workloads/radosbench_4K_seq_read.yaml} 1
Failure Reason:

Command failed on smithi101 with status 1: 'sudo yum -y install python3-yaml python3-lxml librbd-devel pdsh collectl'

pass 4609489 2019-12-16 22:13:02 2019-12-17 00:28:17 2019-12-17 00:42:15 0:13:58 0:07:40 0:06:18 smithi master ubuntu 18.04 rados/singleton/{all/peer.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-stupid.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
pass 4609490 2019-12-16 22:13:03 2019-12-17 00:28:22 2019-12-17 01:00:21 0:31:59 0:24:51 0:07:08 smithi master rhel 8.0 rados/basic/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/few.yaml msgr/async-v1only.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} tasks/rados_workunit_loadgen_mostlyread.yaml} 2
pass 4609491 2019-12-16 22:13:05 2019-12-17 00:29:10 2019-12-17 00:49:09 0:19:59 0:13:00 0:06:59 smithi master ubuntu 18.04 rados/thrash/{0-size-min-size-overrides/2-size-2-min-size.yaml 1-pg-log-overrides/normal_pg_log.yaml 2-recovery-overrides/{more-active-recovery.yaml} backoff/peering_and_degraded.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/upmap.yaml msgr-failures/fastclose.yaml msgr/async-v1only.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml} thrashers/careful.yaml thrashosds-health.yaml workloads/redirect_promote_tests.yaml} 2
pass 4609492 2019-12-16 22:13:06 2019-12-17 00:29:10 2019-12-17 00:51:09 0:21:59 0:15:16 0:06:43 smithi master rhel 8.0 rados/singleton/{all/pg-autoscaler.yaml msgr-failures/many.yaml msgr/async.yaml objectstore/filestore-xfs.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 2
pass 4609493 2019-12-16 22:13:07 2019-12-17 00:29:10 2019-12-17 01:01:09 0:31:59 0:25:33 0:06:26 smithi master ubuntu 18.04 rados/thrash-erasure-code-big/{ceph.yaml cluster/{12-osds.yaml openstack.yaml} msgr-failures/few.yaml objectstore/bluestore-avl.yaml rados.yaml recovery-overrides/{more-active-recovery.yaml} supported-random-distro$/{ubuntu_latest.yaml} thrashers/morepggrow.yaml thrashosds-health.yaml workloads/ec-rados-plugin=jerasure-k=4-m=2.yaml} 3
pass 4609494 2019-12-16 22:13:08 2019-12-17 00:29:31 2019-12-17 00:47:30 0:17:59 0:10:47 0:07:12 smithi master rados/verify/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-thrash/default/{default.yaml thrashosds-health.yaml} msgr-failures/few.yaml msgr/async-v1only.yaml objectstore/bluestore-stupid.yaml rados.yaml tasks/mon_recovery.yaml validater/lockdep.yaml} 2
dead 4609495 2019-12-16 22:13:09 2019-12-17 00:29:38 2019-12-17 12:32:01 12:02:23 smithi master centos 8.0 rados/thrash-erasure-code-isa/{arch/x86_64.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/few.yaml objectstore/bluestore-comp.yaml rados.yaml recovery-overrides/{more-active-recovery.yaml} supported-random-distro$/{centos_8.yaml} thrashers/none.yaml thrashosds-health.yaml workloads/ec-rados-plugin=isa-k=2-m=1.yaml} 2
pass 4609496 2019-12-16 22:13:10 2019-12-17 00:29:58 2019-12-17 00:49:57 0:19:59 0:08:33 0:11:26 smithi master centos 8.0 rados/mgr/{clusters/{2-node-mgr.yaml} debug/mgr.yaml objectstore/filestore-xfs.yaml supported-random-distro$/{centos_8.yaml} tasks/cephadm_orchestrator.yaml} 2
fail 4609497 2019-12-16 22:13:11 2019-12-17 00:30:35 2019-12-17 12:30:47 12:00:12 11:52:57 0:07:15 smithi master ubuntu 18.04 rados/thrash-erasure-code-overwrites/{bluestore-bitmap.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/normal.yaml msgr-failures/fastclose.yaml rados.yaml recovery-overrides/{more-async-partial-recovery.yaml} supported-random-distro$/{ubuntu_latest.yaml} thrashers/careful.yaml thrashosds-health.yaml workloads/ec-snaps-few-objects-overwrites.yaml} 2
Failure Reason:

Command failed on smithi085 with status 124: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json'
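Exit status 124 is what GNU `timeout` returns when it kills the wrapped command, so the `ceph osd dump` above ran for the full 120 s without responding. A quick demonstration:

```shell
# GNU timeout kills the command after the limit and exits with 124.
timeout 1 sleep 5
status=$?
echo "exit status: $status"   # exit status: 124
```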

pass 4609498 2019-12-16 22:13:12 2019-12-17 00:30:35 2019-12-17 00:52:34 0:21:59 0:11:40 0:10:19 smithi master ubuntu 18.04 rados/thrash-erasure-code-shec/{ceph.yaml clusters/{fixed-4.yaml openstack.yaml} msgr-failures/few.yaml objectstore/bluestore-avl.yaml rados.yaml recovery-overrides/{default.yaml} supported-random-distro$/{ubuntu_latest.yaml} thrashers/careful.yaml thrashosds-health.yaml workloads/ec-rados-plugin=shec-k=4-m=3-c=2.yaml} 4
pass 4609499 2019-12-16 22:13:13 2019-12-17 00:30:35 2019-12-17 00:46:34 0:15:59 0:09:51 0:06:08 smithi master rhel 8.0 rados/objectstore/{backends/fusestore.yaml supported-random-distro$/{rhel_8.yaml}} 1
pass 4609500 2019-12-16 22:13:14 2019-12-17 00:31:33 2019-12-17 00:47:32 0:15:59 0:08:55 0:07:04 smithi master centos 8.0 rados/singleton-nomsgr/{all/lazy_omap_stats_output.yaml rados.yaml supported-random-distro$/{centos_8.yaml}} 1
pass 4609501 2019-12-16 22:13:15 2019-12-17 00:31:54 2019-12-17 01:05:53 0:33:59 0:25:42 0:08:17 smithi master ubuntu 18.04 rados/monthrash/{ceph.yaml clusters/9-mons.yaml msgr-failures/few.yaml msgr/async.yaml objectstore/bluestore-stupid.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml} thrashers/sync.yaml workloads/rados_mon_workunits.yaml} 2
pass 4609502 2019-12-16 22:13:17 2019-12-17 00:33:26 2019-12-17 00:49:25 0:15:59 0:08:27 0:07:32 smithi master centos 8.0 rados/multimon/{clusters/21.yaml msgr-failures/few.yaml msgr/async-v1only.yaml no_pools.yaml objectstore/bluestore-stupid.yaml rados.yaml supported-random-distro$/{centos_8.yaml} tasks/mon_clock_no_skews.yaml} 3
fail 4609503 2019-12-16 22:13:18 2019-12-17 00:33:56 2019-12-17 00:49:55 0:15:59 0:08:09 0:07:50 smithi master centos 8.0 rados/perf/{ceph.yaml objectstore/bluestore-basic-min-osd-mem-target.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{centos_8.yaml} workloads/radosbench_4M_rand_read.yaml} 1
Failure Reason:

Command failed on smithi025 with status 1: 'sudo yum -y install python3-yaml python3-lxml librbd-devel pdsh collectl'

fail 4609504 2019-12-16 22:13:19 2019-12-17 00:34:01 2019-12-17 01:16:00 0:41:59 0:34:28 0:07:31 smithi master ubuntu 18.04 rados/standalone/{supported-random-distro$/{ubuntu_latest.yaml} workloads/mon.yaml} 1
Failure Reason:

Command failed (workunit test mon/osd-pool-create.sh) on smithi172 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=8a58d0694cce968d9a21937d52c2e48cfe306455 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/mon/osd-pool-create.sh'

pass 4609505 2019-12-16 22:13:20 2019-12-17 00:34:57 2019-12-17 00:50:56 0:15:59 0:09:46 0:06:13 smithi master rhel 8.0 rados/singleton/{all/pg-removal-interruption.yaml msgr-failures/few.yaml msgr/async-v1only.yaml objectstore/bluestore-avl.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 1
pass 4609506 2019-12-16 22:13:21 2019-12-17 00:34:57 2019-12-17 00:54:56 0:19:59 0:14:23 0:05:36 smithi master rhel 8.0 rados/thrash/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-pg-log-overrides/short_pg_log.yaml 2-recovery-overrides/{more-active-recovery.yaml} backoff/normal.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/crush-compat.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} thrashers/default.yaml thrashosds-health.yaml workloads/redirect_set_object.yaml} 2
fail 4609507 2019-12-16 22:13:22 2019-12-17 00:35:09 2019-12-17 00:49:08 0:13:59 0:04:15 0:09:44 smithi master centos 8.0 rados/thrash-old-clients/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-install/nautilus-v1only.yaml backoff/peering_and_degraded.yaml ceph.yaml clusters/{openstack.yaml three-plus-one.yaml} d-balancer/off.yaml distro$/{centos_latest.yaml} msgr-failures/osd-delay.yaml rados.yaml thrashers/default.yaml thrashosds-health.yaml workloads/rbd_cls.yaml} 4
Failure Reason:

Failed to fetch package version from https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=default&distros=centos%2F8%2Fx86_64&ref=nautilus

pass 4609508 2019-12-16 22:13:23 2019-12-17 00:35:37 2019-12-17 00:55:36 0:19:59 0:13:13 0:06:46 smithi master ubuntu 18.04 rados/singleton/{all/radostool.yaml msgr-failures/many.yaml msgr/async-v2only.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 1
fail 4609509 2019-12-16 22:13:24 2019-12-17 00:35:38 2019-12-17 00:57:37 0:21:59 0:14:24 0:07:35 smithi master rhel 8.0 rados/basic/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/many.yaml msgr/async-v2only.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{rhel_8.yaml} tasks/readwrite.yaml} 2
Failure Reason:

SELinux denials found on ubuntu@smithi053.front.sepia.ceph.com: [
'type=AVC msg=audit(1576544132.090:6872): avc: denied { open } for pid=31709 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576544131.487:6867): avc: denied { create } for pid=31650 comm="rhsmcertd-worke" name="metadata_lock.pid" scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576544131.405:6863): avc: denied { lock } for pid=31650 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576544131.943:6870): avc: denied { remove_name } for pid=31650 comm="rhsmcertd-worke" name="metadata_lock.pid" dev="sda1" ino=56723 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1',
'type=AVC msg=audit(1576544131.405:6865): avc: denied { map } for pid=31650 comm="rhsmcertd-worke" path="/var/lib/rpm/__db.001" dev="sda1" ino=262271 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576544131.620:6869): avc: denied { setattr } for pid=31650 comm="rhsmcertd-worke" name="6e2fe611f78ac434c2918bac1eec468dbd24c9b4cdb65bf6a744d10f764f3284-primary.xml.gz" dev="sda1" ino=264485 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576544132.028:6871): avc: denied { open } for pid=31650 comm="rhsmcertd-worke" path="/etc/dnf/modules.d/satellite-5-client.module" dev="sda1" ino=57237 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:root_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576544131.487:6867): avc: denied { write } for pid=31650 comm="rhsmcertd-worke" name="dnf" dev="sda1" ino=60792 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1',
'type=AVC msg=audit(1576544131.487:6867): avc: denied { open } for pid=31650 comm="rhsmcertd-worke" path="/var/cache/dnf/metadata_lock.pid" dev="sda1" ino=56723 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576544132.090:6872): avc: denied { read } for pid=31709 comm="setroubleshootd" name="Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576544131.593:6868): avc: denied { open } for pid=31650 comm="rhsmcertd-worke" path="/var/cache/dnf/ceph-8a6e408a4cef42be/repodata/repomd.xml" dev="sda1" ino=262154 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576544131.487:6866): avc: denied { open } for pid=31650 comm="rhsmcertd-worke" path="/var/log/hawkey.log" dev="sda1" ino=60817 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:var_log_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576544131.943:6870): avc: denied { unlink } for pid=31650 comm="rhsmcertd-worke" name="metadata_lock.pid" dev="sda1" ino=56723 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576544131.405:6862): avc: denied { open } for pid=31650 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576544132.091:6874): avc: denied { map } for pid=31709 comm="setroubleshootd" path="/var/lib/rpm/Name" dev="sda1" ino=262251 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576544132.090:6873): avc: denied { lock } for pid=31709 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576544131.405:6862): avc: denied { read write } for pid=31650 comm="rhsmcertd-worke" name=".dbenv.lock" dev="sda1" ino=262270 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576544131.487:6867): avc: denied { add_name } for pid=31650 comm="rhsmcertd-worke" name="metadata_lock.pid" scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:rpm_var_cache_t:s0 tclass=dir permissive=1',
'type=AVC msg=audit(1576544132.028:6871): avc: denied { read } for pid=31650 comm="rhsmcertd-worke" name="satellite-5-client.module" dev="sda1" ino=57237 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=system_u:object_r:root_t:s0 tclass=file permissive=1',
'type=AVC msg=audit(1576544131.405:6864): avc: denied { getattr } for pid=31650 comm="rhsmcertd-worke" path="/var/lib/rpm/__db.001" dev="sda1" ino=262271 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1']
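AVC records like the ones above have a fixed layout, so a short sed filter can reduce each denial to its requested permission, source context, and target class when triaging a long list. A sketch over one sample line from this log (the pattern is illustrative, not a general audit parser):

```shell
# One sample denial from the log above; sed keeps only the permission,
# the source (scontext) domain, and the target object class.
denial='type=AVC msg=audit(1576544132.090:6872): avc: denied { open } for pid=31709 comm="setroubleshootd" path="/var/lib/rpm/Packages" dev="sda1" ino=262250 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1'
summary=$(printf '%s\n' "$denial" |
  sed -n 's/.*denied { \([^}]*\) }.*scontext=\([^ ]*\).*tclass=\([^ ]*\).*/\1 \2 \3/p')
echo "$summary"   # open system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 file
```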

pass 4609510 2019-12-16 22:13:25 2019-12-17 00:35:51 2019-12-17 00:53:50 0:17:59 0:11:55 0:06:04 smithi master rhel 8.0 rados/singleton-nomsgr/{all/librados_hello_world.yaml rados.yaml supported-random-distro$/{rhel_8.yaml}} 1
fail 4609511 2019-12-16 22:13:26 2019-12-17 00:36:47 2019-12-17 00:52:46 0:15:59 0:07:24 0:08:35 smithi master ubuntu 18.04 rados/perf/{ceph.yaml objectstore/bluestore-bitmap.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{ubuntu_latest.yaml} workloads/radosbench_4M_seq_read.yaml} 1
Failure Reason:

Command failed on smithi061 with status 1: '/home/ubuntu/cephtest/cbt/cbt.py -a /home/ubuntu/cephtest/archive/cbt /home/ubuntu/cephtest/archive/cbt/cbt_config.yaml'