Status Job ID Links Posted Started Updated Runtime Duration In Waiting Machine Teuthology Branch OS Type OS Version Description Nodes
fail 4242684 2019-08-23 03:12:24 2019-08-23 03:15:10 2019-08-23 03:31:09 0:15:59 0:07:59 0:08:00 smithi master ubuntu 18.04 rados/mgr/{clusters/{2-node-mgr.yaml openstack.yaml} debug/mgr.yaml objectstore/bluestore-low-osd-mem-target.yaml supported-random-distro$/{ubuntu_latest.yaml} tasks/orchestrator_cli.yaml} 2
Failure Reason:

Test failure: test_load_data (tasks.mgr.test_orchestrator_cli.TestOrchestratorCli)

fail 4242685 2019-08-23 03:12:24 2019-08-23 03:15:15 2019-08-23 07:53:19 4:38:04 3:15:20 1:22:44 smithi master rhel 7.6 rados/upgrade/mimic-x-singleton/{0-cluster/{openstack.yaml start.yaml} 1-install/mimic.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-workload/{rbd-cls.yaml rbd-import-export.yaml readwrite.yaml snaps-few-objects.yaml} 5-workload/{radosbench.yaml rbd_api.yaml} 6-finish-upgrade.yaml 7-nautilus.yaml 8-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} bluestore-bitmap.yaml supported-random-distro$/{rhel_7.yaml} thrashosds-health.yaml} 4
Failure Reason:

SELinux denials found on ubuntu@smithi150.front.sepia.ceph.com: ['type=AVC msg=audit(1566545803.697:7335): avc: denied { open } for pid=8581 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=685 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566536798.824:6562): avc: denied { open } for pid=8581 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=685 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566536798.824:6563): avc: denied { ioctl } for pid=8581 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=685 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566538598.910:6631): avc: denied { ioctl } for pid=8581 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=685 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566538598.910:6630): avc: denied { read } for pid=8581 comm="smartd" name="nvme0" dev="devtmpfs" ino=685 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566540404.059:6706): avc: denied { open } for pid=8581 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=685 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566540404.059:6707): avc: denied { ioctl } for pid=8581 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=685 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566542203.201:7132): avc: denied { open } for pid=8581 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=685 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566536798.824:6562): avc: denied { read } for pid=8581 comm="smartd" name="nvme0" dev="devtmpfs" ino=685 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566545803.697:7336): avc: denied { ioctl } for pid=8581 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=685 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566542203.201:7133): avc: denied { ioctl } for pid=8581 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=685 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566538598.910:6630): avc: denied { open } for pid=8581 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=685 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566544003.371:7274): avc: denied { open } for pid=8581 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=685 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566545803.697:7335): avc: denied { read } for pid=8581 comm="smartd" name="nvme0" dev="devtmpfs" ino=685 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566542203.201:7132): avc: denied { read } for pid=8581 comm="smartd" name="nvme0" dev="devtmpfs" ino=685 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566544003.371:7275): avc: denied { ioctl } for pid=8581 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=685 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566540404.059:6706): avc: denied { read } for pid=8581 comm="smartd" name="nvme0" dev="devtmpfs" ino=685 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566544003.371:7274): avc: denied { read } for pid=8581 comm="smartd" name="nvme0" dev="devtmpfs" ino=685 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1']
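Every SELinux failure in this run is the same pattern: smartd, confined in the fsdaemon_t domain, being denied open/read/ioctl on /dev/nvme0 (nvme_device_t). When triaging long denial lists like these, it helps to reduce each raw record to its distinguishing fields. A minimal parsing sketch, not part of the teuthology tooling; the record shape is taken from the strings above:

```python
import re

# Matches the AVC denial records seen in the failure reasons above,
# capturing the denied operations, pid, source command, and target class.
AVC_RE = re.compile(
    r"avc:\s+denied\s+\{\s*(?P<ops>[^}]+?)\s*\}\s+for\s+"
    r"pid=(?P<pid>\d+)\s+comm=\"(?P<comm>[^\"]+)\".*?"
    r"tclass=(?P<tclass>\S+)"
)

def parse_denial(record):
    """Extract the interesting fields from one AVC denial string."""
    m = AVC_RE.search(record)
    if m is None:
        raise ValueError("not an AVC denial: %r" % record)
    return {
        "ops": m.group("ops").split(),
        "pid": int(m.group("pid")),
        "comm": m.group("comm"),
        "tclass": m.group("tclass"),
    }

# One of the records from the failure reason above:
sample = (
    'type=AVC msg=audit(1566545803.697:7335): avc: denied { open } '
    'for pid=8581 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=685 '
    'scontext=system_u:system_r:fsdaemon_t:s0 '
    'tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1'
)
fields = parse_denial(sample)
```

Deduplicating on `(comm, ops, tclass)` collapses all of the denials in this run to a single smartd/nvme signature, which is why they look like one policy gap rather than many distinct problems.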

fail 4242686 2019-08-23 03:12:25 2019-08-23 03:15:26 2019-08-23 04:23:26 1:08:00 1:00:22 0:07:38 smithi master rhel 7.6 rados/thrash-erasure-code/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/fast.yaml msgr-failures/few.yaml objectstore/bluestore-stupid.yaml rados.yaml recovery-overrides/{more-active-recovery.yaml} supported-random-distro$/{rhel_7.yaml} thrashers/minsize_recovery.yaml thrashosds-health.yaml workloads/ec-radosbench.yaml} 2
Failure Reason:

not all PGs are active or peered 15 seconds after marking out OSDs
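This failure comes from the thrasher's health check: after marking OSDs out, every PG must report either active or peered within the grace window. The check can be sketched as below; the `pg_stats` shape is an assumption modeled on `ceph pg dump --format json` output, not taken from this run:

```python
def all_pgs_active_or_peered(pg_stats):
    """Return True if every PG reports 'active' or 'peered' among its
    state tokens. PG states are '+'-joined, e.g. 'active+clean'."""
    for pg in pg_stats:
        tokens = pg["state"].split("+")
        if "active" not in tokens and "peered" not in tokens:
            return False
    return True

# Hypothetical example inputs (field names assumed, see lead-in):
healthy = [{"state": "active+clean"}, {"state": "peered"}]
stuck = [{"state": "active+clean"}, {"state": "undersized+degraded+remapped"}]
```

A PG stuck in a state like `undersized+degraded` with neither token present is exactly what trips the 15-second assertion in the failure reason above.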

fail 4242687 2019-08-23 03:12:26 2019-08-23 03:17:28 2019-08-23 03:35:27 0:17:59 0:10:06 0:07:53 smithi master ubuntu 18.04 rados/singleton/{all/pg-autoscaler.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-low-osd-mem-target.yaml rados.yaml supported-random-distro$/{ubuntu_latest.yaml}} 2
Failure Reason:

"2019-08-23T03:33:12.045418+0000 mgr.x (mgr.4100) 118 : cluster [ERR] Unhandled exception from module 'pg_autoscaler' while running on mgr.x: (1,)" in cluster log

pass 4242688 2019-08-23 03:12:27 2019-08-23 03:17:28 2019-08-23 03:53:28 0:36:00 0:27:44 0:08:16 smithi master rhel 7.6 rados/thrash-erasure-code-overwrites/{bluestore-bitmap.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/normal.yaml msgr-failures/osd-delay.yaml rados.yaml recovery-overrides/{default.yaml} supported-random-distro$/{rhel_7.yaml} thrashers/pggrow.yaml thrashosds-health.yaml workloads/ec-small-objects-overwrites.yaml} 2
fail 4242689 2019-08-23 03:12:28 2019-08-23 03:17:45 2019-08-23 03:47:44 0:29:59 0:23:57 0:06:02 smithi master rhel 7.6 rados/perf/{ceph.yaml objectstore/bluestore-low-osd-mem-target.yaml openstack.yaml settings/optimized.yaml supported-random-distro$/{rhel_7.yaml} workloads/cosbench_64K_write.yaml} 1
Failure Reason:

Command failed on smithi137 with status 1: 'find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest'
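The cleanup command logs leftovers with `find -ls` and then calls `rmdir`, which only removes an empty directory, so status 1 here usually means the workload left files behind in /home/ubuntu/cephtest. A minimal reproduction of that failure mode with a throwaway directory:

```python
import os
import tempfile

# rmdir refuses to remove a non-empty directory; the failed cleanup
# above is the shell equivalent (`rmdir` exiting 1) of this OSError.
d = tempfile.mkdtemp()
leftover = os.path.join(d, "leftover")
open(leftover, "w").close()

try:
    os.rmdir(d)          # fails: directory not empty
    removed = True
except OSError:
    removed = False

# Remove the leftover first, then rmdir succeeds on the empty directory.
os.remove(leftover)
os.rmdir(d)
```

The `find -ls` half of the command exists precisely so the job log records what was left behind before the `rmdir` fails.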

fail 4242690 2019-08-23 03:12:29 2019-08-23 03:19:28 2019-08-23 03:55:27 0:35:59 0:28:32 0:07:27 smithi master rhel 7.6 rados/mgr/{clusters/{2-node-mgr.yaml openstack.yaml} debug/mgr.yaml objectstore/bluestore-low-osd-mem-target.yaml supported-random-distro$/{rhel_7.yaml} tasks/dashboard.yaml} 2
Failure Reason:

Test failure: test_create_get_update_delete_w_tenant (tasks.mgr.dashboard.test_rgw.RgwBucketTest)

fail 4242691 2019-08-23 03:12:30 2019-08-23 03:19:28 2019-08-23 04:03:27 0:43:59 0:33:04 0:10:55 smithi master rhel 7.6 rados/thrash-erasure-code-overwrites/{bluestore-bitmap.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/fast.yaml msgr-failures/osd-delay.yaml rados.yaml recovery-overrides/{more-active-recovery.yaml} supported-random-distro$/{rhel_7.yaml} thrashers/careful.yaml thrashosds-health.yaml workloads/ec-pool-snaps-few-objects-overwrites.yaml} 2
Failure Reason:

SELinux denials found on ubuntu@smithi100.front.sepia.ceph.com: ['type=AVC msg=audit(1566532670.255:34937): avc: denied { open } for pid=8542 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=15130 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566532670.255:34937): avc: denied { read } for pid=8542 comm="smartd" name="nvme0" dev="devtmpfs" ino=15130 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566532670.255:34938): avc: denied { ioctl } for pid=8542 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=15130 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1']

pass 4242692 2019-08-23 03:12:31 2019-08-23 03:19:29 2019-08-23 04:01:28 0:41:59 0:26:35 0:15:24 smithi master rhel 7.6 rados/thrash/{0-size-min-size-overrides/3-size-2-min-size.yaml 1-pg-log-overrides/short_pg_log.yaml 2-recovery-overrides/{default.yaml} backoff/normal.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} d-balancer/off.yaml msgr-failures/fastclose.yaml msgr/async-v2only.yaml objectstore/bluestore-bitmap.yaml rados.yaml supported-random-distro$/{rhel_7.yaml} thrashers/default.yaml thrashosds-health.yaml workloads/small-objects.yaml} 2
pass 4242693 2019-08-23 03:12:32 2019-08-23 03:19:40 2019-08-23 03:55:39 0:35:59 0:26:23 0:09:36 smithi master rhel 7.6 rados/thrash-erasure-code/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/normal.yaml msgr-failures/few.yaml objectstore/bluestore-comp.yaml rados.yaml recovery-overrides/{default.yaml} supported-random-distro$/{rhel_7.yaml} thrashers/careful.yaml thrashosds-health.yaml workloads/ec-small-objects.yaml} 2
fail 4242694 2019-08-23 03:12:32 2019-08-23 03:19:58 2019-08-23 04:03:57 0:43:59 0:29:18 0:14:41 smithi master rhel 7.6 rados/basic/{ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/many.yaml msgr/async.yaml objectstore/bluestore-stupid.yaml rados.yaml supported-random-distro$/{rhel_7.yaml} tasks/rados_workunit_loadgen_mostlyread.yaml} 2
Failure Reason:

SELinux denials found on ubuntu@smithi201.front.sepia.ceph.com: ['type=AVC msg=audit(1566532936.216:8031): avc: denied { read } for pid=8586 comm="smartd" name="nvme0" dev="devtmpfs" ino=15485 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566532936.216:8032): avc: denied { ioctl } for pid=8586 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=15485 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566532936.216:8031): avc: denied { open } for pid=8586 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=15485 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1']

pass 4242695 2019-08-23 03:12:33 2019-08-23 03:21:14 2019-08-23 03:57:13 0:35:59 0:27:01 0:08:58 smithi master rhel 7.6 rados/thrash-erasure-code-overwrites/{bluestore-bitmap.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} fast/normal.yaml msgr-failures/fastclose.yaml rados.yaml recovery-overrides/{more-active-recovery.yaml} supported-random-distro$/{rhel_7.yaml} thrashers/default.yaml thrashosds-health.yaml workloads/ec-small-objects-fast-read-overwrites.yaml} 2
fail 4242696 2019-08-23 03:12:34 2019-08-23 03:21:15 2019-08-23 04:09:14 0:47:59 0:31:02 0:16:57 smithi master rhel 7.6 rados/thrash-erasure-code-isa/{arch/x86_64.yaml ceph.yaml clusters/{fixed-2.yaml openstack.yaml} msgr-failures/fastclose.yaml objectstore/filestore-xfs.yaml rados.yaml recovery-overrides/{more-active-recovery.yaml} supported-random-distro$/{rhel_7.yaml} thrashers/pggrow.yaml thrashosds-health.yaml workloads/ec-rados-plugin=isa-k=2-m=1.yaml} 2
Failure Reason:

SELinux denials found on ubuntu@smithi019.front.sepia.ceph.com: ['type=AVC msg=audit(1566533117.289:32699): avc: denied { ioctl } for pid=8348 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=15635 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566533117.288:32698): avc: denied { read } for pid=8348 comm="smartd" name="nvme0" dev="devtmpfs" ino=15635 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566533117.288:32698): avc: denied { open } for pid=8348 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=15635 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1']

pass 4242697 2019-08-23 03:12:35 2019-08-23 03:23:25 2019-08-23 03:59:24 0:35:59 0:23:35 0:12:24 smithi master rhel 7.6 rados/monthrash/{ceph.yaml clusters/9-mons.yaml msgr-failures/few.yaml msgr/async-v2only.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{rhel_7.yaml} thrashers/sync.yaml workloads/pool-create-delete.yaml} 2
fail 4242698 2019-08-23 03:12:36 2019-08-23 03:23:25 2019-08-23 03:51:24 0:27:59 0:14:27 0:13:32 smithi master rhel 7.6 rados/mgr/{clusters/{2-node-mgr.yaml openstack.yaml} debug/mgr.yaml objectstore/bluestore-comp.yaml supported-random-distro$/{rhel_7.yaml} tasks/orchestrator_cli.yaml} 2
Failure Reason:

Test failure: test_load_data (tasks.mgr.test_orchestrator_cli.TestOrchestratorCli)

fail 4242699 2019-08-23 03:12:37 2019-08-23 03:23:25 2019-08-23 04:01:24 0:37:59 0:31:52 0:06:07 smithi master rhel 7.6 rados/singleton/{all/ec-lost-unfound.yaml msgr-failures/many.yaml msgr/async.yaml objectstore/bluestore-comp.yaml rados.yaml supported-random-distro$/{rhel_7.yaml}} 1
Failure Reason:

SELinux denials found on ubuntu@smithi063.front.sepia.ceph.com: ['type=AVC msg=audit(1566532714.500:10948): avc: denied { read } for pid=8569 comm="smartd" name="nvme0" dev="devtmpfs" ino=16457 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566532714.500:10949): avc: denied { ioctl } for pid=8569 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=16457 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566532714.500:10948): avc: denied { open } for pid=8569 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=16457 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1']

fail 4242700 2019-08-23 03:12:38 2019-08-23 03:23:25 2019-08-23 04:03:24 0:39:59 0:30:26 0:09:33 smithi master rhel 7.6 rados/singleton/{all/lost-unfound.yaml msgr-failures/few.yaml msgr/async.yaml objectstore/filestore-xfs.yaml rados.yaml supported-random-distro$/{rhel_7.yaml}} 1
Failure Reason:

SELinux denials found on ubuntu@smithi008.front.sepia.ceph.com: ['type=AVC msg=audit(1566532816.520:10448): avc: denied { read } for pid=8333 comm="smartd" name="nvme0" dev="devtmpfs" ino=17430 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566532816.520:10448): avc: denied { open } for pid=8333 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=17430 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566532816.521:10449): avc: denied { ioctl } for pid=8333 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=17430 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1']