User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Pass | Fail |
---|---|---|---|---|---|---|---|---|---|---|
yuriw | 2019-08-23 15:21:31 | 2019-08-23 17:52:48 | 2019-08-23 23:50:54 | 5:58:06 | fs | wip_nautilus_14.2.3_RC1 | smithi | e157074 | 26 | 18 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 4245653 | 2019-08-23 15:21:36 | 2019-08-23 17:45:05 | 2019-08-23 18:11:05 | 0:26:00 | 0:19:36 | 0:06:24 | smithi | master | rhel | 7.5 | fs/32bits/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{faked-ino.yaml frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 2 | |
Failure Reason:
SELinux denials found on ubuntu@smithi058.front.sepia.ceph.com: ['type=AVC msg=audit(1566582767.260:3086): avc: denied { open } for pid=21898 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=10612 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566582767.260:3087): avc: denied { ioctl } for pid=21898 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=10612 ioctlcmd=4e40 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566582767.260:3086): avc: denied { read } for pid=21898 comm="smartd" name="nvme0" dev="devtmpfs" ino=10612 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1'] |
fail | 4245654 | 2019-08-23 15:21:37 | 2019-08-23 17:45:07 | 2019-08-23 20:07:08 | 2:22:01 | 0:12:15 | 2:09:46 | smithi | master | ubuntu | 18.04 | fs/multiclient/{begin.yaml clusters/1-mds-2-client.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} distros/ubuntu_latest.yaml mount/fuse.yaml objectstore-ec/bluestore-ec-root.yaml overrides/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} tasks/cephfs_misc_tests.yaml} | 4 | |
Failure Reason:
Test failure: test_drop_cache_command_dead (tasks.cephfs.test_misc.TestCacheDrop) |
fail | 4245655 | 2019-08-23 15:21:38 | 2019-08-23 17:45:07 | 2019-08-23 18:41:07 | 0:56:00 | 0:47:40 | 0:08:20 | smithi | master | rhel | 7.5 | fs/multifs/{begin.yaml clusters/1a3s-mds-2c-client.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/failover.yaml} | 2 | |
Failure Reason:
SELinux denials found on ubuntu@smithi164.front.sepia.ceph.com: ['type=AVC msg=audit(1566582879.877:3057): avc: denied { read } for pid=21929 comm="smartd" name="nvme0" dev="devtmpfs" ino=1503 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566582879.877:3058): avc: denied { ioctl } for pid=21929 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1503 ioctlcmd=4e40 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566584680.111:15801): avc: denied { read } for pid=21929 comm="smartd" name="nvme0" dev="devtmpfs" ino=1503 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566582879.877:3057): avc: denied { open } for pid=21929 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1503 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566584680.111:15802): avc: denied { ioctl } for pid=21929 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1503 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566584680.111:15801): avc: denied { open } for pid=21929 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1503 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1'] |
fail | 4245656 | 2019-08-23 15:21:38 | 2019-08-23 17:47:18 | 2019-08-23 21:29:21 | 3:42:03 | 3:31:49 | 0:10:14 | smithi | master | rhel | 7.5 | fs/thrash/{begin.yaml ceph-thrash/default.yaml clusters/1-mds-1-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml msgr-failures/osd-mds-delay.yaml objectstore-ec/filestore-xfs.yaml overrides/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/cfuse_workunit_snaptests.yaml} | 2 | |
Failure Reason:
SELinux denials found on ubuntu@smithi172.front.sepia.ceph.com: ['type=AVC msg=audit(1566588534.653:14298): avc: denied { open } for pid=22030 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=10768 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583134.175:3081): avc: denied { ioctl } for pid=22030 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=10768 ioctlcmd=4e40 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566588534.653:14299): avc: denied { ioctl } for pid=22030 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=10768 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566584934.454:9018): avc: denied { open } for pid=22030 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=10768 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566590339.778:16795): avc: denied { read } for pid=22030 comm="smartd" name="nvme0" dev="devtmpfs" ino=10768 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566593940.102:21987): avc: denied { open } for pid=22030 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=10768 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566584934.454:9018): avc: denied { read } for pid=22030 comm="smartd" name="nvme0" dev="devtmpfs" ino=10768 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566588534.653:14298): avc: denied { read } for pid=22030 comm="smartd" name="nvme0" dev="devtmpfs" ino=10768 
scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583134.175:3080): avc: denied { read } for pid=22030 comm="smartd" name="nvme0" dev="devtmpfs" ino=10768 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566584934.454:9019): avc: denied { ioctl } for pid=22030 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=10768 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583134.175:3080): avc: denied { open } for pid=22030 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=10768 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566593940.102:21987): avc: denied { read } for pid=22030 comm="smartd" name="nvme0" dev="devtmpfs" ino=10768 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566593940.102:21988): avc: denied { ioctl } for pid=22030 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=10768 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566592139.935:19361): avc: denied { ioctl } for pid=22030 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=10768 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566586734.512:11873): avc: denied { open } for pid=22030 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=10768 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566590339.778:16796): avc: denied { ioctl } for pid=22030 comm="smartd" 
path="/dev/nvme0" dev="devtmpfs" ino=10768 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566586734.512:11873): avc: denied { read } for pid=22030 comm="smartd" name="nvme0" dev="devtmpfs" ino=10768 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566592139.935:19360): avc: denied { read } for pid=22030 comm="smartd" name="nvme0" dev="devtmpfs" ino=10768 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566590339.778:16795): avc: denied { open } for pid=22030 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=10768 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566586734.512:11874): avc: denied { ioctl } for pid=22030 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=10768 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566592139.935:19360): avc: denied { open } for pid=22030 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=10768 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1'] |
fail | 4245657 | 2019-08-23 15:21:39 | 2019-08-23 17:48:56 | 2019-08-23 18:12:56 | 0:24:00 | 0:16:10 | 0:07:50 | smithi | master | rhel | 7.5 | fs/permission/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 2 | |
Failure Reason:
SELinux denials found on ubuntu@smithi132.front.sepia.ceph.com: ['type=AVC msg=audit(1566583111.030:3056): avc: denied { ioctl } for pid=21896 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=11339 ioctlcmd=4e40 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583111.030:3055): avc: denied { read } for pid=21896 comm="smartd" name="nvme0" dev="devtmpfs" ino=11339 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583111.030:3055): avc: denied { open } for pid=21896 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=11339 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1'] |
pass | 4245658 | 2019-08-23 15:21:40 | 2019-08-23 17:49:12 | 2019-08-23 19:25:12 | 1:36:00 | 0:26:28 | 1:09:32 | smithi | master | rhel | 7.5 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-bitmap.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/cephfs_scrub_tests.yaml} | 2 | |
fail | 4245659 | 2019-08-23 15:21:41 | 2019-08-23 17:49:29 | 2019-08-23 18:15:28 | 0:25:59 | 0:15:13 | 0:10:46 | smithi | master | rhel | 7.5 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-ec-root.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/cfuse_workunit_quota.yaml} | 2 | |
Failure Reason:
SELinux denials found on ubuntu@smithi181.front.sepia.ceph.com: ['type=AVC msg=audit(1566583283.543:3070): avc: denied { ioctl } for pid=21951 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1479 ioctlcmd=4e40 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583283.543:3069): avc: denied { read } for pid=21951 comm="smartd" name="nvme0" dev="devtmpfs" ino=1479 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583283.543:3069): avc: denied { open } for pid=21951 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1479 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1'] |
fail | 4245660 | 2019-08-23 15:21:42 | 2019-08-23 17:50:33 | 2019-08-23 18:24:32 | 0:33:59 | 0:25:55 | 0:08:04 | smithi | master | rhel | 7.5 | fs/basic_workload/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml omap_limit/10000.yaml overrides/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/cfuse_workunit_suites_blogbench.yaml} | 2 | |
Failure Reason:
SELinux denials found on ubuntu@smithi194.front.sepia.ceph.com: ['type=AVC msg=audit(1566583181.061:3056): avc: denied { read } for pid=21887 comm="smartd" name="nvme0" dev="devtmpfs" ino=1520 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583181.061:3057): avc: denied { ioctl } for pid=21887 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1520 ioctlcmd=4e40 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583181.061:3056): avc: denied { open } for pid=21887 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1520 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1'] |
pass | 4245661 | 2019-08-23 15:21:43 | 2019-08-23 17:50:56 | 2019-08-23 21:06:58 | 3:16:02 | 3:08:20 | 0:07:42 | smithi | master | centos | | fs/verify/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml overrides/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 2 | |
pass | 4245662 | 2019-08-23 15:21:43 | 2019-08-23 17:52:48 | 2019-08-23 23:50:54 | 5:58:06 | 0:08:54 | 5:49:12 | smithi | master | ubuntu | 18.04 | fs/multiclient/{begin.yaml clusters/1-mds-3-client.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} distros/ubuntu_latest.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} tasks/ior-shared-file.yaml} | 5 | |
fail | 4245663 | 2019-08-23 15:21:44 | 2019-08-23 17:52:51 | 2019-08-23 18:34:50 | 0:41:59 | 0:30:36 | 0:11:23 | smithi | master | ubuntu | 16.04 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-bitmap.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{ubuntu_16.04.yaml} tasks/client-recovery.yaml} | 2 | |
Failure Reason:
"2019-08-23 18:29:36.873808 mon.b (mon.0) 1321 : cluster [WRN] Health check failed: 1 clients failing to respond to capability release (MDS_CLIENT_LATE_RELEASE)" in cluster log |
pass | 4245664 | 2019-08-23 15:21:45 | 2019-08-23 17:52:53 | 2019-08-23 18:58:53 | 1:06:00 | 0:39:16 | 0:26:44 | smithi | master | rhel | 7.5 | fs/basic_workload/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp.yaml omap_limit/10000.yaml overrides/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/cfuse_workunit_suites_ffsb.yaml} | 2 | |
pass | 4245665 | 2019-08-23 15:21:46 | 2019-08-23 17:52:55 | 2019-08-23 18:46:55 | 0:54:00 | 0:20:29 | 0:33:31 | smithi | master | rhel | 7.5 | fs/32bits/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{faked-ino.yaml frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 2 | |
fail | 4245666 | 2019-08-23 15:21:47 | 2019-08-23 17:52:56 | 2019-08-23 18:46:56 | 0:54:00 | 0:47:30 | 0:06:30 | smithi | master | rhel | 7.5 | fs/multifs/{begin.yaml clusters/1a3s-mds-2c-client.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/failover.yaml} | 2 | |
Failure Reason:
SELinux denials found on ubuntu@smithi202.front.sepia.ceph.com: ['type=AVC msg=audit(1566585056.416:15989): avc: denied { open } for pid=21941 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1572 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566585056.416:15989): avc: denied { read } for pid=21941 comm="smartd" name="nvme0" dev="devtmpfs" ino=1572 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583251.238:3063): avc: denied { read } for pid=21941 comm="smartd" name="nvme0" dev="devtmpfs" ino=1572 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583251.238:3063): avc: denied { open } for pid=21941 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1572 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583251.238:3064): avc: denied { ioctl } for pid=21941 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1572 ioctlcmd=4e40 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566585056.416:15990): avc: denied { ioctl } for pid=21941 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1572 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1'] |
fail | 4245667 | 2019-08-23 15:21:48 | 2019-08-23 17:53:01 | 2019-08-23 19:25:02 | 1:32:01 | 1:18:50 | 0:13:11 | smithi | master | rhel | 7.5 | fs/snaps/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/snaptests.yaml} | 2 | |
Failure Reason:
SELinux denials found on ubuntu@smithi106.front.sepia.ceph.com: ['type=AVC msg=audit(1566587279.296:7232): avc: denied { ioctl } for pid=21941 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1522 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566585481.134:6840): avc: denied { ioctl } for pid=21941 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1522 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566585481.134:6839): avc: denied { open } for pid=21941 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1522 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566585481.134:6839): avc: denied { read } for pid=21941 comm="smartd" name="nvme0" dev="devtmpfs" ino=1522 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583679.861:3079): avc: denied { open } for pid=21941 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1522 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566587279.296:7231): avc: denied { read } for pid=21941 comm="smartd" name="nvme0" dev="devtmpfs" ino=1522 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583679.861:3079): avc: denied { read } for pid=21941 comm="smartd" name="nvme0" dev="devtmpfs" ino=1522 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566587279.296:7231): avc: denied { open } for pid=21941 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1522 scontext=system_u:system_r:fsdaemon_t:s0 
tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583679.861:3080): avc: denied { ioctl } for pid=21941 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=1522 ioctlcmd=4e40 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1'] |
fail | 4245668 | 2019-08-23 15:21:49 | 2019-08-23 17:54:52 | 2019-08-23 18:38:51 | 0:43:59 | 0:29:04 | 0:14:55 | smithi | master | rhel | 7.5 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-bitmap.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/damage.yaml} | 2 | |
Failure Reason:
SELinux denials found on ubuntu@smithi130.front.sepia.ceph.com: ['type=AVC msg=audit(1566583840.399:3063): avc: denied { read } for pid=21936 comm="smartd" name="nvme0" dev="devtmpfs" ino=12500 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583840.399:3064): avc: denied { ioctl } for pid=21936 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=12500 ioctlcmd=4e40 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583840.399:3063): avc: denied { open } for pid=21936 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=12500 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1'] |
fail | 4245669 | 2019-08-23 15:21:49 | 2019-08-23 17:54:52 | 2019-08-23 18:26:51 | 0:31:59 | 0:15:35 | 0:16:24 | smithi | master | rhel | 7.5 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-bitmap.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/mds-flush.yaml} | 2 | |
Failure Reason:
SELinux denials found on ubuntu@smithi162.front.sepia.ceph.com: ['type=AVC msg=audit(1566583909.591:3062): avc: denied { read } for pid=21940 comm="smartd" name="nvme0" dev="devtmpfs" ino=9596 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583909.591:3063): avc: denied { ioctl } for pid=21940 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=9596 ioctlcmd=4e40 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583909.591:3062): avc: denied { open } for pid=21940 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=9596 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1'] |
pass | 4245670 | 2019-08-23 15:21:50 | 2019-08-23 17:55:04 | 2019-08-23 18:51:03 | 0:55:59 | 0:36:45 | 0:19:14 | smithi | master | rhel | 7.5 | fs/basic_workload/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml omap_limit/10000.yaml overrides/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/cfuse_workunit_suites_iogen.yaml} | 2 | |
fail | 4245671 | 2019-08-23 15:21:51 | 2019-08-23 17:55:29 | 2019-08-23 18:21:28 | 0:25:59 | 0:17:00 | 0:08:59 | smithi | master | rhel | 7.5 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-ec-root.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/pool-perm.yaml} | 2 | |
Failure Reason:
SELinux denials found on ubuntu@smithi059.front.sepia.ceph.com: ['type=AVC msg=audit(1566583536.311:3057): avc: denied { ioctl } for pid=21873 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=13429 ioctlcmd=4e40 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583536.311:3056): avc: denied { read } for pid=21873 comm="smartd" name="nvme0" dev="devtmpfs" ino=13429 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583536.311:3056): avc: denied { open } for pid=21873 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=13429 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1'] |
pass | 4245672 | 2019-08-23 15:21:52 | 2019-08-23 17:58:54 | 2019-08-23 21:10:56 | 3:12:02 | 2:51:25 | 0:20:37 | smithi | master | centos | | fs/verify/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 2 | |
pass | 4245673 | 2019-08-23 15:21:53 | 2019-08-23 17:58:54 | 2019-08-23 18:32:53 | 0:33:59 | 0:19:31 | 0:14:28 | smithi | master | rhel | 7.5 | fs/32bits/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{faked-ino.yaml frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/cfuse_workunit_suites_fsstress.yaml} | 2 | |
fail | 4245674 | 2019-08-23 15:21:54 | 2019-08-23 17:58:55 | 2019-08-23 18:56:55 | 0:58:00 | 0:47:31 | 0:10:29 | smithi | master | rhel | 7.5 | fs/multifs/{begin.yaml clusters/1a3s-mds-2c-client.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/failover.yaml} | 2 | |
Failure Reason:
SELinux denials found on ubuntu@smithi186.front.sepia.ceph.com: ['type=AVC msg=audit(1566583808.798:3063): avc: denied { ioctl } for pid=21944 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=8140 ioctlcmd=4e40 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566585608.967:8015): avc: denied { ioctl } for pid=21944 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=8140 ioctlcmd=4e41 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583808.798:3062): avc: denied { open } for pid=21944 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=8140 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566585608.967:8014): avc: denied { read } for pid=21944 comm="smartd" name="nvme0" dev="devtmpfs" ino=8140 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566585608.967:8014): avc: denied { open } for pid=21944 comm="smartd" path="/dev/nvme0" dev="devtmpfs" ino=8140 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1', 'type=AVC msg=audit(1566583808.798:3062): avc: denied { read } for pid=21944 comm="smartd" name="nvme0" dev="devtmpfs" ino=8140 scontext=system_u:system_r:fsdaemon_t:s0 tcontext=system_u:object_r:nvme_device_t:s0 tclass=chr_file permissive=1'] |
pass | 4245675 | 2019-08-23 15:21:54 | 2019-08-23 18:01:16 | 2019-08-23 18:41:16 | 0:40:00 | 0:14:59 | 0:25:01 | smithi | master | rhel | 7.5 | fs/basic_workload/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml omap_limit/10.yaml overrides/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/cfuse_workunit_suites_truncate_delay.yaml} | 2 | |
pass | 4245676 | 2019-08-23 15:21:55 | 2019-08-23 18:02:35 | 2019-08-23 18:52:34 | 0:49:59 | 0:31:44 | 0:18:15 | smithi | master | ubuntu | 16.04 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-bitmap.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{ubuntu_16.04.yaml} tasks/strays.yaml} | 2 | |
pass | 4245677 | 2019-08-23 15:21:56 | 2019-08-23 18:03:04 | 2019-08-23 18:41:03 | 0:37:59 | 0:22:39 | 0:15:20 | smithi | master | rhel | 7.5 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-ec-root.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/test_journal_migration.yaml} | 2 | |
pass | 4245678 | 2019-08-23 15:21:57 | 2019-08-23 18:05:08 | 2019-08-23 18:47:07 | 0:41:59 | 0:30:52 | 0:11:07 | smithi | master | rhel | 7.5 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-bitmap.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/volume-client/{py/2.yaml task/test/{test.yaml}}} | 2 | |
pass | 4245679 | 2019-08-23 15:21:58 | 2019-08-23 18:05:08 | 2019-08-23 18:55:07 | 0:49:59 | 0:39:11 | 0:10:48 | smithi | master | rhel | 7.5 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-ec-root.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/volumes.yaml} | 2 | |
pass | 4245680 | 2019-08-23 15:21:59 | 2019-08-23 18:06:56 | 2019-08-23 18:42:56 | 0:36:00 | 0:15:40 | 0:20:20 | smithi | master | rhel | 7.5 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-ec-root.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/alternate-pool.yaml} | 2 | |
fail | 4245681 | 2019-08-23 15:21:59 | 2019-08-23 18:06:56 | 2019-08-23 18:08:55 | 0:01:59 | 0 | | smithi | master | | | fs/bugs/conf/{client.yaml mds.yaml mon.yaml osd.yaml} | — |
Failure Reason:
list index out of range |
||||||||||||||
fail | 4245682 | 2019-08-23 15:22:00 | 2019-08-23 18:07:07 | 2019-08-23 20:49:08 | 2:42:01 | 0:12:45 | 2:29:16 | smithi | master | ubuntu | 18.04 | fs/multiclient/{begin.yaml clusters/1-mds-3-client.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} distros/ubuntu_latest.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml overrides/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} tasks/cephfs_misc_tests.yaml} | 5 | |
Failure Reason:
Test failure: test_drop_cache_command_dead (tasks.cephfs.test_misc.TestCacheDrop) |
||||||||||||||
pass | 4245683 | 2019-08-23 15:22:01 | 2019-08-23 18:09:10 | 2019-08-23 18:47:10 | 0:38:00 | 0:28:32 | 0:09:28 | smithi | master | rhel | 7.5 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-bitmap.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/asok_dump_tree.yaml} | 2 | |
pass | 4245684 | 2019-08-23 15:22:02 | 2019-08-23 18:10:57 | 2019-08-23 18:38:56 | 0:27:59 | 0:15:50 | 0:12:09 | smithi | master | rhel | 7.5 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-bitmap.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/backtrace.yaml} | 2 | |
pass | 4245685 | 2019-08-23 15:22:03 | 2019-08-23 18:10:57 | 2019-08-23 18:54:56 | 0:43:59 | 0:27:44 | 0:16:15 | smithi | master | rhel | 7.5 | fs/basic_workload/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/bluestore-comp-ec-root.yaml omap_limit/10000.yaml overrides/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/cfuse_workunit_suites_blogbench.yaml} | 2 | |
pass | 4245686 | 2019-08-23 15:22:03 | 2019-08-23 18:11:06 | 2019-08-23 21:25:09 | 3:14:03 | 3:03:09 | 0:10:54 | smithi | master | centos | | fs/verify/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{frag_enable.yaml mon-debug.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} tasks/cfuse_workunit_suites_fsstress.yaml validater/valgrind.yaml} | 2 |
fail | 4245687 | 2019-08-23 15:22:04 | 2019-08-23 15:58:50 | 2019-08-23 16:52:49 | 0:53:59 | 0:38:28 | 0:15:31 | smithi | master | centos | 7.5 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-ec-root.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{centos_latest.yaml} tasks/client-recovery.yaml} | 2 | |
Failure Reason:
"2019-08-23 16:45:35.651028 mon.b (mon.0) 1342 : cluster [WRN] Health check failed: 1 clients failing to respond to capability release (MDS_CLIENT_LATE_RELEASE)" in cluster log |
||||||||||||||
pass | 4245688 | 2019-08-23 15:22:05 | 2019-08-23 18:11:07 | 2019-08-23 18:45:07 | 0:34:00 | 0:16:05 | 0:17:55 | smithi | master | rhel | 7.5 | fs/permission/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/bluestore-comp.yaml overrides/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 2 | |
pass | 4245689 | 2019-08-23 15:22:06 | 2019-08-23 18:13:09 | 2019-08-23 19:59:10 | 1:46:01 | 1:12:27 | 0:33:34 | smithi | master | rhel | 7.5 | fs/snaps/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/snaptests.yaml} | 2 | |
pass | 4245690 | 2019-08-23 15:22:07 | 2019-08-23 18:13:10 | 2019-08-23 19:17:10 | 1:04:00 | 0:35:51 | 0:28:09 | smithi | master | rhel | 7.5 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-bitmap.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/mds-full.yaml} | 2 | |
pass | 4245691 | 2019-08-23 15:22:07 | 2019-08-23 18:13:09 | 2019-08-23 19:03:09 | 0:50:00 | 0:16:47 | 0:33:13 | smithi | master | rhel | 7.5 | fs/basic_workload/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/no.yaml mount/fuse.yaml objectstore-ec/filestore-xfs.yaml omap_limit/10000.yaml overrides/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 2 | |
pass | 4245692 | 2019-08-23 15:22:08 | 2019-08-23 18:13:10 | 2019-08-23 18:39:09 | 0:25:59 | 0:16:01 | 0:09:58 | smithi | master | rhel | 7.5 | fs/32bits/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore-ec/filestore-xfs.yaml overrides/{faked-ino.yaml frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/cfuse_workunit_suites_pjd.yaml} | 2 | |
pass | 4245693 | 2019-08-23 15:22:09 | 2019-08-23 18:15:09 | 2019-08-23 18:53:08 | 0:37:59 | 0:14:58 | 0:23:01 | smithi | master | rhel | 7.5 | fs/basic_workload/{begin.yaml clusters/fixed-2-ucephfs.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} inline/yes.yaml mount/fuse.yaml objectstore-ec/bluestore-bitmap.yaml omap_limit/10.yaml overrides/{frag_enable.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/cfuse_workunit_suites_truncate_delay.yaml} | 2 | |
fail | 4245694 | 2019-08-23 15:22:10 | 2019-08-23 18:15:09 | 2019-08-23 19:05:08 | 0:49:59 | 0:20:23 | 0:29:36 | smithi | master | rhel | 7.5 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-ec-root.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/strays.yaml} | 2 | |
Failure Reason:
Test failure: test_files_throttle (tasks.cephfs.test_strays.TestStrays) |
||||||||||||||
pass | 4245695 | 2019-08-23 15:22:11 | 2019-08-23 18:15:09 | 2019-08-23 19:09:08 | 0:53:59 | 0:30:15 | 0:23:44 | smithi | master | rhel | 7.5 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-ec-root.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/volume-client/{py/2.yaml task/test/{test.yaml}}} | 2 | |
pass | 4245696 | 2019-08-23 15:22:11 | 2019-08-23 18:15:09 | 2019-08-23 19:19:09 | 1:04:00 | 0:40:05 | 0:23:55 | smithi | master | rhel | 7.5 | fs/basic_functional/{begin.yaml clusters/1-mds-4-client-coloc.yaml conf/{client.yaml mds.yaml mon.yaml osd.yaml} mount/fuse.yaml objectstore/bluestore-bitmap.yaml overrides/{frag_enable.yaml no_client_pidfile.yaml whitelist_health.yaml whitelist_wrongly_marked_down.yaml} supported-random-distros$/{rhel_latest.yaml} tasks/volumes.yaml} | 2 |