User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Pass | Fail |
---|---|---|---|---|---|---|---|---|---|---|
vshankar | 2023-01-30 10:51:51 | 2023-01-30 10:56:16 | 2023-01-30 18:50:34 | 7:54:18 | fs | wip-vshankar-testing-20230125.055346 | smithi | 860d3ac | 2 | 35 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fail | 7143789 | 2023-01-30 10:51:56 | 2023-01-30 10:56:16 | 2023-01-30 17:46:17 | 6:50:01 | 6:36:53 | 0:13:08 | smithi | main | centos | 8.stream | fs/verify/{begin/{0-install 1-ceph 2-logrotate} clusters/1a5s-mds-1c-client conf/{client mds mon osd} distro/{centos_8} mount/fuse objectstore-ec/bluestore-comp-ec-root overrides/{ignorelist_health ignorelist_wrongly_marked_down mon-debug session_timeout} ranks/3 tasks/fsstress validater/valgrind} | 2 | |
Failure Reason:
Command failed (workunit test suites/fsstress.sh) on smithi037 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=860d3ac8d012f04a6667eb411e085656deb62181 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/fsstress.sh' |
fail | 7143790 | 2023-01-30 10:51:57 | 2023-01-30 10:57:17 | 2023-01-30 11:28:32 | 0:31:15 | 0:19:04 | 0:12:11 | smithi | main | centos | 8.stream | fs/upgrade/mds_upgrade_sequence/{bluestore-bitmap centos_8.stream_container_tools conf/{client mds mon osd} fail_fs/yes overrides/{ignorelist_health ignorelist_wrongly_marked_down pg-warn syntax} roles tasks/{0-from/v16.2.4 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/no 3-inline/yes 4-verify} 2-client 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}} | 2 | |
Failure Reason:
Command failed on smithi098 with status 22: "sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v16.2.4 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid fcb5a934-a08f-11ed-9e56-001a4aab830c -e sha1=860d3ac8d012f04a6667eb411e085656deb62181 -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1 --daemon-types mgr'" |
fail | 7143791 | 2023-01-30 10:51:58 | 2023-01-30 10:57:27 | 2023-01-30 11:42:57 | 0:45:30 | 0:27:39 | 0:17:51 | smithi | main | centos | 8.stream | fs/upgrade/featureful_client/old_client/{bluestore-bitmap centos_latest clusters/1-mds-2-client-micro conf/{client mds mon osd} overrides/{ignorelist_health ignorelist_wrongly_marked_down multimds/no pg-warn} tasks/{0-octopus 1-client 2-upgrade 3-compat_client/quincy}} | 3 | |
Failure Reason:
"2023-01-30T11:30:07.352835+0000 mgr.y (mgr.14145) 1 : cluster [ERR] Failed to load ceph-mgr modules: prometheus" in cluster log |
fail | 7143792 | 2023-01-30 10:51:59 | 2023-01-30 11:03:18 | 2023-01-30 11:42:49 | 0:39:31 | 0:33:27 | 0:06:04 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v2} mount overrides/{distro/stock/{k-stock rhel_8} ms-die-on-skipped}} ms_mode/legacy wsync/yes} objectstore-ec/bluestore-comp-ec-root omap_limit/10000 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/3 replication/always} standby-replay tasks/{0-subvolume/{with-no-extra-options} 1-check-counter 2-scrub/no 3-snaps/yes 4-flush/yes 5-workunit/suites/pjd}} | 3 | |
Failure Reason:
Command failed (workunit test suites/pjd.sh) on smithi077 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=860d3ac8d012f04a6667eb411e085656deb62181 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh' |
fail | 7143793 | 2023-01-30 10:51:59 | 2023-01-30 11:04:09 | 2023-01-30 11:41:40 | 0:37:31 | 0:24:07 | 0:13:24 | smithi | main | rhel | 8.6 | fs/thrash/workloads/{begin/{0-install 1-ceph 2-logrotate} clusters/1a5s-mds-1c-client conf/{client mds mon osd} distro/{rhel_8} mount/kclient/{mount-syntax/{v1} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} msgr-failures/none objectstore-ec/bluestore-ec-root overrides/{frag ignorelist_health ignorelist_wrongly_marked_down prefetch_dirfrags/no prefetch_entire_dirfrags/no races session_timeout thrashosds-health} ranks/3 tasks/{1-thrash/mds 2-workunit/suites/pjd}} | 2 | |
Failure Reason:
SELinux denials found on ubuntu@smithi179.front.sepia.ceph.com: ['type=AVC msg=audit(1675077839.912:14789): avc: denied { read } for pid=67613 comm="rpm" name="Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675077839.912:14791): avc: denied { map } for pid=67613 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=262275 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675077325.588:199): avc: denied { read } for pid=1489 comm="rpm" name="Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675077325.589:200): avc: denied { lock } for pid=1489 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675077839.209:14772): avc: denied { search } for pid=1408 comm=72733A6D61696E20513A526567 name="cephtest" dev="sda1" ino=524293 scontext=system_u:system_r:syslogd_t:s0 tcontext=unconfined_u:object_r:user_home_t:s0 tclass=dir permissive=1', 'type=AVC msg=audit(1675077839.912:14790): avc: denied { lock } for pid=67613 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675077325.589:201): avc: denied { map } for pid=1489 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=262275 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675077325.588:199): avc: denied { open } for pid=1489 comm="rpm" 
path="/var/lib/rpm/Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675077839.912:14789): avc: denied { open } for pid=67613 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1'] |
fail | 7143794 | 2023-01-30 10:52:00 | 2023-01-30 11:07:09 | 2023-01-30 11:36:53 | 0:29:44 | 0:19:09 | 0:10:35 | smithi | main | centos | 8.stream | fs/upgrade/mds_upgrade_sequence/{bluestore-bitmap centos_8.stream_container_tools conf/{client mds mon osd} fail_fs/no overrides/{ignorelist_health ignorelist_wrongly_marked_down pg-warn syntax} roles tasks/{0-from/v16.2.4 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/no 3-inline/no 4-verify} 2-client 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}} | 2 | |
Failure Reason:
Command failed on smithi107 with status 22: "sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v16.2.4 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 44758a9a-a091-11ed-9e56-001a4aab830c -e sha1=860d3ac8d012f04a6667eb411e085656deb62181 -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1 --daemon-types mgr'" |
fail | 7143795 | 2023-01-30 10:52:01 | 2023-01-30 11:07:30 | 2023-01-30 11:52:47 | 0:45:17 | 0:31:17 | 0:14:00 | smithi | main | ubuntu | 20.04 | fs/mirror-ha/{begin/{0-install 1-ceph 2-logrotate} cephfs-mirror/three-per-cluster clients/{mirror} cluster/{1-node} objectstore/bluestore-bitmap overrides/{whitelist_health} supported-random-distro$/{ubuntu_latest} workloads/cephfs-mirror-ha-workunit} | 1 | |
Failure Reason:
reached maximum tries (50) after waiting for 300 seconds |
pass | 7143796 | 2023-01-30 10:52:02 | 2023-01-30 11:11:00 | 2023-01-30 13:51:44 | 2:40:44 | 2:29:44 | 0:11:00 | smithi | main | ubuntu | 20.04 | fs/thrash/workloads/{begin/{0-install 1-ceph 2-logrotate} clusters/1a5s-mds-1c-client conf/{client mds mon osd} distro/{ubuntu_latest} mount/fuse msgr-failures/osd-mds-delay objectstore-ec/bluestore-bitmap overrides/{frag ignorelist_health ignorelist_wrongly_marked_down prefetch_dirfrags/yes prefetch_entire_dirfrags/yes races session_timeout thrashosds-health} ranks/5 tasks/{1-thrash/mds 2-workunit/fs/snaps}} | 2 | |
fail | 7143797 | 2023-01-30 10:52:03 | 2023-01-30 11:11:51 | 2023-01-30 11:56:59 | 0:45:08 | 0:29:45 | 0:15:23 | smithi | main | centos | 8.stream | fs/upgrade/featureful_client/old_client/{bluestore-bitmap centos_latest clusters/1-mds-2-client-micro conf/{client mds mon osd} overrides/{ignorelist_health ignorelist_wrongly_marked_down multimds/no pg-warn} tasks/{0-octopus 1-client 2-upgrade 3-compat_client/no}} | 3 | |
Failure Reason:
"2023-01-30T11:41:53.301704+0000 mgr.y (mgr.24102) 1 : cluster [ERR] Failed to load ceph-mgr modules: prometheus" in cluster log |
fail | 7143798 | 2023-01-30 10:52:04 | 2023-01-30 11:15:42 | 2023-01-30 11:41:54 | 0:26:12 | 0:13:45 | 0:12:27 | smithi | main | ubuntu | 20.04 | fs/functional/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-4c-client conf/{client mds mon osd} distro/{ubuntu_latest} mount/fuse objectstore/bluestore-ec-root overrides/{ignorelist_health ignorelist_wrongly_marked_down no_client_pidfile} tasks/client-limits} | 2 | |
Failure Reason:
Test failure: test_client_cache_size (tasks.cephfs.test_client_limits.TestClientLimits) |
fail | 7143799 | 2023-01-30 10:52:05 | 2023-01-30 11:18:02 | 2023-01-30 11:50:20 | 0:32:18 | 0:20:50 | 0:11:28 | smithi | main | centos | 8.stream | fs/upgrade/featureful_client/upgraded_client/{bluestore-bitmap centos_latest clusters/1-mds-2-client-micro conf/{client mds mon osd} overrides/{ignorelist_health ignorelist_wrongly_marked_down multimds/no pg-warn} tasks/{0-octopus 1-client 2-upgrade 3-client-upgrade 4-compat_client 5-client-sanity}} | 3 | |
Failure Reason:
"2023-01-30T11:41:06.993817+0000 mgr.y (mgr.14102) 1 : cluster [ERR] Failed to load ceph-mgr modules: prometheus" in cluster log |
pass | 7143800 | 2023-01-30 10:52:06 | 2023-01-30 11:19:23 | 2023-01-30 12:43:49 | 1:24:26 | 1:12:47 | 0:11:39 | smithi | main | ubuntu | 20.04 | fs/fscrypt/{begin/{0-install 1-ceph 2-logrotate} bluestore-bitmap clusters/1-mds-1-client conf/{client mds mon osd} distro/{ubuntu_latest} mount/kclient/{mount-syntax/v1 mount overrides/{distro/testing/k-testing}} overrides/{ignorelist_health ignorelist_health_more ignorelist_wrongly_marked_down pg-warn} tasks/fscrypt-dbench} | 3 | |
fail | 7143801 | 2023-01-30 10:52:07 | 2023-01-30 11:20:03 | 2023-01-30 11:51:11 | 0:31:08 | 0:19:23 | 0:11:45 | smithi | main | centos | 8.stream | fs/upgrade/mds_upgrade_sequence/{bluestore-bitmap centos_8.stream_container_tools conf/{client mds mon osd} fail_fs/no overrides/{ignorelist_health ignorelist_wrongly_marked_down pg-warn syntax} roles tasks/{0-from/v16.2.4 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/yes 3-inline/no 4-verify} 2-client 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}} | 2 | |
Failure Reason:
Command failed on smithi084 with status 22: "sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v16.2.4 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 46ae903e-a093-11ed-9e56-001a4aab830c -e sha1=860d3ac8d012f04a6667eb411e085656deb62181 -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1 --daemon-types mgr'" |
fail | 7143802 | 2023-01-30 10:52:07 | 2023-01-30 11:22:24 | 2023-01-30 12:21:58 | 0:59:34 | 0:47:17 | 0:12:17 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v1} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} ms_mode/secure wsync/no} objectstore-ec/bluestore-comp omap_limit/10 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/3 replication/always} standby-replay tasks/{0-subvolume/{no-subvolume} 1-check-counter 2-scrub/yes 3-snaps/yes 4-flush/no 5-workunit/suites/blogbench}} | 3 | |
Failure Reason:
SELinux denials found on ubuntu@smithi138.front.sepia.ceph.com: ['type=AVC msg=audit(1675078939.178:14788): avc: denied { open } for pid=67597 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675079080.488:17099): avc: denied { lock } for pid=78188 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=524330 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675079080.488:17098): avc: denied { open } for pid=78188 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=524330 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675078938.472:14771): avc: denied { search } for pid=1392 comm=72733A6D61696E20513A526567 name="cephtest" dev="sda1" ino=524293 scontext=system_u:system_r:syslogd_t:s0 tcontext=unconfined_u:object_r:user_home_t:s0 tclass=dir permissive=1', 'type=AVC msg=audit(1675079080.488:17098): avc: denied { read } for pid=78188 comm="rpm" name="Packages" dev="sda1" ino=524330 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675079081.681:17143): avc: denied { read } for pid=78357 comm="rpm" name="Packages" dev="sda1" ino=524330 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675078939.178:14790): avc: denied { map } for pid=67597 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=262275 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675079081.681:17143): avc: denied { open } for pid=78357 
comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=524330 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675079081.681:17144): avc: denied { lock } for pid=78357 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=524330 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675079080.488:17100): avc: denied { map } for pid=78188 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=524340 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675078939.178:14789): avc: denied { lock } for pid=67597 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675078939.178:14788): avc: denied { read } for pid=67597 comm="rpm" name="Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675079079.922:17073): avc: denied { read } for pid=705 comm="sssd" name="resolv.conf" dev="sda1" ino=524376 scontext=system_u:system_r:sssd_t:s0 tcontext=unconfined_u:object_r:user_home_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675079081.681:17145): avc: denied { map } for pid=78357 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=524340 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1'] |
fail | 7143803 | 2023-01-30 10:52:08 | 2023-01-30 11:24:54 | 2023-01-30 12:02:02 | 0:37:08 | 0:27:21 | 0:09:47 | smithi | main | centos | 8.stream | fs/upgrade/featureful_client/old_client/{bluestore-bitmap centos_latest clusters/1-mds-2-client-micro conf/{client mds mon osd} overrides/{ignorelist_health ignorelist_wrongly_marked_down multimds/yes pg-warn} tasks/{0-octopus 1-client 2-upgrade 3-compat_client/quincy}} | 3 | |
Failure Reason:
"2023-01-30T11:50:41.496795+0000 mgr.y (mgr.14105) 1 : cluster [ERR] Failed to load ceph-mgr modules: prometheus" in cluster log |
fail | 7143804 | 2023-01-30 10:52:09 | 2023-01-30 11:25:05 | 2023-01-30 12:11:00 | 0:45:55 | 0:29:18 | 0:16:37 | smithi | main | centos | 8.stream | fs/upgrade/featureful_client/old_client/{bluestore-bitmap centos_latest clusters/1-mds-2-client-micro conf/{client mds mon osd} overrides/{ignorelist_health ignorelist_wrongly_marked_down multimds/yes pg-warn} tasks/{0-octopus 1-client 2-upgrade 3-compat_client/no}} | 3 | |
Failure Reason:
"2023-01-30T11:56:30.139379+0000 mgr.x (mgr.14118) 1 : cluster [ERR] Failed to load ceph-mgr modules: prometheus" in cluster log |
fail | 7143805 | 2023-01-30 10:52:10 | 2023-01-30 11:28:35 | 2023-01-30 12:02:03 | 0:33:28 | 0:15:52 | 0:17:36 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v1} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} ms_mode/crc wsync/no} objectstore-ec/bluestore-bitmap omap_limit/10 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/5 replication/always} standby-replay tasks/{0-subvolume/{with-namespace-isolated} 1-check-counter 2-scrub/yes 3-snaps/yes 4-flush/no 5-workunit/suites/ffsb}} | 3 | |
Failure Reason:
No module named 'tasks.cephadm' |
fail | 7143806 | 2023-01-30 10:52:11 | 2023-01-30 11:36:57 | 2023-01-30 12:02:41 | 0:25:44 | 0:12:04 | 0:13:40 | smithi | main | centos | 8.stream | fs/upgrade/mds_upgrade_sequence/{bluestore-bitmap centos_8.stream_container_tools conf/{client mds mon osd} fail_fs/no overrides/{ignorelist_health ignorelist_wrongly_marked_down pg-warn syntax} roles tasks/{0-from/v16.2.4 1-volume/{0-create 1-ranks/2 2-allow_standby_replay/no 3-inline/no 4-verify} 2-client 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}} | 2 | |
Failure Reason:
No module named 'tasks.cephadm' |
fail | 7143807 | 2023-01-30 10:52:12 | 2023-01-30 11:38:57 | 2023-01-30 12:04:52 | 0:25:55 | 0:13:57 | 0:11:58 | smithi | main | centos | 8.stream | fs/upgrade/featureful_client/upgraded_client/{bluestore-bitmap centos_latest clusters/1-mds-2-client-micro conf/{client mds mon osd} overrides/{ignorelist_health ignorelist_wrongly_marked_down multimds/yes pg-warn} tasks/{0-octopus 1-client 2-upgrade 3-client-upgrade 4-compat_client 5-client-sanity}} | 3 | |
Failure Reason:
No module named 'tasks.mds_pre_upgrade' |
fail | 7143808 | 2023-01-30 10:52:13 | 2023-01-30 11:39:28 | 2023-01-30 12:34:32 | 0:55:04 | 0:42:00 | 0:13:04 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v1} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} ms_mode/secure wsync/no} objectstore-ec/bluestore-comp omap_limit/10 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/1 standby-replay tasks/{0-subvolume/{with-quota} 1-check-counter 2-scrub/yes 3-snaps/yes 4-flush/no 5-workunit/suites/fsstress}} | 3 | |
Failure Reason:
SELinux denials found on ubuntu@smithi008.front.sepia.ceph.com: ['type=AVC msg=audit(1675079952.190:14784): avc: denied { read } for pid=67555 comm="rpm" name="Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675079952.190:14785): avc: denied { lock } for pid=67555 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080098.114:17099): avc: denied { lock } for pid=78161 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=524339 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675079951.438:14767): avc: denied { search } for pid=1354 comm=72733A6D61696E20513A526567 name="cephtest" dev="sda1" ino=524293 scontext=system_u:system_r:syslogd_t:s0 tcontext=unconfined_u:object_r:user_home_t:s0 tclass=dir permissive=1', 'type=AVC msg=audit(1675080098.114:17098): avc: denied { read } for pid=78161 comm="rpm" name="Packages" dev="sda1" ino=524339 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080097.538:17071): avc: denied { read } for pid=649 comm="sssd" name="resolv.conf" dev="sda1" ino=524376 scontext=system_u:system_r:sssd_t:s0 tcontext=unconfined_u:object_r:user_home_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675079952.190:14784): avc: denied { open } for pid=67555 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675079952.190:14786): avc: denied { map } for pid=67555 comm="rpm" 
path="/var/lib/rpm/Name" dev="sda1" ino=262275 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080098.114:17098): avc: denied { open } for pid=78161 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=524339 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080098.114:17100): avc: denied { map } for pid=78161 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=524340 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1'] |
fail | 7143809 | 2023-01-30 10:52:13 | 2023-01-30 11:41:48 | 2023-01-30 12:04:29 | 0:22:41 | 0:09:03 | 0:13:38 | smithi | main | centos | 8.stream | fs/upgrade/featureful_client/old_client/{bluestore-bitmap centos_latest clusters/1-mds-2-client-micro conf/{client mds mon osd} overrides/{ignorelist_health ignorelist_wrongly_marked_down multimds/no pg-warn} tasks/{0-octopus 1-client 2-upgrade 3-compat_client/quincy}} | 3 | |
Failure Reason:
Command failed on smithi111 with status 1: 'sudo yum -y install cephfs-java' |
fail | 7143810 | 2023-01-30 10:52:14 | 2023-01-30 11:42:59 | 2023-01-30 12:02:56 | 0:19:57 | 0:08:43 | 0:11:14 | smithi | main | ubuntu | 20.04 | fs/functional/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-4c-client conf/{client mds mon osd} distro/{ubuntu_latest} mount/fuse objectstore/bluestore-bitmap overrides/{ignorelist_health ignorelist_wrongly_marked_down no_client_pidfile} tasks/pool-perm} | 2 | |
Failure Reason:
No module named 'tasks.ceph' |
fail | 7143811 | 2023-01-30 10:52:15 | 2023-01-30 11:42:59 | 2023-01-30 12:02:56 | 0:19:57 | 0:08:46 | 0:11:11 | smithi | main | ubuntu | 20.04 | fs/libcephfs/{begin/{0-install 1-ceph 2-logrotate} clusters/1-mds-1-client-coloc conf/{client mds mon osd} distro/{ubuntu_latest} objectstore/bluestore-bitmap overrides/{ignorelist_health ignorelist_wrongly_marked_down no_client_pidfile} tasks/libcephfs/{frag test}} | 2 | |
Failure Reason:
No module named 'tasks.ceph' |
fail | 7143812 | 2023-01-30 10:52:16 | 2023-01-30 11:43:00 | 2023-01-30 13:13:44 | 1:30:44 | 1:12:40 | 0:18:04 | smithi | main | rhel | 8.6 | fs/snaps/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-1c-client conf/{client mds mon osd} distro/{rhel_8} mount/fuse objectstore-ec/bluestore-ec-root overrides/{ignorelist_health ignorelist_wrongly_marked_down} tasks/workunit/snaps} | 2 | |
Failure Reason:
SELinux denials found on ubuntu@smithi084.front.sepia.ceph.com: ['type=AVC msg=audit(1675079981.842:208): avc: denied { lock } for pid=1508 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080797.120:20177): avc: denied { lock } for pid=108178 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=524362 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080426.638:17114): avc: denied { read } for pid=78240 comm="rpm" name="Packages" dev="sda1" ino=524330 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080426.256:17083): avc: denied { read } for pid=732 comm="sssd" name="resolv.conf" dev="sda1" ino=524376 scontext=system_u:system_r:sssd_t:s0 tcontext=unconfined_u:object_r:user_home_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080317.674:14812): avc: denied { map } for pid=67709 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=262275 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080797.658:20181): avc: denied { lock } for pid=108332 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=524330 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080426.638:17114): avc: denied { open } for pid=78240 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=524330 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080797.658:20180): avc: denied { read } for pid=108332 comm="rpm" 
name="Packages" dev="sda1" ino=524330 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080426.638:17115): avc: denied { lock } for pid=78240 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=524330 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080797.119:20176): avc: denied { read write } for pid=108178 comm="rhsmcertd-worke" name=".dbenv.lock" dev="sda1" ino=524362 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675079981.842:207): avc: denied { open } for pid=1508 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080797.658:20182): avc: denied { map } for pid=108332 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=524340 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080426.639:17116): avc: denied { map } for pid=78240 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=524340 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080317.673:14810): avc: denied { open } for pid=67709 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080317.673:14811): avc: denied { lock } for pid=67709 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 
tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080797.119:20176): avc: denied { open } for pid=108178 comm="rhsmcertd-worke" path="/var/lib/rpm/.dbenv.lock" dev="sda1" ino=524362 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080797.120:20178): avc: denied { getattr } for pid=108178 comm="rhsmcertd-worke" path="/var/lib/rpm/__db.001" dev="sda1" ino=524363 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675082643.074:20705): avc: denied { search } for pid=1426 comm=72733A6D61696E20513A526567 name="cephtest" dev="sda1" ino=524301 scontext=system_u:system_r:syslogd_t:s0 tcontext=unconfined_u:object_r:user_home_t:s0 tclass=dir permissive=1', 'type=AVC msg=audit(1675082646.932:20706): avc: denied { read } for pid=111966 comm="rpm" name="Packages" dev="sda1" ino=524330 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675082646.932:20706): avc: denied { open } for pid=111966 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=524330 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675082646.932:20707): avc: denied { lock } for pid=111966 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=524330 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675079981.842:207): avc: denied { read } for pid=1508 comm="rpm" name="Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080797.120:20179): avc: denied { map } for 
pid=108178 comm="rhsmcertd-worke" path="/var/lib/rpm/__db.001" dev="sda1" ino=524363 scontext=system_u:system_r:rhsmcertd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675079981.842:209): avc: denied { map } for pid=1508 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=262275 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080317.673:14810): avc: denied { read } for pid=67709 comm="rpm" name="Packages" dev="sda1" ino=262274 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080317.041:14781): avc: denied { search } for pid=1426 comm=72733A6D61696E20513A526567 name="cephtest" dev="sda1" ino=524301 scontext=system_u:system_r:syslogd_t:s0 tcontext=unconfined_u:object_r:user_home_t:s0 tclass=dir permissive=1', 'type=AVC msg=audit(1675080797.658:20180): avc: denied { open } for pid=108332 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=524330 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675082646.932:20708): avc: denied { map } for pid=111966 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=524340 scontext=system_u:system_r:setroubleshootd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1'] |
||||||||||||||
fail | 7143813 | 2023-01-30 10:52:17 | 2023-01-30 11:51:21 | 2023-01-30 12:20:17 | 0:28:56 | 0:09:59 | 0:18:57 | smithi | main | centos | 8.stream | fs/upgrade/featureful_client/old_client/{bluestore-bitmap centos_latest clusters/1-mds-2-client-micro conf/{client mds mon osd} overrides/{ignorelist_health ignorelist_wrongly_marked_down multimds/no pg-warn} tasks/{0-octopus 1-client 2-upgrade 3-compat_client/no}} | 3 | |
Failure Reason:
No module named 'tasks' |
||||||||||||||
fail | 7143814 | 2023-01-30 10:52:18 | 2023-01-30 11:57:02 | 2023-01-30 12:21:50 | 0:24:48 | 0:11:52 | 0:12:56 | smithi | main | centos | 8.stream | fs/upgrade/mds_upgrade_sequence/{bluestore-bitmap centos_8.stream_container_tools conf/{client mds mon osd} fail_fs/no overrides/{ignorelist_health ignorelist_wrongly_marked_down pg-warn syntax} roles tasks/{0-from/v16.2.4 1-volume/{0-create 1-ranks/2 2-allow_standby_replay/yes 3-inline/no 4-verify} 2-client 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}} | 2 | |
Failure Reason:
No module named 'tasks' |
||||||||||||||
fail | 7143815 | 2023-01-30 10:52:19 | 2023-01-30 11:57:02 | 2023-01-30 12:35:25 | 0:38:23 | 0:21:14 | 0:17:09 | smithi | main | centos | 8.stream | fs/upgrade/featureful_client/upgraded_client/{bluestore-bitmap centos_latest clusters/1-mds-2-client-micro conf/{client mds mon osd} overrides/{ignorelist_health ignorelist_wrongly_marked_down multimds/no pg-warn} tasks/{0-octopus 1-client 2-upgrade 3-client-upgrade 4-compat_client 5-client-sanity}} | 3 | |
Failure Reason:
"2023-01-30T12:26:08.004329+0000 mgr.y (mgr.14099) 1 : cluster [ERR] Failed to load ceph-mgr modules: prometheus" in cluster log |
||||||||||||||
fail | 7143816 | 2023-01-30 10:52:20 | 2023-01-30 12:02:36 | 2023-01-30 18:50:34 | 6:47:58 | 6:37:21 | 0:10:37 | smithi | main | ubuntu | 20.04 | fs/verify/{begin/{0-install 1-ceph 2-logrotate} clusters/1a5s-mds-1c-client conf/{client mds mon osd} distro/{ubuntu/{latest overrides}} mount/fuse objectstore-ec/bluestore-comp-ec-root overrides/{ignorelist_health ignorelist_wrongly_marked_down mon-debug session_timeout} ranks/5 tasks/fsstress validater/valgrind} | 2 | |
Failure Reason:
Command failed (workunit test suites/fsstress.sh) on smithi042 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=860d3ac8d012f04a6667eb411e085656deb62181 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/fsstress.sh' |
||||||||||||||
fail | 7143817 | 2023-01-30 10:52:20 | 2023-01-30 12:02:36 | 2023-01-30 12:52:38 | 0:50:02 | 0:38:02 | 0:12:00 | smithi | main | centos | 8.stream | fs/upgrade/mds_upgrade_sequence/{bluestore-bitmap centos_8.stream_container_tools conf/{client mds mon osd} fail_fs/yes overrides/{ignorelist_health ignorelist_wrongly_marked_down pg-warn syntax} roles tasks/{0-from/pacific 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/no 3-inline/yes 4-verify} 2-client 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}} | 2 | |
Failure Reason:
SELinux denials found on ubuntu@smithi039.front.sepia.ceph.com: ['type=AVC msg=audit(1675081304.453:16992): avc: denied { open } for pid=89236 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=524323 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080862.427:214): avc: denied { map } for pid=1586 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=98535 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675081163.460:14035): avc: denied { read } for pid=72336 comm="rpm" name="Packages" dev="sda1" ino=98534 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080862.405:212): avc: denied { open } for pid=1586 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=98534 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675081162.726:14032): avc: denied { search } for pid=1473 comm=72733A6D61696E20513A526567 name="cephtest" dev="sda1" ino=524301 scontext=system_u:system_r:syslogd_t:s0 tcontext=unconfined_u:object_r:user_home_t:s0 tclass=dir permissive=1', 'type=AVC msg=audit(1675081304.453:16994): avc: denied { map } for pid=89236 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=524324 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080862.405:212): avc: denied { read } for pid=1586 comm="rpm" name="Packages" dev="sda1" ino=98534 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675081163.460:14036): avc: denied { lock } for pid=72336 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=98534 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675081304.058:16961): avc: denied { read } for pid=711 comm="sssd" name="resolv.conf" dev="sda1" ino=524363 scontext=system_u:system_r:sssd_t:s0 tcontext=unconfined_u:object_r:user_home_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675081163.460:14037): avc: denied { map } for pid=72336 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=98535 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080862.413:213): avc: denied { lock } for pid=1586 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=98534 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675081163.460:14035): avc: denied { open } for pid=72336 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=98534 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675081304.453:16992): avc: denied { read } for pid=89236 comm="rpm" name="Packages" dev="sda1" ino=524323 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675081304.453:16993): avc: denied { lock } for pid=89236 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=524323 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1'] |
||||||||||||||
fail | 7143818 | 2023-01-30 10:52:21 | 2023-01-30 12:02:46 | 2023-01-30 12:42:00 | 0:39:14 | 0:27:53 | 0:11:21 | smithi | main | centos | 8.stream | fs/upgrade/featureful_client/old_client/{bluestore-bitmap centos_latest clusters/1-mds-2-client-micro conf/{client mds mon osd} overrides/{ignorelist_health ignorelist_wrongly_marked_down multimds/yes pg-warn} tasks/{0-octopus 1-client 2-upgrade 3-compat_client/quincy}} | 3 | |
Failure Reason:
"2023-01-30T12:30:00.146770+0000 mgr.y (mgr.14107) 1 : cluster [ERR] Failed to load ceph-mgr modules: prometheus" in cluster log |
||||||||||||||
fail | 7143819 | 2023-01-30 10:52:22 | 2023-01-30 12:02:57 | 2023-01-30 12:29:50 | 0:26:53 | 0:14:49 | 0:12:04 | smithi | main | centos | 8.stream | fs/functional/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-4c-client conf/{client mds mon osd} distro/{centos_8} mount/fuse objectstore/bluestore-bitmap overrides/{ignorelist_health ignorelist_wrongly_marked_down no_client_pidfile} tasks/workunit/quota} | 2 | |
Failure Reason:
Command failed (workunit test fs/quota/quota.sh) on smithi097 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.3/client.3/tmp && cd -- /home/ubuntu/cephtest/mnt.3/client.3/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=860d3ac8d012f04a6667eb411e085656deb62181 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="3" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.3 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.3 CEPH_MNT=/home/ubuntu/cephtest/mnt.3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.3/qa/workunits/fs/quota/quota.sh' |
||||||||||||||
fail | 7143820 | 2023-01-30 10:52:23 | 2023-01-30 12:02:57 | 2023-01-30 12:55:32 | 0:52:35 | 0:39:10 | 0:13:25 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v2} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} ms_mode/crc wsync/no} objectstore-ec/bluestore-bitmap omap_limit/10 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/3 replication/always} standby-replay tasks/{0-subvolume/{with-no-extra-options} 1-check-counter 2-scrub/yes 3-snaps/yes 4-flush/no 5-workunit/suites/pjd}} | 3 | |
Failure Reason:
Command failed (workunit test suites/pjd.sh) on smithi111 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=860d3ac8d012f04a6667eb411e085656deb62181 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh' |
||||||||||||||
fail | 7143821 | 2023-01-30 10:52:24 | 2023-01-30 12:04:37 | 2023-01-30 12:31:06 | 0:26:29 | 0:16:23 | 0:10:06 | smithi | main | centos | 8.stream | fs/permission/{begin/{0-install 1-ceph 2-logrotate} clusters/fixed-2-ucephfs conf/{client mds mon osd} distro/{centos_8} mount/fuse objectstore-ec/bluestore-comp overrides/{ignorelist_health ignorelist_wrongly_marked_down} tasks/cfuse_workunit_suites_pjd} | 2 | |
Failure Reason:
SELinux denials found on ubuntu@smithi016.front.sepia.ceph.com: ['type=AVC msg=audit(1675081147.594:14034): avc: denied { open } for pid=72289 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=98534 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080844.306:214): avc: denied { map } for pid=1563 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=98535 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080844.288:212): avc: denied { read } for pid=1563 comm="rpm" name="Packages" dev="sda1" ino=98534 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080844.288:212): avc: denied { open } for pid=1563 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=98534 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675080844.298:213): avc: denied { lock } for pid=1563 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=98534 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675081147.594:14036): avc: denied { map } for pid=72289 comm="rpm" path="/var/lib/rpm/Name" dev="sda1" ino=98535 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675081147.594:14034): avc: denied { read } for pid=72289 comm="rpm" name="Packages" dev="sda1" ino=98534 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675081147.594:14035): avc: denied { lock } for pid=72289 comm="rpm" path="/var/lib/rpm/Packages" dev="sda1" ino=98534 scontext=system_u:system_r:setroubleshootd_t:s0 tcontext=unconfined_u:object_r:var_lib_t:s0 tclass=file permissive=1', 'type=AVC msg=audit(1675081146.931:14031): avc: denied { search } for pid=1450 comm=72733A6D61696E20513A526567 name="cephtest" dev="sda1" ino=524293 scontext=system_u:system_r:syslogd_t:s0 tcontext=unconfined_u:object_r:user_home_t:s0 tclass=dir permissive=1'] |
||||||||||||||
fail | 7143822 | 2023-01-30 10:52:25 | 2023-01-30 12:04:38 | 2023-01-30 12:53:28 | 0:48:50 | 0:29:41 | 0:19:09 | smithi | main | centos | 8.stream | fs/upgrade/featureful_client/old_client/{bluestore-bitmap centos_latest clusters/1-mds-2-client-micro conf/{client mds mon osd} overrides/{ignorelist_health ignorelist_wrongly_marked_down multimds/yes pg-warn} tasks/{0-octopus 1-client 2-upgrade 3-compat_client/no}} | 3 | |
Failure Reason:
"2023-01-30T12:39:42.689436+0000 mgr.y (mgr.14131) 1 : cluster [ERR] Failed to load ceph-mgr modules: prometheus" in cluster log |
||||||||||||||
fail | 7143823 | 2023-01-30 10:52:26 | 2023-01-30 12:11:09 | 2023-01-30 12:51:38 | 0:40:29 | 0:19:31 | 0:20:58 | smithi | main | centos | 8.stream | fs/upgrade/mds_upgrade_sequence/{bluestore-bitmap centos_8.stream_container_tools conf/{client mds mon osd} fail_fs/no overrides/{ignorelist_health ignorelist_wrongly_marked_down pg-warn syntax} roles tasks/{0-from/v16.2.4 1-volume/{0-create 1-ranks/2 2-allow_standby_replay/yes 3-inline/yes 4-verify} 2-client 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}} | 2 | |
Failure Reason:
Command failed on smithi150 with status 22: "sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v16.2.4 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 74f1204e-a09b-11ed-9e56-001a4aab830c -e sha1=860d3ac8d012f04a6667eb411e085656deb62181 -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1 --daemon-types mgr'" |
||||||||||||||
fail | 7143824 | 2023-01-30 10:52:27 | 2023-01-30 12:20:21 | 2023-01-30 14:44:31 | 2:24:10 | 2:13:33 | 0:10:37 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v1} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} ms_mode/secure wsync/no} objectstore-ec/bluestore-comp omap_limit/10 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/5 replication/always} standby-replay tasks/{0-subvolume/{with-namespace-isolated} 1-check-counter 2-scrub/yes 3-snaps/yes 4-flush/no 5-workunit/fs/misc}} | 3 | |
Failure Reason:
error during scrub thrashing: reached maximum tries (30) after waiting for 900 seconds |
||||||||||||||
fail | 7143825 | 2023-01-30 10:52:28 | 2023-01-30 12:21:51 | 2023-01-30 12:54:59 | 0:33:08 | 0:21:44 | 0:11:24 | smithi | main | centos | 8.stream | fs/upgrade/featureful_client/upgraded_client/{bluestore-bitmap centos_latest clusters/1-mds-2-client-micro conf/{client mds mon osd} overrides/{ignorelist_health ignorelist_wrongly_marked_down multimds/yes pg-warn} tasks/{0-octopus 1-client 2-upgrade 3-client-upgrade 4-compat_client 5-client-sanity}} | 3 | |
Failure Reason:
"2023-01-30T12:44:19.891748+0000 mgr.y (mgr.14100) 1 : cluster [ERR] Failed to load ceph-mgr modules: prometheus" in cluster log |