User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Pass | Fail |
---|---|---|---|---|---|---|---|---|---|---|
pdhange | 2023-03-30 20:20:20 | 2023-03-30 20:25:12 | 2023-03-30 21:46:52 | 1:21:40 | rados | wip-pdhange-testing | smithi | 7e52894 | 4 | 7 |
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
pass | 7227349 | 2023-03-30 20:21:24 | 2023-03-30 20:25:12 | 2023-03-30 20:52:42 | 0:27:30 | 0:16:25 | 0:11:05 | smithi | main | centos | 8.stream | rados/cephadm/workunits/{0-distro/centos_8.stream_container_tools agent/on mon_election/connectivity task/test_orch_cli} | 1 |
fail | 7227350 | 2023-03-30 20:21:26 | 2023-03-30 20:27:43 | 2023-03-30 20:43:26 | 0:15:43 | 0:06:24 | 0:09:19 | smithi | main | ubuntu | 20.04 | rados/rook/smoke/{0-distro/ubuntu_20.04 0-kubeadm 0-nvme-loop 1-rook 2-workload/radosbench cluster/1-node k8s/1.21 net/calico rook/1.7.2} | 1 |
Failure Reason: Command failed on smithi031 with status 1: 'sudo systemctl enable --now kubelet && sudo kubeadm config images pull'
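All four rook/smoke failures in this run die at the same kubeadm image-pull step. A minimal sketch for re-running that step by hand on the affected node, assuming interactive ssh access to the smithi host; the two commands are taken verbatim from the failure reason, while the `-v=5` verbosity flag is an addition for diagnosis:

```bash
# Assumption: ssh access to the test node; the user@host pattern follows the
# SELinux failure entry elsewhere in this run.
ssh ubuntu@smithi031.front.sepia.ceph.com

# Exact commands from the failure reason:
sudo systemctl enable --now kubelet
# -v=5 raises kubeadm's log verbosity to expose the registry/network error
# hidden behind the generic "status 1".
sudo kubeadm config images pull -v=5
```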
pass | 7227351 | 2023-03-30 20:21:27 | 2023-03-30 20:27:43 | 2023-03-30 20:48:04 | 0:20:21 | 0:13:38 | 0:06:43 | smithi | main | rhel | 8.6 | rados/cephadm/smoke-singlehost/{0-random-distro$/{rhel_8.6_container_tools_rhel8} 1-start 2-services/basic 3-final} | 1 |
fail | 7227352 | 2023-03-30 20:21:28 | 2023-03-30 20:28:44 | 2023-03-30 20:48:56 | 0:20:12 | 0:06:33 | 0:13:39 | smithi | main | ubuntu | 20.04 | rados/rook/smoke/{0-distro/ubuntu_20.04 0-kubeadm 0-nvme-loop 1-rook 2-workload/none cluster/3-node k8s/1.21 net/flannel rook/master} | 3 |
Failure Reason: Command failed on smithi027 with status 1: 'sudo systemctl enable --now kubelet && sudo kubeadm config images pull'
fail | 7227353 | 2023-03-30 20:21:29 | 2023-03-30 20:29:04 | 2023-03-30 20:56:52 | 0:27:48 | 0:17:41 | 0:10:07 | smithi | main | centos | 8.stream | rados/cephadm/workunits/{0-distro/centos_8.stream_container_tools agent/on mon_election/connectivity task/test_cephadm} | 1 |
Failure Reason: SELinux denials found on ubuntu@smithi121.front.sepia.ceph.com: ['type=AVC msg=audit(1680209650.547:19731): avc: denied { ioctl } for pid=125736 comm="iptables" path="/var/lib/containers/storage/overlay/dab3c329b6b7795936cbcf776136031bf14224cf67d9abfb45de81df82ae300f/merged" dev="overlay" ino=3409326 scontext=system_u:system_r:iptables_t:s0 tcontext=system_u:object_r:container_file_t:s0:c1022,c1023 tclass=dir permissive=1']
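A minimal sketch for digging into the reported denial on the node, assuming the standard audit tooling (`ausearch`, `audit2allow`) is present on the CentOS host; the filters shown are common usage, not taken from this run:

```bash
# List recent AVC denials recorded by auditd.
sudo ausearch -m avc -ts recent

# Narrow to the 'iptables' comm seen in the denial above.
sudo ausearch -m avc -c iptables

# Diagnostic only: summarize what a local policy module would need to allow.
sudo ausearch -m avc -c iptables | audit2allow
```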
pass | 7227354 | 2023-03-30 20:21:30 | 2023-03-30 20:29:45 | 2023-03-30 21:46:52 | 1:17:07 | 1:06:54 | 0:10:13 | smithi | main | centos | 8.stream | rados/dashboard/{0-single-container-host debug/mgr mon_election/connectivity random-objectstore$/{bluestore-bitmap} tasks/dashboard} | 2 |
fail | 7227355 | 2023-03-30 20:21:31 | 2023-03-30 20:29:45 | 2023-03-30 20:45:28 | 0:15:43 | 0:06:27 | 0:09:16 | smithi | main | ubuntu | 20.04 | rados/rook/smoke/{0-distro/ubuntu_20.04 0-kubeadm 0-nvme-loop 1-rook 2-workload/radosbench cluster/1-node k8s/1.21 net/host rook/1.7.2} | 1 |
Failure Reason: Command failed on smithi053 with status 1: 'sudo systemctl enable --now kubelet && sudo kubeadm config images pull'
fail | 7227356 | 2023-03-30 20:21:32 | 2023-03-30 20:29:45 | 2023-03-30 20:53:07 | 0:23:22 | 0:13:10 | 0:10:12 | smithi | main | centos | 8.stream | rados/singleton/{all/test_envlibrados_for_rocksdb/{supported/rhel_latest test_envlibrados_for_rocksdb} mon_election/connectivity msgr-failures/few msgr/async-v2only objectstore/bluestore-comp-zstd rados supported-random-distro$/{centos_8}} | 1 |
Failure Reason: Command failed (workunit test rados/test_envlibrados_for_rocksdb.sh) on smithi151 with status 2: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=7e52894f684e282bdfc056afaa4d8dee2fada30e TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rados/test_envlibrados_for_rocksdb.sh'
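The harness invocation above can be trimmed for a manual re-run inside the test clone; a sketch, assuming the clone and mount paths from the failure reason still exist on the node, with the coverage/ulimit/timeout wrappers dropped:

```bash
# Manual re-run of the failing workunit. Env vars are copied from the harness
# command above; adjust-ulimits / ceph-coverage / timeout are omitted, so a
# failure here points at the script itself rather than the wrappers.
cd /home/ubuntu/cephtest/clone.client.0
CEPH_CLI_TEST_DUP_COMMAND=1 \
CEPH_REF=7e52894f684e282bdfc056afaa4d8dee2fada30e \
TESTDIR="/home/ubuntu/cephtest" \
CEPH_ARGS="--cluster ceph" \
CEPH_ID="0" \
CEPH_MNT=/home/ubuntu/cephtest/mnt.0 \
  bash qa/workunits/rados/test_envlibrados_for_rocksdb.sh
```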
pass | 7227357 | 2023-03-30 20:21:33 | 2023-03-30 20:29:56 | 2023-03-30 21:06:50 | 0:36:54 | 0:24:52 | 0:12:02 | smithi | main | ubuntu | 20.04 | rados/cephadm/smoke/{0-distro/ubuntu_20.04 0-nvme-loop agent/on fixed-2 mon_election/connectivity start} | 2 |
fail | 7227358 | 2023-03-30 20:21:34 | 2023-03-30 20:29:56 | 2023-03-30 21:07:34 | 0:37:38 | 0:26:52 | 0:10:46 | smithi | main | centos | 8.stream | rados/dashboard/{0-single-container-host debug/mgr mon_election/classic random-objectstore$/{bluestore-comp-lz4} tasks/e2e} | 2 |
Failure Reason: Command failed (workunit test cephadm/test_dashboard_e2e.sh) on smithi098 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=7e52894f684e282bdfc056afaa4d8dee2fada30e TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/test_dashboard_e2e.sh'
fail | 7227359 | 2023-03-30 20:21:35 | 2023-03-30 20:30:06 | 2023-03-30 20:50:34 | 0:20:28 | 0:06:33 | 0:13:55 | smithi | main | ubuntu | 20.04 | rados/rook/smoke/{0-distro/ubuntu_20.04 0-kubeadm 0-nvme-loop 1-rook 2-workload/none cluster/3-node k8s/1.21 net/calico rook/master} | 3 |
Failure Reason: Command failed on smithi018 with status 1: 'sudo systemctl enable --now kubelet && sudo kubeadm config images pull'