Status  Job ID   Posted      Started     Updated     Runtime  Duration  In Waiting  Machine  Teuthology Branch  OS Type  OS Version  Description  Nodes
pass 7204098 2023-03-11 16:28:35 2023-03-11 16:34:59 2023-03-11 17:14:58 0:39:59 0:28:37 0:11:22 smithi main centos 8.stream rados/mgr/{clusters/{2-node-mgr} debug/mgr mgr_ttl_cache/disable mon_election/connectivity random-objectstore$/{bluestore-stupid} supported-random-distro$/{centos_8} tasks/progress} 2
fail 7204099 2023-03-11 16:28:36 2023-03-11 16:35:00 2023-03-11 17:05:51 0:30:51 0:25:30 0:05:21 smithi main rhel 8.4 rados/cephadm/workunits/{0-distro/rhel_8.4_container_tools_rhel8 agent/on mon_election/connectivity task/test_nfs} 1
Failure Reason: Test failure: test_non_existent_cluster (tasks.cephfs.test_nfs.TestNFS)

fail 7204100 2023-03-11 16:28:37 2023-03-11 16:35:00 2023-03-11 17:07:04 0:32:04 0:18:57 0:13:07 smithi main centos 8.stream rados/dashboard/{0-single-container-host debug/mgr mon_election/classic random-objectstore$/{bluestore-stupid} tasks/e2e} 2
Failure Reason: Command failed (workunit test cephadm/test_dashboard_e2e.sh) on smithi027 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=dc4de666ef24290c5fe6309819b462bac69aab07 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/test_dashboard_e2e.sh'

fail 7204101 2023-03-11 16:28:38 2023-03-11 16:36:21 2023-03-11 16:55:30 0:19:09 0:06:11 0:12:58 smithi main ubuntu 20.04 rados/rook/smoke/{0-distro/ubuntu_20.04 0-kubeadm 0-nvme-loop 1-rook 2-workload/radosbench 3-final cluster/1-node k8s/1.21 net/host rook/master} 1
Failure Reason: Command failed on smithi038 with status 1: 'sudo systemctl enable --now kubelet && sudo kubeadm config images pull'

fail 7204102 2023-03-11 16:28:38 2023-03-11 16:39:41 2023-03-11 16:54:50 0:15:09 0:06:07 0:09:02 smithi main ubuntu 20.04 rados/rook/smoke/{0-distro/ubuntu_20.04 0-kubeadm 0-nvme-loop 1-rook 2-workload/radosbench 3-final cluster/1-node k8s/1.21 net/calico rook/1.7.2} 1
Failure Reason: Command failed on smithi053 with status 1: 'sudo systemctl enable --now kubelet && sudo kubeadm config images pull'

pass 7204103 2023-03-11 16:28:39 2023-03-11 16:39:42 2023-03-11 17:16:15 0:36:33 0:23:54 0:12:39 smithi main ubuntu 20.04 rados/singleton/{all/radostool mon_election/classic msgr-failures/many msgr/async-v1only objectstore/bluestore-comp-lz4 rados supported-random-distro$/{ubuntu_latest}} 1
fail 7204104 2023-03-11 16:28:40 2023-03-11 16:41:42 2023-03-11 17:21:27 0:39:45 0:30:11 0:09:34 smithi main ubuntu 20.04 rados/cephadm/workunits/{0-distro/ubuntu_20.04 agent/on mon_election/connectivity task/test_nfs} 1
Failure Reason: Test failure: test_non_existent_cluster (tasks.cephfs.test_nfs.TestNFS)

fail 7204105 2023-03-11 16:28:41 2023-03-11 16:41:43 2023-03-11 17:04:09 0:22:26 0:12:43 0:09:43 smithi main rhel 8.4 rados/singleton/{all/test_envlibrados_for_rocksdb mon_election/connectivity msgr-failures/none msgr/async-v2only objectstore/filestore-xfs rados supported-random-distro$/{rhel_8}} 1
Failure Reason: Command failed (workunit test rados/test_envlibrados_for_rocksdb.sh) on smithi143 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=dc4de666ef24290c5fe6309819b462bac69aab07 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rados/test_envlibrados_for_rocksdb.sh'

fail 7204106 2023-03-11 16:28:41 2023-03-11 16:45:14 2023-03-11 17:14:38 0:29:24 0:19:22 0:10:02 smithi main centos 8.stream rados/dashboard/{0-single-container-host debug/mgr mon_election/connectivity random-objectstore$/{bluestore-bitmap} tasks/e2e} 2
Failure Reason: Command failed (workunit test cephadm/test_dashboard_e2e.sh) on smithi082 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=dc4de666ef24290c5fe6309819b462bac69aab07 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/test_dashboard_e2e.sh'

fail 7204107 2023-03-11 16:28:42 2023-03-11 16:45:14 2023-03-11 17:03:37 0:18:23 0:06:26 0:11:57 smithi main ubuntu 20.04 rados/rook/smoke/{0-distro/ubuntu_20.04 0-kubeadm 0-nvme-loop 1-rook 2-workload/none 3-final cluster/3-node k8s/1.21 net/flannel rook/master} 3
Failure Reason: Command failed on smithi061 with status 1: 'sudo systemctl enable --now kubelet && sudo kubeadm config images pull'

fail 7204108 2023-03-11 16:28:43 2023-03-11 16:47:15 2023-03-11 17:03:07 0:15:52 0:06:13 0:09:39 smithi main ubuntu 20.04 rados/rook/smoke/{0-distro/ubuntu_20.04 0-kubeadm 0-nvme-loop 1-rook 2-workload/radosbench 3-final cluster/1-node k8s/1.21 net/host rook/1.7.2} 1
Failure Reason: Command failed on smithi005 with status 1: 'sudo systemctl enable --now kubelet && sudo kubeadm config images pull'

fail 7204109 2023-03-11 16:28:44 2023-03-11 16:47:15 2023-03-11 17:21:53 0:34:38 0:24:29 0:10:09 smithi main centos 8.stream rados/cephadm/workunits/{0-distro/centos_8.stream_container_tools agent/on mon_election/connectivity task/test_nfs} 1
Failure Reason: Test failure: test_non_existent_cluster (tasks.cephfs.test_nfs.TestNFS)