User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Pass | Fail | Dead
---|---|---|---|---|---|---|---|---|---|---|---
rishabh | 2023-05-17 11:09:48 | | 2023-05-18 01:48:04 | | fs | wip-rishabh-2023May15-1524 | smithi | 27be512 | 25 | 40 | 5
Status | Job ID | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes
---|---|---|---|---|---|---|---|---|---|---|---|---|---
fail | 7276563 | 2023-05-17 11:10:10 | 2023-05-17 11:10:24 | 2023-05-17 12:29:34 | 1:19:10 | 0:55:40 | 0:23:30 | smithi | main | ubuntu | 22.04 | fs/functional/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-4c-client conf/{client mds mon osd} distro/{ubuntu_latest} mount/kclient/{mount-syntax/{v1} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} objectstore/bluestore-ec-root overrides/{ignorelist_health ignorelist_wrongly_marked_down no_client_pidfile} tasks/snap-schedule} | 2 | |
Failure Reason: "1684324202.6427338 mon.a (mon.0) 404 : cluster [WRN] Health check failed: Reduced data availability: 2 pgs peering (PG_AVAILABILITY)" in cluster log
pass | 7276564 | 2023-05-17 11:10:11 | 2023-05-17 11:10:25 | 2023-05-17 11:56:37 | 0:46:12 | 0:22:48 | 0:23:24 | smithi | main | ubuntu | 20.04 | fs/thrash/workloads/{begin/{0-install 1-ceph 2-logrotate} clusters/1a5s-mds-1c-client conf/{client mds mon osd} distro/{ubuntu_20.04} mount/fuse msgr-failures/none objectstore-ec/bluestore-ec-root overrides/{frag ignorelist_health ignorelist_wrongly_marked_down prefetch_dirfrags/yes prefetch_entire_dirfrags/no races session_timeout thrashosds-health} ranks/3 tasks/{1-thrash/mon 2-workunit/fs/trivial_sync}} | 2 | |
fail | 7276565 | 2023-05-17 11:10:11 | 2023-05-17 11:10:25 | 2023-05-17 18:13:38 | 7:03:13 | 6:44:19 | 0:18:54 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/fuse objectstore-ec/bluestore-comp-ec-root omap_limit/10000 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/1 standby-replay tasks/{0-subvolume/{with-namespace-isolated-and-quota} 1-check-counter 2-scrub/no 3-snaps/yes 4-flush/no 5-workunit/suites/pjd}} | 3 | |
Failure Reason: Command failed (workunit test suites/pjd.sh) on smithi114 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh'
pass | 7276566 | 2023-05-17 11:10:12 | 2023-05-17 11:10:25 | 2023-05-17 12:14:58 | 1:04:33 | 0:42:25 | 0:22:08 | smithi | main | centos | 8.stream | fs/upgrade/mds_upgrade_sequence/{bluestore-bitmap centos_8.stream_container_tools conf/{client mds mon osd} fail_fs/yes overrides/{ignorelist_health ignorelist_wrongly_marked_down pg-warn syntax} roles tasks/{0-from/pacific 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/yes 3-inline/yes 4-verify} 2-client 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}} | 2 | |
pass | 7276567 | 2023-05-17 11:10:13 | 2023-05-17 11:10:26 | 2023-05-17 11:57:33 | 0:47:07 | 0:23:24 | 0:23:43 | smithi | main | ubuntu | 22.04 | fs/32bits/{begin/{0-install 1-ceph 2-logrotate} clusters/fixed-2-ucephfs conf/{client mds mon osd} distro/{ubuntu_latest} mount/fuse objectstore-ec/bluestore-bitmap overrides/{faked-ino ignorelist_health ignorelist_wrongly_marked_down} tasks/cfuse_workunit_suites_fsstress} | 2 | |
pass | 7276568 | 2023-05-17 11:10:13 | 2023-05-17 11:10:26 | 2023-05-17 11:58:48 | 0:48:22 | 0:26:48 | 0:21:34 | smithi | main | ubuntu | 20.04 | fs/full/{begin/{0-install 1-ceph 2-logrotate} clusters/1-node-1-mds-1-osd conf/{client mds mon osd} distro/{ubuntu_20.04} mount/fuse objectstore/bluestore-bitmap overrides overrides/{ignorelist_health ignorelist_wrongly_marked_down no_client_pidfile} tasks/mgr-osd-full} | 1 | |
fail | 7276569 | 2023-05-17 11:10:14 | 2023-05-17 11:10:27 | 2023-05-17 14:44:56 | 3:34:29 | 3:21:20 | 0:13:09 | smithi | main | centos | 8.stream | fs/libcephfs/{begin/{0-install 1-ceph 2-logrotate} clusters/1-mds-1-client-coloc conf/{client mds mon osd} distro/{centos_8} objectstore/bluestore-bitmap overrides/{ignorelist_health ignorelist_wrongly_marked_down no_client_pidfile} tasks/client} | 2 | |
Failure Reason: Command failed (workunit test client/test.sh) on smithi081 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/client/test.sh'
pass | 7276570 | 2023-05-17 11:10:15 | 2023-05-17 11:10:27 | 2023-05-17 12:39:22 | 1:28:55 | 1:04:44 | 0:24:11 | smithi | main | ubuntu | 22.04 | fs/multiclient/{begin/{0-install 1-ceph 2-logrotate} clusters/1-mds-3-client conf/{client mds mon osd} distros/ubuntu_latest mount/fuse objectstore-ec/bluestore-comp overrides/{ignorelist_health ignorelist_wrongly_marked_down} tasks/cephfs_misc_tests} | 5 | |
pass | 7276571 | 2023-05-17 11:10:15 | 2023-05-17 11:10:27 | 2023-05-17 12:25:55 | 1:15:28 | 0:52:28 | 0:23:00 | smithi | main | ubuntu | 20.04 | fs/shell/{begin/{0-install 1-ceph 2-logrotate} clusters/1-mds-1-client-coloc conf/{client mds mon osd} distro/{ubuntu_latest} mount/fuse objectstore/bluestore-bitmap overrides/{ignorelist_health ignorelist_wrongly_marked_down no_client_pidfile} tasks/cephfs-shell} | 2 | |
pass | 7276572 | 2023-05-17 11:10:16 | 2023-05-17 11:10:28 | 2023-05-17 12:40:55 | 1:30:27 | 1:06:40 | 0:23:47 | smithi | main | ubuntu | 22.04 | fs/snaps/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-1c-client conf/{client mds mon osd} distro/{ubuntu_latest} mount/fuse objectstore-ec/bluestore-comp overrides/{ignorelist_health ignorelist_wrongly_marked_down} tasks/workunit/snaps} | 2 | |
fail | 7276573 | 2023-05-17 11:10:17 | 2023-05-17 11:10:28 | 2023-05-17 11:32:37 | 0:22:09 | 0:10:37 | 0:11:32 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v2} mount overrides/{distro/stock/{k-stock rhel_8} ms-die-on-skipped}} ms_mode/secure wsync/no} objectstore-ec/bluestore-bitmap omap_limit/10 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/5 replication/always} standby-replay tasks/{0-subvolume/{with-no-extra-options} 1-check-counter 2-scrub/yes 3-snaps/no 4-flush/yes 5-workunit/kernel_untar_build}} | 3 | |
Failure Reason: Ansible user cleanup on smithi194.front.sepia.ceph.com failed ('msg': 'All items completed', with per-item errors). Removal of users zcerza, aschoen, andrew, sweil, brad, kefu, shylesh, gmeno, alfredodeza, vumrao, trhoden, nishtha, yguang, sdieffen, brian, pmcgarry, karnan, ryneli, dlambrig, icolle, soumya, jspray, jwilliamson, kmroz, shehbazj, abhishekvrshny, asheplyakov, liupan, adeza, pranith, dorinda, and zyan completed (state 'absent'). The host was unreachable for the remaining items: 'ssh: connect to host smithi194.front.sepia.ceph.com port 22: No route to host' for erwan, jj, amarangone, oprypin, and onyb, and 'Permission denied (publickey,password,keyboard-interactive)' for adamyanova and sbillah.
'state': 'absent'}, {'_ansible_item_label': 'jdillaman', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'jdillaman', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'jdillaman', 'name': 'jdillaman', 'state': 'absent'}, {'_ansible_item_label': 'davidz', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'davidz', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'davidz', 'name': 'davidz', 'state': 'absent'}, {'_ansible_item_label': 'wusui', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 
'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'wusui', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'wusui', 'name': 'wusui', 'state': 'absent'}, {'_ansible_item_label': 'nwatkins', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'nwatkins', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'nwatkins', 'name': 'nwatkins', 'state': 'absent'}, {'_ansible_item_label': 'sidharthanup', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 
'name': 'sidharthanup', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'sidharthanup', 'name': 'sidharthanup', 'state': 'absent'}, {'_ansible_item_label': 'varsha', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'varsha', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'varsha', 'name': 'varsha', 'state': 'absent'}, {'_ansible_item_label': 'hmunjulu', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'hmunjulu', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 
'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'hmunjulu', 'name': 'hmunjulu', 'state': 'absent'}, {'_ansible_item_label': 'jlopez', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'jlopez', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'jlopez', 'name': 'jlopez', 'state': 'absent'}, {'_ansible_item_label': 'dfuller', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'dfuller', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 
'always'}}, 'item': 'dfuller', 'name': 'dfuller', 'state': 'absent'}, {'_ansible_item_label': 'vasu', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'vasu', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'vasu', 'name': 'vasu', 'state': 'absent'}, {'_ansible_item_label': 'swagner', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'swagner', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'swagner', 'name': 'swagner', 'state': 'absent'}, {'_ansible_item_label': 'dpivonka', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': 
{'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'dpivonka', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'dpivonka', 'name': 'dpivonka', 'state': 'absent'}, {'_ansible_item_label': 'nlevine', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'nlevine', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'nlevine', 'name': 'nlevine', 'state': 'absent'}, {'_ansible_item_label': 'tbrekke', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 
'login_class': None, 'move_home': False, 'name': 'tbrekke', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'tbrekke', 'name': 'tbrekke', 'state': 'absent'}, {'_ansible_item_label': 'taco', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'taco', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'taco', 'name': 'taco', 'state': 'absent'}, {'_ansible_item_label': 'louis', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'louis', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 
'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'louis', 'name': 'louis', 'state': 'absent'}, {'_ansible_item_label': 'amarango', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'amarango', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'amarango', 'name': 'amarango', 'state': 'absent'}, {'_ansible_item_label': 'oobe', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'oobe', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 
'update_password': 'always'}}, 'item': 'oobe', 'name': 'oobe', 'state': 'absent'}, {'_ansible_item_label': 'rturk', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'rturk', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'rturk', 'name': 'rturk', 'state': 'absent'}, {'_ansible_item_label': 'fche', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'fche', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'fche', 'name': 'fche', 'state': 'absent'}, {'_ansible_item_label': 'jbainbri', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': 
{'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'jbainbri', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'jbainbri', 'name': 'jbainbri', 'state': 'absent'}, {'_ansible_item_label': 'kdhananj', '_ansible_no_log': False, 'ansible_loop_var': 'item', 'changed': False, 'failed': False, 'invocation': {'module_args': {'append': False, 'authorization': None, 'comment': None, 'create_home': True, 'expires': None, 'force': False, 'generate_ssh_key': None, 'group': None, 'groups': None, 'hidden': None, 'home': None, 'local': None, 'login_class': None, 'move_home': False, 'name': 'kdhananj', 'non_unique': False, 'password': None, 'password_lock': None, 'profile': None, 'remove': False, 'role': None, 'seuser': None, 'shell': None, 'skeleton': None, 'ssh_key_bits': 0, 'ssh_key_comment': 'ansible-generated on smithi194.front.sepia.ceph.com', 'ssh_key_file': None, 'ssh_key_passphrase': None, 'ssh_key_type': 'rsa', 'state': 'absent', 'system': False, 'uid': None, 'update_password': 'always'}}, 'item': 'kdhananj', 'name': 'kdhananj', 'state': 'absent'}]}} |
||||||||||||||
fail | 7276574 | 2023-05-17 11:10:18 | 2023-05-17 11:10:29 | 2023-05-17 11:26:31 | 0:16:02 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/fuse objectstore-ec/bluestore-comp-ec-root omap_limit/10000 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/3 replication/default} standby-replay tasks/{0-subvolume/{no-subvolume} 1-check-counter 2-scrub/no 3-snaps/yes 4-flush/no 5-workunit/postgres}} | 3 | |||
Failure Reason:
Stale jobs detected, aborting. |
||||||||||||||
fail | 7276575 | 2023-05-17 11:10:18 | 2023-05-17 11:10:29 | 2023-05-17 11:54:10 | 0:43:41 | 0:22:16 | 0:21:25 | smithi | main | centos | 8.stream | fs/fscrypt/{begin/{0-install 1-ceph 2-logrotate} bluestore-bitmap clusters/1-mds-1-client conf/{client mds mon osd} distro/{centos_latest} mount/kclient/{mount-syntax/{v1} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} overrides/{ignorelist_health ignorelist_health_more ignorelist_wrongly_marked_down pg-warn} tasks/{0-client 1-tests/fscrypt-iozone}} | 3 | |
Failure Reason:
Command failed (workunit test fs/fscrypt.sh) on smithi163 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/fs/fscrypt.sh none iozone' |
||||||||||||||
fail | 7276576 | 2023-05-17 11:10:19 | 2023-05-17 11:10:30 | 2023-05-17 12:17:05 | 1:06:35 | 0:44:15 | 0:22:20 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v2} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} ms_mode/legacy wsync/yes} objectstore-ec/bluestore-ec-root omap_limit/10000 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/3 replication/always} standby-replay tasks/{0-subvolume/{with-namespace-isolated-and-quota} 1-check-counter 2-scrub/no 3-snaps/yes 4-flush/no 5-workunit/suites/dbench}} | 3 | |
Failure Reason:
Command failed (workunit test suites/dbench.sh) on smithi072 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/dbench.sh' |
||||||||||||||
fail | 7276577 | 2023-05-17 11:10:20 | 2023-05-17 11:10:30 | 2023-05-17 18:24:14 | 7:13:44 | 6:49:28 | 0:24:16 | smithi | main | centos | 8.stream | fs/traceless/{begin/{0-install 1-ceph 2-logrotate} clusters/fixed-2-ucephfs conf/{client mds mon osd} distro/{centos_8} mount/fuse objectstore-ec/bluestore-comp overrides/{frag ignorelist_health ignorelist_wrongly_marked_down} tasks/cfuse_workunit_suites_fsstress traceless/50pc} | 2 | |
Failure Reason:
Command failed (workunit test suites/fsstress.sh) on smithi047 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/fsstress.sh' |
||||||||||||||
pass | 7276578 | 2023-05-17 11:10:20 | 2023-05-17 11:10:30 | 2023-05-17 11:59:34 | 0:49:04 | 0:24:02 | 0:25:02 | smithi | main | ubuntu | 22.04 | fs/thrash/workloads/{begin/{0-install 1-ceph 2-logrotate} clusters/1a5s-mds-1c-client conf/{client mds mon osd} distro/{ubuntu_latest} mount/kclient/{mount-syntax/{v1} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} msgr-failures/none objectstore-ec/bluestore-comp-ec-root overrides/{frag ignorelist_health ignorelist_wrongly_marked_down prefetch_dirfrags/no prefetch_entire_dirfrags/no races session_timeout thrashosds-health} ranks/1 tasks/{1-thrash/mds 2-workunit/suites/pjd}} | 2 | |
fail | 7276579 | 2023-05-17 11:10:21 | 2023-05-17 11:10:31 | 2023-05-17 15:01:52 | 3:51:21 | 3:32:20 | 0:19:01 | smithi | main | rhel | 8.6 | fs/thrash/workloads/{begin/{0-install 1-ceph 2-logrotate} clusters/1a5s-mds-1c-client conf/{client mds mon osd} distro/{rhel_8} mount/fuse msgr-failures/osd-mds-delay objectstore-ec/bluestore-comp overrides/{frag ignorelist_health ignorelist_wrongly_marked_down prefetch_dirfrags/yes prefetch_entire_dirfrags/yes races session_timeout thrashosds-health} ranks/3 tasks/{1-thrash/mds 2-workunit/fs/snaps}} | 2 | |
Failure Reason:
Command failed (workunit test fs/snaps/untar_snap_rm.sh) on smithi023 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/fs/snaps/untar_snap_rm.sh' |
||||||||||||||
fail | 7276580 | 2023-05-17 11:10:22 | 2023-05-17 11:10:31 | 2023-05-17 18:04:53 | 6:54:22 | 6:34:29 | 0:19:53 | smithi | main | centos | 8.stream | fs/32bits/{begin/{0-install 1-ceph 2-logrotate} clusters/fixed-2-ucephfs conf/{client mds mon osd} distro/{centos_8} mount/fuse objectstore-ec/bluestore-comp-ec-root overrides/{faked-ino ignorelist_health ignorelist_wrongly_marked_down} tasks/cfuse_workunit_suites_pjd} | 2 | |
Failure Reason:
Command failed (workunit test suites/pjd.sh) on smithi039 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh' |
||||||||||||||
fail | 7276581 | 2023-05-17 11:10:23 | 2023-05-17 11:10:32 | 2023-05-17 18:04:04 | 6:53:32 | 6:32:11 | 0:21:21 | smithi | main | rhel | 8.6 | fs/permission/{begin/{0-install 1-ceph 2-logrotate} clusters/fixed-2-ucephfs conf/{client mds mon osd} distro/{rhel_8} mount/fuse objectstore-ec/bluestore-comp-ec-root overrides/{ignorelist_health ignorelist_wrongly_marked_down} tasks/cfuse_workunit_suites_pjd} | 2 | |
Failure Reason:
Command failed (workunit test suites/pjd.sh) on smithi042 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh' |
pass | 7276582 | 2023-05-17 11:10:23 | 2023-05-17 11:10:32 | 2023-05-17 11:54:03 | 0:43:31 | 0:26:41 | 0:16:50 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v1} mount overrides/{distro/stock/{k-stock rhel_8} ms-die-on-skipped}} ms_mode/secure wsync/no} objectstore-ec/bluestore-bitmap omap_limit/10 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/3 replication/always} standby-replay tasks/{0-subvolume/{with-namespace-isolated} 1-check-counter 2-scrub/yes 3-snaps/no 4-flush/yes 5-workunit/suites/fsync-tester}} | 3 | |
dead | 7276583 | 2023-05-17 11:10:24 | 2023-05-17 11:10:32 | 2023-05-17 23:29:21 | 12:18:49 | smithi | main | ubuntu | 22.04 | fs/functional/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-4c-client conf/{client mds mon osd} distro/{ubuntu_latest} mount/fuse objectstore/bluestore-bitmap overrides/{ignorelist_health ignorelist_wrongly_marked_down no_client_pidfile} tasks/xfstests-dev} | 2 | |||
Failure Reason:
hit max job timeout |
fail | 7276584 | 2023-05-17 11:10:25 | 2023-05-17 11:10:33 | 2023-05-17 11:43:57 | 0:33:24 | 0:19:31 | 0:13:53 | smithi | main | rhel | 8.6 | fs/functional/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-4c-client conf/{client mds mon osd} distro/{ubuntu_latest} mount/kclient/{mount-syntax/{v1} mount overrides/{distro/stock/{k-stock rhel_8} ms-die-on-skipped}} objectstore/bluestore-ec-root overrides/{ignorelist_health ignorelist_wrongly_marked_down no_client_pidfile} tasks/workunit/quota} | 2 | |
Failure Reason:
Command failed (workunit test fs/quota/quota.sh) on smithi062 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.3/client.3/tmp && cd -- /home/ubuntu/cephtest/mnt.3/client.3/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="3" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.3 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.3 CEPH_MNT=/home/ubuntu/cephtest/mnt.3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.3/qa/workunits/fs/quota/quota.sh' |
fail | 7276585 | 2023-05-17 11:10:25 | 2023-05-17 11:10:33 | 2023-05-17 11:52:49 | 0:42:16 | 0:21:21 | 0:20:55 | smithi | main | ubuntu | 22.04 | fs/multiclient/{begin/{0-install 1-ceph 2-logrotate} clusters/1-mds-2-client conf/{client mds mon osd} distros/ubuntu_latest mount/fuse objectstore-ec/bluestore-ec-root overrides/{ignorelist_health ignorelist_wrongly_marked_down} tasks/ior-shared-file} | 4 | |
Failure Reason:
Command failed on smithi097 with status 2: 'TESTDIR=/home/ubuntu/cephtest bash -s' |
pass | 7276586 | 2023-05-17 11:10:26 | 2023-05-17 12:05:26 | 2173 | smithi | main | centos | 8.stream | fs/upgrade/featureful_client/old_client/{bluestore-bitmap centos_latest clusters/1-mds-2-client-micro conf/{client mds mon osd} overrides/{ignorelist_health ignorelist_wrongly_marked_down multimds/yes pg-warn} tasks/{0-octopus 1-client 2-upgrade 3-compat_client/quincy}} | 3 | ||||
pass | 7276587 | 2023-05-17 11:10:27 | 2023-05-17 11:10:34 | 2023-05-17 12:25:58 | 1:15:24 | 0:53:44 | 0:21:40 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v1} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} ms_mode/legacy wsync/yes} objectstore-ec/bluestore-ec-root omap_limit/10000 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/5 replication/always} standby-replay tasks/{0-subvolume/{with-no-extra-options} 1-check-counter 2-scrub/yes 3-snaps/no 4-flush/no 5-workunit/kernel_untar_build}} | 3 | |
fail | 7276588 | 2023-05-17 11:10:28 | 2023-05-17 11:10:34 | 2023-05-17 13:28:03 | 2:17:29 | 2:01:24 | 0:16:05 | smithi | main | ubuntu | 20.04 | fs/verify/{begin/{0-install 1-ceph 2-logrotate} clusters/1a5s-mds-1c-client conf/{client mds mon osd} distro/{ubuntu/{latest overrides}} mount/kclient/{k-testing mount ms-die-on-skipped} objectstore-ec/bluestore-bitmap overrides/{ignorelist_health ignorelist_wrongly_marked_down mon-debug session_timeout} ranks/5 tasks/dbench validater/valgrind} | 2 | |
Failure Reason:
saw valgrind issues |
fail | 7276589 | 2023-05-17 11:10:28 | 2023-05-17 11:10:35 | 2023-05-17 12:08:50 | 0:58:15 | 0:40:09 | 0:18:06 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v2} mount overrides/{distro/stock/{k-stock rhel_8} ms-die-on-skipped}} ms_mode/secure wsync/no} objectstore-ec/bluestore-bitmap omap_limit/10 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/3 replication/default} standby-replay tasks/{0-subvolume/{no-subvolume} 1-check-counter 2-scrub/no 3-snaps/yes 4-flush/yes 5-workunit/postgres}} | 3 | |
Failure Reason:
Command failed on smithi006 with status 1: "sudo TESTDIR=/home/ubuntu/cephtest bash -c 'sudo -u postgres -- pgbench -s 500 -i'" |
dead | 7276590 | 2023-05-17 11:10:29 | 2023-05-17 11:10:35 | 2023-05-17 23:33:04 | 12:22:29 | smithi | main | ubuntu | 22.04 | fs/traceless/{begin/{0-install 1-ceph 2-logrotate} clusters/fixed-2-ucephfs conf/{client mds mon osd} distro/{ubuntu_latest} mount/fuse objectstore-ec/bluestore-comp overrides/{frag ignorelist_health ignorelist_wrongly_marked_down} tasks/cfuse_workunit_suites_blogbench traceless/50pc} | 2 | |||
Failure Reason:
hit max job timeout |
pass | 7276591 | 2023-05-17 11:10:30 | 2023-05-17 11:10:36 | 2023-05-17 12:18:54 | 1:08:18 | 0:54:20 | 0:13:58 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v1} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} ms_mode/legacy wsync/yes} objectstore-ec/bluestore-ec-root omap_limit/10000 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/5 replication/default} standby-replay tasks/{0-subvolume/{no-subvolume} 1-check-counter 2-scrub/yes 3-snaps/no 4-flush/no 5-workunit/suites/ffsb}} | 3 | |
pass | 7276592 | 2023-05-17 11:10:31 | 2023-05-17 11:10:36 | 2023-05-17 11:50:26 | 0:39:50 | 0:16:58 | 0:22:52 | smithi | main | ubuntu | 22.04 | fs/functional/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-4c-client conf/{client mds mon osd} distro/{ubuntu_latest} mount/fuse objectstore/bluestore-ec-root overrides/{ignorelist_health ignorelist_wrongly_marked_down no_client_pidfile} tasks/backtrace} | 2 | |
pass | 7276593 | 2023-05-17 11:10:31 | 2023-05-17 11:10:36 | 2023-05-17 11:49:01 | 0:38:25 | 0:23:00 | 0:15:25 | smithi | main | ubuntu | 20.04 | fs/thrash/multifs/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-2c-client conf/{client mds mon osd} distro/{ubuntu_20.04} mount/fuse msgr-failures/osd-mds-delay objectstore/bluestore-bitmap overrides/{frag ignorelist_health ignorelist_wrongly_marked_down multifs session_timeout thrashosds-health} tasks/{1-thrash/mon 2-workunit/cfuse_workunit_trivial_sync}} | 2 | |
pass | 7276594 | 2023-05-17 11:10:32 | 2023-05-17 11:10:37 | 2023-05-17 12:15:25 | 1:04:48 | 0:40:00 | 0:24:48 | smithi | main | centos | 8.stream | fs/upgrade/mds_upgrade_sequence/{bluestore-bitmap centos_8.stream_container_tools conf/{client mds mon osd} fail_fs/yes overrides/{ignorelist_health ignorelist_wrongly_marked_down pg-warn syntax} roles tasks/{0-from/pacific 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/no 3-inline/no 4-verify} 2-client 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}} | 2 | |
fail | 7276595 | 2023-05-17 11:10:33 | 2023-05-17 11:10:37 | 2023-05-17 11:55:31 | 0:44:54 | 0:19:41 | 0:25:13 | smithi | main | ubuntu | 22.04 | fs/multiclient/{begin/{0-install 1-ceph 2-logrotate} clusters/1-mds-3-client conf/{client mds mon osd} distros/ubuntu_latest mount/fuse objectstore-ec/bluestore-bitmap overrides/{ignorelist_health ignorelist_wrongly_marked_down} tasks/mdtest} | 5 | |
Failure Reason:
Command failed on smithi162 with status 2: 'TESTDIR=/home/ubuntu/cephtest bash -s' |
pass | 7276596 | 2023-05-17 11:10:34 | 2023-05-17 11:10:38 | 2023-05-17 11:58:16 | 0:47:38 | 0:30:33 | 0:17:05 | smithi | main | ubuntu | 20.04 | fs/multifs/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-2c-client conf/{client mds mon osd} distro/{ubuntu_20.04} mount/kclient/{mount-syntax/{v2} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} objectstore-ec/bluestore-bitmap overrides/{ignorelist_health ignorelist_wrongly_marked_down mon-debug} tasks/multifs-auth} | 2 | |
pass | 7276597 | 2023-05-17 11:10:35 | 2023-05-17 11:10:38 | 2023-05-17 12:16:11 | 1:05:33 | 0:48:03 | 0:17:30 | smithi | main | ubuntu | 20.04 | fs/snaps/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-1c-client conf/{client mds mon osd} distro/{ubuntu_20.04} mount/kclient/{mount-syntax/{v2} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} objectstore-ec/bluestore-bitmap overrides/{ignorelist_health ignorelist_wrongly_marked_down} tasks/workunit/snaps} | 2 | |
pass | 7276598 | 2023-05-17 11:10:35 | 2023-05-17 11:10:38 | 2023-05-17 11:53:12 | 0:42:34 | 0:25:22 | 0:17:12 | smithi | main | ubuntu | 20.04 | fs/functional/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-4c-client conf/{client mds mon osd} distro/{ubuntu_20.04} mount/kclient/{mount-syntax/{v2} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} objectstore/bluestore-ec-root overrides/{ignorelist_health ignorelist_wrongly_marked_down no_client_pidfile} tasks/client-limits} | 2 | |
fail | 7276599 | 2023-05-17 11:10:36 | 2023-05-17 17:55:53 | 23628 | smithi | main | rhel | 8.6 | fs/32bits/{begin/{0-install 1-ceph 2-logrotate} clusters/fixed-2-ucephfs conf/{client mds mon osd} distro/{rhel_8} mount/fuse objectstore-ec/bluestore-ec-root overrides/{faked-ino ignorelist_health ignorelist_wrongly_marked_down} tasks/cfuse_workunit_suites_pjd} | 2 | ||||
Failure Reason:
Command failed (workunit test suites/pjd.sh) on smithi107 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh' |
fail | 7276600 | 2023-05-17 11:10:37 | 2023-05-17 11:10:39 | 2023-05-17 18:04:18 | 6:53:39 | 6:33:55 | 0:19:44 | smithi | main | rhel | 8.6 | fs/permission/{begin/{0-install 1-ceph 2-logrotate} clusters/fixed-2-ucephfs conf/{client mds mon osd} distro/{rhel_8} mount/fuse objectstore-ec/bluestore-ec-root overrides/{ignorelist_health ignorelist_wrongly_marked_down} tasks/cfuse_workunit_suites_pjd} | 2 | |
Failure Reason:
Command failed (workunit test suites/pjd.sh) on smithi035 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh' |
fail | 7276601 | 2023-05-17 11:10:38 | 2023-05-17 11:10:40 | 2023-05-17 13:23:13 | 2:12:33 | 1:53:05 | 0:19:28 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/fuse objectstore-ec/bluestore-comp-ec-root omap_limit/10000 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/5 replication/default} standby-replay tasks/{0-subvolume/{with-namespace-isolated} 1-check-counter 2-scrub/yes 3-snaps/no 4-flush/no 5-workunit/fs/misc}} | 3 | |
Failure Reason:
error during scrub thrashing: reached maximum tries (30) after waiting for 900 seconds |
dead | 7276602 | 2023-05-17 11:10:38 | 2023-05-17 11:33:01 | smithi | main | rhel | 8.6 | fs/thrash/workloads/{begin/{0-install 1-ceph 2-logrotate} clusters/1a5s-mds-1c-client conf/{client mds mon osd} distro/{rhel_8} mount/fuse msgr-failures/osd-mds-delay objectstore-ec/bluestore-bitmap overrides/{frag ignorelist_health ignorelist_wrongly_marked_down prefetch_dirfrags/no prefetch_entire_dirfrags/yes races session_timeout thrashosds-health} ranks/3 tasks/{1-thrash/osd 2-workunit/fs/snaps}} | 2 | |||||
Failure Reason:
SSH connection to smithi194 was lost: 'rm -f /tmp/kernel.x86_64.rpm && echo kernel-6.3.0_g2bcb17939031-1.x86_64.rpm | wget -nv -O /tmp/kernel.x86_64.rpm --base=https://4.chacra.ceph.com/r/kernel/testing/2bcb179390319a07fc43480888bd707374860ab3/centos/8/flavors/default/x86_64/ --input-file=-' |
fail | 7276603 | 2023-05-17 11:10:39 | 2023-05-17 11:26:33 | 2023-05-17 15:06:53 | 3:40:20 | 3:28:50 | 0:11:30 | smithi | main | ubuntu | 20.04 | fs/traceless/{begin/{0-install 1-ceph 2-logrotate} clusters/fixed-2-ucephfs conf/{client mds mon osd} distro/{ubuntu_20.04} mount/fuse objectstore-ec/bluestore-ec-root overrides/{frag ignorelist_health ignorelist_wrongly_marked_down} tasks/cfuse_workunit_suites_dbench traceless/50pc} | 2 | |
Failure Reason:
Command failed (workunit test suites/dbench.sh) on smithi002 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/dbench.sh' |
fail | 7276604 | 2023-05-17 11:10:40 | 2023-05-17 11:26:33 | 2023-05-17 11:41:04 | 0:14:31 | smithi | main | ubuntu | 22.04 | fs/fscrypt/{begin/{0-install 1-ceph 2-logrotate} bluestore-bitmap clusters/1-mds-1-client conf/{client mds mon osd} distro/{ubuntu_latest} mount/kclient/{mount-syntax/{v1} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} overrides/{ignorelist_health ignorelist_health_more ignorelist_wrongly_marked_down pg-warn} tasks/{0-client 1-tests/fscrypt-dbench}} | 3 | |||
Failure Reason:
Command failed on smithi194 with status 128: 'cd /lib/firmware/updates && sudo git fetch origin && sudo git reset --hard origin/main' |
dead | 7276605 | 2023-05-17 11:10:41 | 2023-05-17 11:32:45 | 2023-05-17 11:41:04 | 0:08:19 | smithi | main | ubuntu | 20.04 | fs/volumes/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-4c-client conf/{client mds mon osd} distro/{ubuntu_20.04} mount/kclient/{mount-syntax/{v2} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} objectstore/bluestore-bitmap overrides/{ignorelist_health ignorelist_wrongly_marked_down no_client_pidfile} tasks/volumes/{overrides test/clone}} | 2 | |||
Failure Reason:
SSH connection to smithi194 was lost: 'rm -f /tmp/linux-image.deb && echo linux-image-6.3.0-g2bcb17939031_6.3.0-g2bcb17939031-1_amd64.deb | wget -nv -O /tmp/linux-image.deb --base=https://2.chacra.ceph.com/r/kernel/testing/2bcb179390319a07fc43480888bd707374860ab3/ubuntu/focal/flavors/default/pool/main/l/linux-6.3.0-g2bcb17939031/ --input-file=-' |
fail | 7276606 | 2023-05-17 11:10:41 | 2023-05-17 11:33:15 | 2023-05-17 12:13:08 | 0:39:53 | 0:22:47 | 0:17:06 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v2} mount overrides/{distro/stock/{k-stock rhel_8} ms-die-on-skipped}} ms_mode/crc wsync/no} objectstore-ec/bluestore-comp omap_limit/10 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/5 replication/default} standby-replay tasks/{0-subvolume/{with-namespace-isolated} 1-check-counter 2-scrub/no 3-snaps/yes 4-flush/yes 5-workunit/suites/ffsb}} | 3 | |
Failure Reason:
No module named 'tasks.fs' |
fail | 7276607 | 2023-05-17 11:10:42 | 2023-05-17 11:44:07 | 2023-05-17 12:08:46 | 0:24:39 | 0:08:28 | 0:16:11 | smithi | main | ubuntu | 22.04 | fs/32bits/{begin/{0-install 1-ceph 2-logrotate} clusters/fixed-2-ucephfs conf/{client mds mon osd} distro/{ubuntu_latest} mount/fuse objectstore-ec/bluestore-bitmap overrides/{faked-ino ignorelist_health ignorelist_wrongly_marked_down} tasks/cfuse_workunit_suites_pjd} | 2 | |
Failure Reason:
Command failed on smithi202 with status 100: 'sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install ceph=18.0.0-3926-g27be5126-1jammy ceph-mds=18.0.0-3926-g27be5126-1jammy ceph-common=18.0.0-3926-g27be5126-1jammy ceph-fuse=18.0.0-3926-g27be5126-1jammy ceph-test=18.0.0-3926-g27be5126-1jammy radosgw=18.0.0-3926-g27be5126-1jammy python-ceph=18.0.0-3926-g27be5126-1jammy libcephfs1=18.0.0-3926-g27be5126-1jammy libcephfs-java=18.0.0-3926-g27be5126-1jammy libcephfs-jni=18.0.0-3926-g27be5126-1jammy librados2=18.0.0-3926-g27be5126-1jammy librbd1=18.0.0-3926-g27be5126-1jammy rbd-fuse=18.0.0-3926-g27be5126-1jammy python3-cephfs=18.0.0-3926-g27be5126-1jammy cephfs-shell=18.0.0-3926-g27be5126-1jammy cephfs-top=18.0.0-3926-g27be5126-1jammy cephfs-mirror=18.0.0-3926-g27be5126-1jammy' |
fail | 7276608 | 2023-05-17 11:10:43 | 2023-05-17 11:49:09 | 2023-05-17 12:10:22 | 0:21:13 | 0:09:36 | 0:11:37 | smithi | main | ubuntu | 22.04 | fs/permission/{begin/{0-install 1-ceph 2-logrotate} clusters/fixed-2-ucephfs conf/{client mds mon osd} distro/{ubuntu_latest} mount/fuse objectstore-ec/bluestore-bitmap overrides/{ignorelist_health ignorelist_wrongly_marked_down} tasks/cfuse_workunit_suites_pjd} | 2 | |
Failure Reason:
Command failed on smithi154 with status 100: 'sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install ceph=18.0.0-3926-g27be5126-1jammy ceph-mds=18.0.0-3926-g27be5126-1jammy ceph-common=18.0.0-3926-g27be5126-1jammy ceph-fuse=18.0.0-3926-g27be5126-1jammy ceph-test=18.0.0-3926-g27be5126-1jammy radosgw=18.0.0-3926-g27be5126-1jammy python-ceph=18.0.0-3926-g27be5126-1jammy libcephfs1=18.0.0-3926-g27be5126-1jammy libcephfs-java=18.0.0-3926-g27be5126-1jammy libcephfs-jni=18.0.0-3926-g27be5126-1jammy librados2=18.0.0-3926-g27be5126-1jammy librbd1=18.0.0-3926-g27be5126-1jammy rbd-fuse=18.0.0-3926-g27be5126-1jammy python3-cephfs=18.0.0-3926-g27be5126-1jammy cephfs-shell=18.0.0-3926-g27be5126-1jammy cephfs-top=18.0.0-3926-g27be5126-1jammy cephfs-mirror=18.0.0-3926-g27be5126-1jammy' |
fail | 7276609 | 2023-05-17 11:10:44 | 2023-05-17 11:50:29 | 2023-05-17 12:14:20 | 0:23:51 | 0:10:54 | 0:12:57 | smithi | main | ubuntu | 22.04 | fs/snaps/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-1c-client conf/{client mds mon osd} distro/{ubuntu_latest} mount/fuse objectstore-ec/bluestore-comp-ec-root overrides/{ignorelist_health ignorelist_wrongly_marked_down} tasks/workunit/snaps} | 2 | |
Failure Reason:
Command failed on smithi119 with status 100: 'sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install ceph=18.0.0-3926-g27be5126-1jammy ceph-mds=18.0.0-3926-g27be5126-1jammy ceph-common=18.0.0-3926-g27be5126-1jammy ceph-fuse=18.0.0-3926-g27be5126-1jammy ceph-test=18.0.0-3926-g27be5126-1jammy radosgw=18.0.0-3926-g27be5126-1jammy python-ceph=18.0.0-3926-g27be5126-1jammy libcephfs1=18.0.0-3926-g27be5126-1jammy libcephfs-java=18.0.0-3926-g27be5126-1jammy libcephfs-jni=18.0.0-3926-g27be5126-1jammy librados2=18.0.0-3926-g27be5126-1jammy librbd1=18.0.0-3926-g27be5126-1jammy rbd-fuse=18.0.0-3926-g27be5126-1jammy python3-cephfs=18.0.0-3926-g27be5126-1jammy cephfs-shell=18.0.0-3926-g27be5126-1jammy cephfs-top=18.0.0-3926-g27be5126-1jammy cephfs-mirror=18.0.0-3926-g27be5126-1jammy' |
pass | 7276610 | 2023-05-17 11:10:44 | 2023-05-17 13:01:56 | 2023-05-17 13:45:01 | 0:43:05 | 0:36:54 | 0:06:11 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v1} mount overrides/{distro/stock/{k-stock rhel_8} ms-die-on-skipped}} ms_mode/secure wsync/no} objectstore-ec/bluestore-bitmap omap_limit/10 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/5 replication/always} standby-replay tasks/{0-subvolume/{with-namespace-isolated-and-quota} 1-check-counter 2-scrub/no 3-snaps/yes 4-flush/yes 5-workunit/suites/iogen}} | 3 | |
pass | 7276611 | 2023-05-17 11:10:45 | 2023-05-17 13:01:56 | 2023-05-17 14:10:58 | 1:09:02 | 0:56:16 | 0:12:46 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v2} mount overrides/{distro/stock/{k-stock rhel_8} ms-die-on-skipped}} ms_mode/secure wsync/no} objectstore-ec/bluestore-bitmap omap_limit/10 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/5 replication/default} standby-replay tasks/{0-subvolume/{with-namespace-isolated} 1-check-counter 2-scrub/no 3-snaps/no 4-flush/yes 5-workunit/fs/misc}} | 3 | |
fail | 7276612 | 2023-05-17 11:10:46 | 2023-05-17 13:07:37 | 2023-05-17 16:20:10 | 3:12:33 | 3:00:37 | 0:11:56 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/fuse objectstore-ec/bluestore-comp-ec-root omap_limit/10000 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/5 replication/always} standby-replay tasks/{0-subvolume/{with-namespace-isolated-and-quota} 1-check-counter 2-scrub/yes 3-snaps/yes 4-flush/no 5-workunit/kernel_untar_build}} | 3 | |
Failure Reason:
error during scrub thrashing: reached maximum tries (30) after waiting for 900 seconds |
fail | 7276613 | 2023-05-17 11:10:47 | 2023-05-17 13:08:48 | 2023-05-17 16:47:13 | 3:38:25 | 3:26:59 | 0:11:26 | smithi | main | ubuntu | 22.04 | fs/traceless/{begin/{0-install 1-ceph 2-logrotate} clusters/fixed-2-ucephfs conf/{client mds mon osd} distro/{ubuntu_latest} mount/fuse objectstore-ec/bluestore-bitmap overrides/{frag ignorelist_health ignorelist_wrongly_marked_down} tasks/cfuse_workunit_suites_ffsb traceless/50pc} | 2 | |
Failure Reason:
Command failed (workunit test suites/ffsb.sh) on smithi123 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/ffsb.sh' |
pass | 7276614 | 2023-05-17 11:10:47 | 2023-05-17 13:09:18 | 2023-05-17 14:02:44 | 0:53:26 | 0:46:52 | 0:06:34 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v2} mount overrides/{distro/stock/{k-stock rhel_8} ms-die-on-skipped}} ms_mode/crc wsync/no} objectstore-ec/bluestore-comp omap_limit/10 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/3 replication/default} standby-replay tasks/{0-subvolume/{no-subvolume} 1-check-counter 2-scrub/no 3-snaps/no 4-flush/yes 5-workunit/postgres}} | 3 | |
fail | 7276615 | 2023-05-17 11:10:48 | 2023-05-17 13:09:59 | 2023-05-17 14:16:31 | 1:06:32 | 0:51:07 | 0:15:25 | smithi | main | ubuntu | 22.04 | fs/thrash/multifs/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-2c-client conf/{client mds mon osd} distro/{ubuntu_latest} mount/fuse msgr-failures/none objectstore/bluestore-bitmap overrides/{frag ignorelist_health ignorelist_wrongly_marked_down multifs session_timeout thrashosds-health} tasks/{1-thrash/mon 2-workunit/cfuse_workunit_snaptests}} | 2 | |
Failure Reason:
Command failed (workunit test fs/snaps/snaptest-multiple-capsnaps.sh) on smithi089 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/fs/snaps/snaptest-multiple-capsnaps.sh' |
fail | 7276616 | 2023-05-17 11:10:49 | 2023-05-17 13:14:50 | 2023-05-17 13:55:18 | 0:40:28 | 0:19:57 | 0:20:31 | smithi | main | ubuntu | 22.04 | fs/multiclient/{begin/{0-install 1-ceph 2-logrotate} clusters/1-mds-3-client conf/{client mds mon osd} distros/ubuntu_latest mount/fuse objectstore-ec/bluestore-comp overrides/{ignorelist_health ignorelist_wrongly_marked_down} tasks/ior-shared-file} | 5 | |
Failure Reason:
Command failed on smithi162 with status 2: 'TESTDIR=/home/ubuntu/cephtest bash -s' |
pass | 7276617 | 2023-05-17 11:10:49 | 2023-05-17 13:23:22 | 2023-05-17 14:13:47 | 0:50:25 | 0:32:24 | 0:18:01 | smithi | main | centos | 8.stream | fs/upgrade/featureful_client/old_client/{bluestore-bitmap centos_latest clusters/1-mds-2-client-micro conf/{client mds mon osd} overrides/{ignorelist_health ignorelist_wrongly_marked_down multimds/yes pg-warn} tasks/{0-octopus 1-client 2-upgrade 3-compat_client/quincy}} | 3 | |
fail | 7276618 | 2023-05-17 11:10:50 | 2023-05-17 13:26:03 | 2023-05-17 14:03:34 | 0:37:31 | 0:23:42 | 0:13:49 | smithi | main | rhel | 8.6 | fs/functional/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-4c-client conf/{client mds mon osd} distro/{rhel_8} mount/fuse objectstore/bluestore-ec-root overrides/{ignorelist_health ignorelist_wrongly_marked_down no_client_pidfile} tasks/quota} | 2 | |
Failure Reason:
Test failure: test_disable_enable_human_readable_quota_values (tasks.cephfs.test_quota.TestQuota) |
pass | 7276619 | 2023-05-17 11:10:51 | 2023-05-17 13:28:13 | 2023-05-17 14:25:56 | 0:57:43 | 0:44:14 | 0:13:29 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v2} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} ms_mode/legacy wsync/yes} objectstore-ec/bluestore-ec-root omap_limit/10000 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/3 replication/default} standby-replay tasks/{0-subvolume/{with-quota} 1-check-counter 2-scrub/yes 3-snaps/yes 4-flush/no 5-workunit/suites/fsstress}} | 3 | |
fail | 7276620 | 2023-05-17 11:10:52 | 2023-05-17 13:32:34 | 2023-05-17 20:08:40 | 6:36:06 | 6:25:53 | 0:10:13 | smithi | main | centos | 8.stream | fs/32bits/{begin/{0-install 1-ceph 2-logrotate} clusters/fixed-2-ucephfs conf/{client mds mon osd} distro/{centos_8} mount/fuse objectstore-ec/bluestore-comp overrides/{faked-ino ignorelist_health ignorelist_wrongly_marked_down} tasks/cfuse_workunit_suites_pjd} | 2 | |
Failure Reason:
Command failed (workunit test suites/pjd.sh) on smithi045 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh' |
fail | 7276621 | 2023-05-17 11:10:52 | 2023-05-17 13:33:15 | 2023-05-17 14:03:19 | 0:30:04 | 0:19:47 | 0:10:17 | smithi | main | ubuntu | 20.04 | fs/libcephfs/{begin/{0-install 1-ceph 2-logrotate} clusters/1-mds-1-client-coloc conf/{client mds mon osd} distro/{ubuntu_20.04} objectstore/bluestore-bitmap overrides/{ignorelist_health ignorelist_wrongly_marked_down no_client_pidfile} tasks/libcephfs_python} | 2 | |
Failure Reason:
Command failed (workunit test fs/test_python.sh) on smithi131 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/fs/test_python.sh' |
fail | 7276622 | 2023-05-17 11:10:53 | 2023-05-17 13:33:15 | 2023-05-17 20:10:03 | 6:36:48 | 6:25:37 | 0:11:11 | smithi | main | centos | 8.stream | fs/permission/{begin/{0-install 1-ceph 2-logrotate} clusters/fixed-2-ucephfs conf/{client mds mon osd} distro/{centos_8} mount/fuse objectstore-ec/bluestore-comp overrides/{ignorelist_health ignorelist_wrongly_marked_down} tasks/cfuse_workunit_suites_pjd} | 2 | |
Failure Reason:
Command failed (workunit test suites/pjd.sh) on smithi046 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh'
fail | 7276623 | 2023-05-17 11:10:54 | 2023-05-17 13:34:16 | 2023-05-17 20:31:02 | 6:56:46 | 6:44:44 | 0:12:02 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/fuse objectstore-ec/bluestore-comp-ec-root omap_limit/10000 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/1 standby-replay tasks/{0-subvolume/{with-namespace-isolated} 1-check-counter 2-scrub/yes 3-snaps/yes 4-flush/no 5-workunit/suites/pjd}} | 3 | |
Failure Reason:
Command failed (workunit test suites/pjd.sh) on smithi037 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh'
dead | 7276624 | 2023-05-17 11:10:54 | 2023-05-17 13:34:46 | 2023-05-18 01:48:04 | 12:13:18 | | | smithi | main | rhel | 8.6 | fs/verify/{begin/{0-install 1-ceph 2-logrotate} clusters/1a5s-mds-1c-client conf/{client mds mon osd} distro/{rhel_8} mount/fuse objectstore-ec/bluestore-ec-root overrides/{ignorelist_health ignorelist_wrongly_marked_down mon-debug session_timeout} ranks/5 tasks/fsstress validater/lockdep} | 2 | |
Failure Reason:
hit max job timeout
fail | 7276625 | 2023-05-17 11:10:55 | 2023-05-17 13:38:37 | 2023-05-17 15:20:04 | 1:41:27 | 1:26:15 | 0:15:12 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v2} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} ms_mode/legacy wsync/yes} objectstore-ec/bluestore-ec-root omap_limit/10000 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/5 replication/default} standby-replay tasks/{0-subvolume/{with-no-extra-options} 1-check-counter 2-scrub/yes 3-snaps/yes 4-flush/no 5-workunit/fs/misc}} | 3 | |
Failure Reason:
error during scrub thrashing: reached maximum tries (30) after waiting for 900 seconds
fail | 7276626 | 2023-05-17 11:10:56 | 2023-05-17 13:44:49 | 2023-05-17 20:40:59 | 6:56:10 | 6:45:39 | 0:10:31 | smithi | main | ubuntu | 22.04 | fs/traceless/{begin/{0-install 1-ceph 2-logrotate} clusters/fixed-2-ucephfs conf/{client mds mon osd} distro/{ubuntu_latest} mount/fuse objectstore-ec/bluestore-comp-ec-root overrides/{frag ignorelist_health ignorelist_wrongly_marked_down} tasks/cfuse_workunit_suites_fsstress traceless/50pc} | 2 | |
Failure Reason:
Command failed (workunit test suites/fsstress.sh) on smithi112 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/fsstress.sh'
pass | 7276627 | 2023-05-17 11:10:56 | 2023-05-17 13:44:50 | 2023-05-17 14:34:29 | 0:49:39 | 0:43:30 | 0:06:09 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v2} mount overrides/{distro/stock/{k-stock rhel_8} ms-die-on-skipped}} ms_mode/secure wsync/no} objectstore-ec/bluestore-bitmap omap_limit/10 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/5 replication/always} standby-replay tasks/{0-subvolume/{with-namespace-isolated-and-quota} 1-check-counter 2-scrub/no 3-snaps/no 4-flush/yes 5-workunit/kernel_untar_build}} | 3 | |
fail | 7276628 | 2023-05-17 11:10:57 | 2023-05-17 13:45:10 | 2023-05-17 14:28:44 | 0:43:34 | 0:23:07 | 0:20:27 | smithi | main | ubuntu | 22.04 | fs/multiclient/{begin/{0-install 1-ceph 2-logrotate} clusters/1-mds-2-client conf/{client mds mon osd} distros/ubuntu_latest mount/fuse objectstore-ec/bluestore-ec-root overrides/{ignorelist_health ignorelist_wrongly_marked_down} tasks/mdtest} | 4 | |
Failure Reason:
Command failed on smithi038 with status 2: 'TESTDIR=/home/ubuntu/cephtest bash -s'
fail | 7276629 | 2023-05-17 11:10:58 | 2023-05-17 13:55:29 | 2023-05-17 15:56:46 | 2:01:17 | 1:50:45 | 0:10:32 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/fuse objectstore-ec/bluestore-comp-ec-root omap_limit/10000 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/3 replication/default} standby-replay tasks/{0-subvolume/{with-namespace-isolated-and-quota} 1-check-counter 2-scrub/yes 3-snaps/yes 4-flush/no 5-workunit/postgres}} | 3 | |
Failure Reason:
error during scrub thrashing: reached maximum tries (30) after waiting for 900 seconds
fail | 7276630 | 2023-05-17 11:10:59 | 2023-05-17 13:55:29 | 2023-05-17 14:56:38 | 1:01:09 | 0:51:48 | 0:09:21 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v2} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} ms_mode/legacy wsync/yes} objectstore-ec/bluestore-ec-root omap_limit/10000 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/3 replication/always} standby-replay tasks/{0-subvolume/{with-namespace-isolated-and-quota} 1-check-counter 2-scrub/yes 3-snaps/yes 4-flush/no 5-workunit/suites/dbench}} | 3 | |
Failure Reason:
Command failed (workunit test suites/dbench.sh) on smithi115 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=27be51263d3d0c721b6d1ca9f42550bf0ab3c97b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/dbench.sh'
fail | 7276631 | 2023-05-17 11:10:59 | 2023-05-17 13:55:29 | 2023-05-17 23:40:11 | 9:44:42 | 9:31:19 | 0:13:23 | smithi | main | ubuntu | 20.04 | fs/functional/{begin/{0-install 1-ceph 2-logrotate} clusters/1a3s-mds-4c-client conf/{client mds mon osd} distro/{ubuntu_20.04} mount/kclient/{mount-syntax/{v2} mount overrides/{distro/testing/k-testing ms-die-on-skipped}} objectstore/bluestore-bitmap overrides/{ignorelist_health ignorelist_wrongly_marked_down no_client_pidfile} tasks/xfstests-dev} | 2 | |
Failure Reason:
Test failure: test_generic (tasks.cephfs.tests_from_xfstests_dev.TestXFSTestsDev)
pass | 7276632 | 2023-05-17 11:11:00 | 2023-05-17 14:03:29 | 2023-05-17 14:58:58 | 0:55:29 | 0:49:31 | 0:05:58 | smithi | main | rhel | 8.6 | fs/workload/{0-rhel_8 begin/{0-install 1-cephadm 2-logrotate} clusters/1a11s-mds-1c-client-3node conf/{client mds mon osd} mount/kclient/{base/{mount-syntax/{v2} mount overrides/{distro/stock/{k-stock rhel_8} ms-die-on-skipped}} ms_mode/crc wsync/no} objectstore-ec/bluestore-comp omap_limit/10 overrides/{cephsqlite-timeout frag ignorelist_health ignorelist_wrongly_marked_down osd-asserts session_timeout} ranks/multi/{export-check n/5 replication/always} standby-replay tasks/{0-subvolume/{no-subvolume} 1-check-counter 2-scrub/no 3-snaps/no 4-flush/yes 5-workunit/suites/iogen}} | 3 |