User | Scheduled | Started | Updated | Runtime | Suite | Branch | Machine Type | Revision | Pass | Fail |
---|---|---|---|---|---|---|---|---|---|---|
yuriw | 2018-06-24 20:57:32 | 2018-06-24 20:57:40 | 2018-06-25 02:54:28 | 5:56:48 | ceph-deploy | wip-luminous-20180622 | mira | 451f3cf | 36 | 16 |
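The 16 failures in the job table below fall into three recurring buckets: workunit timeouts (exit status 124 is the code GNU `timeout` returns when it kills a command, so these `ceph-admin-commands.sh` failures hit the 3h limit), SELinux AVC denials on the `ceph-volume` jobs, and one `ceph-deploy` OSD-creation failure. A minimal sketch (not part of teuthology; the function name is ours) that buckets the failure-reason strings accordingly:

```python
import re

def classify(reason: str) -> str:
    """Bucket a teuthology failure-reason string from this run."""
    if reason.startswith("SELinux denials"):
        return "selinux-avc"
    # GNU timeout exits with 124 when the wrapped command times out.
    if re.search(r"with status 124\b", reason):
        return "workunit-timeout"
    if "Failed to create osds" in reason:
        return "osd-create"
    return "other"
```

Applied to the reasons below, every bluestore/`ceph-admin-commands.yaml` failure classifies as `workunit-timeout`, which suggests one underlying hang rather than sixteen independent bugs.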
Status | Job ID | Links | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
pass | 2700838 | 2018-06-24 20:57:36 | 2018-06-24 20:57:37 | 2018-06-24 21:23:37 | 0:26:00 | 0:13:49 | 0:12:11 | mira | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/bluestore.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2700839 | 2018-06-24 20:57:37 | 2018-06-24 20:57:38 | 2018-06-24 21:35:38 | 0:38:00 | 0:18:52 | 0:19:08 | mira | master | centos | 7.4 | ceph-deploy/ceph-volume/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/centos_latest.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
SELinux denials found on ubuntu@mira067.front.sepia.ceph.com: ['type=AVC msg=audit(1529875905.794:4555): avc: denied { read } for pid=28560 comm="ceph-osd" name="block" dev="tmpfs" ino=84502 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:user_tmp_t:s0 tclass=lnk_file', 'type=AVC msg=audit(1529875905.880:4556): avc: denied { read } for pid=28560 comm="ceph-osd" name="/" dev="tmpfs" ino=88172 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:user_tmp_t:s0 tclass=dir', 'type=AVC msg=audit(1529875893.435:4497): avc: denied { read } for pid=28016 comm="ceph-osd" name="block" dev="tmpfs" ino=84075 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:user_tmp_t:s0 tclass=lnk_file', 'type=AVC msg=audit(1529875893.513:4498): avc: denied { read } for pid=28016 comm="ceph-osd" name="/" dev="tmpfs" ino=82864 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:user_tmp_t:s0 tclass=dir'] |
pass | 2700840 | 2018-06-24 20:57:38 | 2018-06-24 20:57:39 | 2018-06-24 21:15:38 | 0:17:59 | 0:07:52 | 0:10:07 | mira | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/disable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/filestore-xfs.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2700841 | 2018-06-24 20:57:39 | 2018-06-24 20:57:40 | 2018-06-25 00:15:43 | 3:18:03 | 3:07:34 | 0:10:29 | mira | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/bluestore.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
Failure Reason:
Command failed (workunit test ceph-tests/ceph-admin-commands.sh) on mira011 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=wip-luminous-20180622 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-tests/ceph-admin-commands.sh' |
pass | 2700842 | 2018-06-24 20:57:40 | 2018-06-24 20:57:41 | 2018-06-24 21:25:40 | 0:27:59 | 0:14:57 | 0:13:02 | mira | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700843 | 2018-06-24 20:57:41 | 2018-06-24 20:57:42 | 2018-06-24 21:17:41 | 0:19:59 | 0:08:04 | 0:11:55 | mira | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/bluestore.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700844 | 2018-06-24 20:57:42 | 2018-06-24 20:57:43 | 2018-06-24 21:17:42 | 0:19:59 | 0:08:50 | 0:11:09 | mira | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/disable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2700845 | 2018-06-24 20:57:43 | 2018-06-24 20:57:44 | 2018-06-25 00:25:48 | 3:28:04 | 3:13:42 | 0:14:22 | mira | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/bluestore.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
Failure Reason:
Command failed (workunit test ceph-tests/ceph-admin-commands.sh) on mira065 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=wip-luminous-20180622 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-tests/ceph-admin-commands.sh' |
pass | 2700846 | 2018-06-24 20:57:44 | 2018-06-24 20:57:45 | 2018-06-24 21:17:45 | 0:20:00 | 0:08:21 | 0:11:39 | mira | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/filestore-xfs.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700847 | 2018-06-24 20:57:45 | 2018-06-24 20:57:46 | 2018-06-24 21:17:46 | 0:20:00 | 0:07:53 | 0:12:07 | mira | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/bluestore.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700848 | 2018-06-24 20:57:46 | 2018-06-24 20:57:47 | 2018-06-24 21:27:47 | 0:30:00 | 0:16:22 | 0:13:38 | mira | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/disable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2700849 | 2018-06-24 20:57:47 | 2018-06-24 20:57:48 | 2018-06-25 00:17:52 | 3:20:04 | 3:07:31 | 0:12:33 | mira | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/bluestore.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
Failure Reason:
Command failed (workunit test ceph-tests/ceph-admin-commands.sh) on mira075 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=wip-luminous-20180622 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-tests/ceph-admin-commands.sh' |
pass | 2700850 | 2018-06-24 20:57:48 | 2018-06-24 20:58:04 | 2018-06-24 21:36:02 | 0:37:58 | 0:08:00 | 0:29:58 | mira | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700851 | 2018-06-24 20:57:49 | 2018-06-24 20:58:05 | 2018-06-24 21:42:03 | 0:43:58 | 0:14:15 | 0:29:43 | mira | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2700852 | 2018-06-24 20:57:50 | 2018-06-24 20:58:05 | 2018-06-25 00:38:08 | 3:40:03 | 0:09:31 | 3:30:32 | mira | master | ubuntu | 16.04 | ceph-deploy/ceph-volume/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/ubuntu_latest.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
ceph-deploy: Failed to create osds |
pass | 2700853 | 2018-06-24 20:57:51 | 2018-06-24 20:58:05 | 2018-06-24 21:42:03 | 0:43:58 | 0:07:09 | 0:36:49 | mira | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/disable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2700854 | 2018-06-24 20:57:52 | 2018-06-24 20:58:05 | 2018-06-25 00:34:07 | 3:36:02 | 3:06:46 | 0:29:16 | mira | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
Failure Reason:
Command failed (workunit test ceph-tests/ceph-admin-commands.sh) on mira037 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=wip-luminous-20180622 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-tests/ceph-admin-commands.sh' |
pass | 2700855 | 2018-06-24 20:57:53 | 2018-06-24 20:58:05 | 2018-06-24 21:42:03 | 0:43:58 | 0:14:06 | 0:29:52 | mira | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700856 | 2018-06-24 20:57:55 | 2018-06-24 20:58:05 | 2018-06-24 21:44:03 | 0:45:58 | 0:08:01 | 0:37:57 | mira | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700857 | 2018-06-24 20:57:55 | 2018-06-24 20:58:04 | 2018-06-24 21:32:02 | 0:33:58 | 0:06:42 | 0:27:16 | mira | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/disable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2700858 | 2018-06-24 20:57:56 | 2018-06-24 21:15:54 | 2018-06-25 01:03:57 | 3:48:03 | 3:15:56 | 0:32:07 | mira | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
Failure Reason:
Command failed (workunit test ceph-tests/ceph-admin-commands.sh) on mira067 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=wip-luminous-20180622 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-tests/ceph-admin-commands.sh' |
pass | 2700859 | 2018-06-24 20:57:57 | 2018-06-24 21:17:59 | 2018-06-24 21:53:57 | 0:35:58 | 0:07:48 | 0:28:10 | mira | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700860 | 2018-06-24 20:57:58 | 2018-06-24 21:18:00 | 2018-06-24 22:31:59 | 1:13:59 | 0:07:24 | 1:06:35 | mira | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700861 | 2018-06-24 20:57:59 | 2018-06-24 21:17:59 | 2018-06-24 22:05:58 | 0:47:59 | 0:13:21 | 0:34:38 | mira | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/disable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2700862 | 2018-06-24 20:58:00 | 2018-06-24 21:17:59 | 2018-06-25 01:00:02 | 3:42:03 | 3:07:05 | 0:34:58 | mira | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
Failure Reason:
Command failed (workunit test ceph-tests/ceph-admin-commands.sh) on mira001 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=wip-luminous-20180622 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-tests/ceph-admin-commands.sh' |
pass | 2700863 | 2018-06-24 20:58:01 | 2018-06-24 21:23:38 | 2018-06-24 21:49:38 | 0:26:00 | 0:07:41 | 0:18:19 | mira | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700864 | 2018-06-24 20:58:03 | 2018-06-24 21:25:56 | 2018-06-24 21:53:55 | 0:27:59 | 0:14:08 | 0:13:51 | mira | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2700865 | 2018-06-24 20:58:04 | 2018-06-24 21:27:49 | 2018-06-25 01:01:52 | 3:34:03 | 0:20:22 | 3:13:41 | mira | master | centos | 7.4 | ceph-deploy/ceph-volume/{cluster/4node.yaml config/ceph_volume_filestore.yaml distros/centos_latest.yaml tasks/rbd_import_export.yaml} | 4 | |
Failure Reason:
SELinux denials found on ubuntu@mira065.front.sepia.ceph.com: ['type=AVC msg=audit(1529888152.868:4453): avc: denied { remove_name } for pid=27989 comm="ceph-osd" name="000008.dbtmp" dev="dm-0" ino=268435500 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888152.861:4448): avc: denied { setattr } for pid=27989 comm="ceph-osd" name="xattr_test" dev="dm-0" ino=49 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888152.861:4447): avc: denied { unlink } for pid=27989 comm="ceph-osd" name="fiemap_test" dev="dm-0" ino=49 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888152.786:4440): avc: denied { open } for pid=27989 comm="ceph-osd" path="/var/lib/ceph/osd/ceph-0/type" dev="dm-0" ino=42 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888152.786:4440): avc: denied { read } for pid=27989 comm="ceph-osd" name="type" dev="dm-0" ino=42 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888174.126:4515): avc: denied { open } for pid=28762 comm="ceph-osd" path="/var/lib/ceph/osd/ceph-1/type" dev="dm-1" ino=42 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888161.264:4510): avc: denied { write } for pid=27989 comm="tp_fstore_op" name="meta" dev="dm-0" ino=537288256 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888174.220:4526): avc: denied { getattr } for pid=28762 comm="ceph-osd" path="/var/lib/ceph/osd/ceph-1/current/omap/osd_uuid" dev="dm-1" ino=268435496 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC 
msg=audit(1529888174.218:4522): avc: denied { unlink } for pid=28762 comm="ceph-osd" name="fiemap_test" dev="dm-1" ino=49 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888152.859:4443): avc: denied { write } for pid=27989 comm="ceph-osd" name="fsid" dev="dm-0" ino=37 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888152.868:4453): avc: denied { unlink } for pid=27989 comm="ceph-osd" name="CURRENT" dev="dm-0" ino=268435498 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888152.859:4445): avc: denied { write } for pid=27989 comm="ceph-osd" name="/" dev="dm-0" ino=32 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888287.329:4649): avc: denied { read } for pid=28762 comm="tp_fstore_op" name="1.3_head" dev="dm-1" ino=805306400 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888152.859:4444): avc: denied { lock } for pid=27989 comm="ceph-osd" path="/var/lib/ceph/osd/ceph-0/fsid" dev="dm-0" ino=37 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888152.865:4452): avc: denied { add_name } for pid=27989 comm="ceph-osd" name="000007.sst" scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888152.858:4441): avc: denied { read } for pid=27989 comm="ceph-osd" name="/" dev="dm-0" ino=32 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888152.861:4449): avc: denied { getattr } for pid=27989 comm="ceph-osd" name="xattr_test" dev="dm-0" ino=49 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 
tclass=file', 'type=AVC msg=audit(1529888152.868:4453): avc: denied { rename } for pid=27989 comm="ceph-osd" name="000008.dbtmp" dev="dm-0" ino=268435500 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888152.861:4446): avc: denied { read write open } for pid=27989 comm="ceph-osd" path="/var/lib/ceph/osd/ceph-0/fiemap_test" dev="dm-0" ino=49 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888152.861:4447): avc: denied { remove_name } for pid=27989 comm="ceph-osd" name="fiemap_test" dev="dm-0" ino=49 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888287.331:4650): avc: denied { add_name } for pid=28762 comm="tp_fstore_op" name="__head_00000003__1" scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888174.219:4523): avc: denied { setattr } for pid=28762 comm="ceph-osd" name="xattr_test" dev="dm-1" ino=49 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888174.219:4524): avc: denied { getattr } for pid=28762 comm="ceph-osd" name="xattr_test" dev="dm-1" ino=49 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888174.226:4528): avc: denied { remove_name } for pid=28762 comm="ceph-osd" name="000008.dbtmp" dev="dm-1" ino=268435500 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888161.264:4510): avc: denied { create } for pid=27989 comm="tp_fstore_op" name="inc\\uosdmap.6__0_B65F4796__none" scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888152.861:4446): avc: denied { add_name } for pid=27989 comm="ceph-osd" name="fiemap_test" 
scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888174.215:4517): avc: denied { read } for pid=28762 comm="ceph-osd" name="journal" dev="dm-1" ino=35 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=lnk_file', 'type=AVC msg=audit(1529888174.216:4519): avc: denied { lock } for pid=28762 comm="ceph-osd" path="/var/lib/ceph/osd/ceph-1/fsid" dev="dm-1" ino=37 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888152.863:4451): avc: denied { getattr } for pid=27989 comm="ceph-osd" path="/var/lib/ceph/osd/ceph-0/current/omap/osd_uuid" dev="dm-0" ino=268435496 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888287.323:4647): avc: denied { create } for pid=27989 comm="tp_fstore_op" name="1.11_head" scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888152.872:4454): avc: denied { setattr } for pid=27989 comm="tp_fstore_op" name="meta" dev="dm-0" ino=537288256 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888174.217:4521): avc: denied { create } for pid=28762 comm="ceph-osd" name="fiemap_test" scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888174.226:4528): avc: denied { rename } for pid=28762 comm="ceph-osd" name="000008.dbtmp" dev="dm-1" ino=268435500 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888161.264:4510): avc: denied { add_name } for pid=27989 comm="tp_fstore_op" name="inc\\uosdmap.6__0_B65F4796__none" scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888299.844:4651): 
avc: denied { remove_name } for pid=28762 comm="tp_fstore_op" name="rbd\\udata.107a6b8b4567.0000000000000000__head_3ACE7C5C__1" dev="dm-1" ino=268435650 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888174.217:4521): avc: denied { read write open } for pid=28762 comm="ceph-osd" path="/var/lib/ceph/osd/ceph-1/fiemap_test" dev="dm-1" ino=49 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888152.865:4452): avc: denied { write } for pid=27989 comm="ceph-osd" name="omap" dev="dm-0" ino=268435488 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888161.264:4512): avc: denied { setattr } for pid=27989 comm="tp_fstore_op" name="meta" dev="dm-0" ino=537288256 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888174.215:4516): avc: denied { read } for pid=28762 comm="ceph-osd" name="/" dev="dm-1" ino=32 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888152.858:4442): avc: denied { read } for pid=27989 comm="ceph-osd" name="journal" dev="dm-0" ino=35 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=lnk_file', 'type=AVC msg=audit(1529888174.230:4529): avc: denied { setattr } for pid=28762 comm="tp_fstore_op" name="meta" dev="dm-1" ino=537288256 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888174.223:4527): avc: denied { add_name } for pid=28762 comm="ceph-osd" name="000007.sst" scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888161.264:4513): avc: denied { setattr } for pid=27989 comm="tp_fstore_op" name="inc\\uosdmap.6__0_B65F4796__none" dev="dm-0" 
ino=537288269 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888174.217:4521): avc: denied { add_name } for pid=28762 comm="ceph-osd" name="fiemap_test" scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888174.226:4528): avc: denied { unlink } for pid=28762 comm="ceph-osd" name="CURRENT" dev="dm-1" ino=268435498 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888152.863:4450): avc: denied { read } for pid=27989 comm="ceph-osd" name="current" dev="dm-0" ino=40 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888174.217:4520): avc: denied { write } for pid=28762 comm="ceph-osd" name="/" dev="dm-1" ino=32 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888174.223:4527): avc: denied { write } for pid=28762 comm="ceph-osd" name="omap" dev="dm-1" ino=268435488 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888174.219:4525): avc: denied { read } for pid=28762 comm="ceph-osd" name="current" dev="dm-1" ino=40 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888161.264:4511): avc: denied { read } for pid=27989 comm="tp_fstore_op" name="meta" dev="dm-0" ino=537288256 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888287.329:4648): avc: denied { setattr } for pid=28762 comm="tp_fstore_op" name="1.3_head" dev="dm-1" ino=805306400 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888174.216:4518): avc: denied { write } for pid=28762 comm="ceph-osd" name="fsid" 
dev="dm-1" ino=37 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888152.861:4446): avc: denied { create } for pid=27989 comm="ceph-osd" name="fiemap_test" scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888174.218:4522): avc: denied { remove_name } for pid=28762 comm="ceph-osd" name="fiemap_test" dev="dm-1" ino=49 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888287.329:4648): avc: denied { write } for pid=28762 comm="tp_fstore_op" name="1.3_head" dev="dm-1" ino=805306400 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=dir', 'type=AVC msg=audit(1529888174.126:4515): avc: denied { read } for pid=28762 comm="ceph-osd" name="type" dev="dm-1" ino=42 scontext=system_u:system_r:ceph_t:s0 tcontext=unconfined_u:object_r:unlabeled_t:s0 tclass=file', 'type=AVC msg=audit(1529888161.264:4510): avc: denied { read write open } for pid=27989 comm="tp_fstore_op" path="/var/lib/ceph/osd/ceph-0/current/meta/inc\\uosdmap.6__0_B65F4796__none" dev="dm-0" ino=537288269 scontext=system_u:system_r:ceph_t:s0 tcontext=system_u:object_r:unlabeled_t:s0 tclass=file'] |
fail | 2700866 | 2018-06-24 20:58:06 | 2018-06-24 21:32:16 | 2018-06-25 01:02:20 | 3:30:04 | 3:07:36 | 0:22:28 | mira | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/disable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
Failure Reason:
Command failed (workunit test ceph-tests/ceph-admin-commands.sh) on mira053 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=wip-luminous-20180622 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-tests/ceph-admin-commands.sh' |
pass | 2700867 | 2018-06-24 20:58:07 | 2018-06-24 21:35:52 | 2018-06-24 22:01:51 | 0:25:59 | 0:07:19 | 0:18:40 | mira | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700868 | 2018-06-24 20:58:07 | 2018-06-24 21:36:04 | 2018-06-24 22:12:04 | 0:36:00 | 0:17:00 | 0:19:00 | mira | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700869 | 2018-06-24 20:58:08 | 2018-06-24 21:42:12 | 2018-06-24 22:14:12 | 0:32:00 | 0:10:26 | 0:21:34 | mira | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2700870 | 2018-06-24 20:58:09 | 2018-06-24 21:42:13 | 2018-06-25 01:20:16 | 3:38:03 | 3:08:21 | 0:29:42 | mira | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/disable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
Failure Reason:
Command failed (workunit test ceph-tests/ceph-admin-commands.sh) on mira012 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=wip-luminous-20180622 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-tests/ceph-admin-commands.sh' |
fail | 2700871 | 2018-06-24 20:58:10 | 2018-06-24 21:42:13 | 2018-06-24 22:38:13 | 0:56:00 | 0:12:16 | 0:43:44 | mira | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
Failure Reason:
Command failed on mira002 with status 1: 'sudo tar cz -f /tmp/tmpWYVBG4 -C /var/lib/ceph/mon -- .' |
pass | 2700872 | 2018-06-24 20:58:11 | 2018-06-24 21:44:09 | 2018-06-24 22:08:08 | 0:23:59 | 0:08:05 | 0:15:54 | mira | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700873 | 2018-06-24 20:58:12 | 2018-06-24 21:49:40 | 2018-06-24 22:13:39 | 0:23:59 | 0:09:31 | 0:14:28 | mira | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2700874 | 2018-06-24 20:58:13 | 2018-06-24 21:54:10 | 2018-06-25 01:34:13 | 3:40:03 | 3:13:42 | 0:26:21 | mira | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/disable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
Failure Reason:
Command failed (workunit test ceph-tests/ceph-admin-commands.sh) on mira081 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=wip-luminous-20180622 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-tests/ceph-admin-commands.sh' |
pass | 2700875 | 2018-06-24 20:58:14 | 2018-06-24 21:54:10 | 2018-06-24 22:30:10 | 0:36:00 | 0:07:16 | 0:28:44 | mira | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/filestore-xfs.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700876 | 2018-06-24 20:58:15 | 2018-06-24 22:02:15 | 2018-06-24 22:24:13 | 0:21:58 | 0:07:29 | 0:14:29 | mira | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/bluestore.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700877 | 2018-06-24 20:58:16 | 2018-06-24 22:06:18 | 2018-06-24 23:14:18 | 1:08:00 | 0:14:28 | 0:53:32 | mira | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700878 | 2018-06-24 20:58:17 | 2018-06-24 22:08:23 | 2018-06-25 01:02:26 | 2:54:03 | 0:10:03 | 2:44:00 | mira | master | ubuntu | 16.04 | ceph-deploy/ceph-volume/{cluster/4node.yaml config/ceph_volume_dmcrypt_off.yaml distros/ubuntu_latest.yaml tasks/rbd_import_export.yaml} | 4 | |
fail | 2700879 | 2018-06-24 20:58:18 | 2018-06-24 22:12:08 | 2018-06-25 01:50:11 | 3:38:03 | 3:07:57 | 0:30:06 | mira | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/disable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/bluestore.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
Failure Reason:
Command failed (workunit test ceph-tests/ceph-admin-commands.sh) on mira084 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=wip-luminous-20180622 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-tests/ceph-admin-commands.sh' |
pass | 2700880 | 2018-06-24 20:58:18 | 2018-06-24 22:13:51 | 2018-06-24 22:47:51 | 0:34:00 | 0:06:53 | 0:27:07 | mira | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700881 | 2018-06-24 20:58:19 | 2018-06-24 22:14:13 | 2018-06-24 22:54:13 | 0:40:00 | 0:14:03 | 0:25:57 | mira | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/bluestore.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700882 | 2018-06-24 20:58:20 | 2018-06-24 22:24:33 | 2018-06-24 23:18:32 | 0:53:59 | 0:07:43 | 0:46:16 | mira | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/filestore-xfs.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2700883 | 2018-06-24 20:58:21 | 2018-06-24 22:30:24 | 2018-06-25 02:54:28 | 4:24:04 | 3:07:10 | 1:16:54 | mira | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/disable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/bluestore.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
Failure Reason:
Command failed (workunit test ceph-tests/ceph-admin-commands.sh) on mira002 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=wip-luminous-20180622 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-tests/ceph-admin-commands.sh' |
pass | 2700884 | 2018-06-24 20:58:22 | 2018-06-24 22:32:16 | 2018-06-24 23:02:15 | 0:29:59 | 0:13:11 | 0:16:48 | mira | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700885 | 2018-06-24 20:58:23 | 2018-06-24 22:38:26 | 2018-06-24 23:48:27 | 1:10:01 | 0:08:10 | 1:01:51 | mira | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/bluestore.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700886 | 2018-06-24 20:58:25 | 2018-06-24 22:47:53 | 2018-06-24 23:11:53 | 0:24:00 | 0:08:04 | 0:15:56 | mira | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/ceph_deploy_dmcrypt.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
fail | 2700887 | 2018-06-24 20:58:25 | 2018-06-24 22:54:29 | 2018-06-25 02:38:33 | 3:44:04 | 3:13:07 | 0:30:57 | mira | master | centos | 7.4 | ceph-deploy/basic/{ceph-deploy-overrides/disable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/centos_latest.yaml objectstore/bluestore.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 | |
Failure Reason:
Command failed (workunit test ceph-tests/ceph-admin-commands.sh) on mira112 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=wip-luminous-20180622 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/ceph-tests/ceph-admin-commands.sh' |
pass | 2700888 | 2018-06-24 20:58:31 | 2018-06-24 23:02:34 | 2018-06-24 23:28:33 | 0:25:59 | 0:07:18 | 0:18:41 | mira | master | ubuntu | 14.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_14.04.yaml objectstore/filestore-xfs.yaml python_versions/python_3.yaml tasks/ceph-admin-commands.yaml} | 2 | |
pass | 2700889 | 2018-06-24 20:58:32 | 2018-06-24 23:12:10 | 2018-06-24 23:36:10 | 0:24:00 | 0:07:42 | 0:16:18 | mira | master | ubuntu | 16.04 | ceph-deploy/basic/{ceph-deploy-overrides/enable_dmcrypt_diff_journal_disk.yaml config_options/cephdeploy_conf.yaml distros/ubuntu_latest.yaml objectstore/bluestore.yaml python_versions/python_2.yaml tasks/ceph-admin-commands.yaml} | 2 |
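Note on the recurring "status 124" failures above: each failed workunit command wraps the test script in `timeout 3h ...`, and GNU coreutils `timeout` exits with status 124 when the wrapped command exceeds its time limit. So these jobs did not crash; the `ceph-admin-commands.sh` workunit simply hit the 3-hour cap (consistent with the ~3:07–3:14 durations reported). A minimal sketch reproducing that exit status:

```shell
# GNU timeout(1) kills the wrapped command when the limit expires and
# exits with status 124 -- the same status reported in the failures above.
timeout 2 sleep 5
echo "exit status: $?"   # prints "exit status: 124"
```

This is a generic illustration of the `timeout` exit convention, not a re-run of the actual teuthology workunit.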