Status | Job ID | Links | Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Description | Nodes
fail 3458027 2019-01-13 05:02:17 2019-01-13 05:02:19 2019-01-13 06:00:19 0:58:00 0:11:21 0:46:39 ovh master ubuntu 16.04 smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/ceph-deploy.yaml} 1
Failure Reason:

failed during ceph-deploy cmd: disk zap ovh053:/dev/sdb , ec=1

pass 3458028 2019-01-13 05:02:18 2019-01-13 05:02:19 2019-01-13 06:06:19 1:04:00 0:24:45 0:39:15 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_blogbench.yaml} 3
fail 3458029 2019-01-13 05:02:19 2019-01-13 05:02:20 2019-01-13 07:44:21 2:42:01 0:20:50 2:21:11 ovh master centos 7.5 smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/centos_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} 4
Failure Reason:

ceph-deploy: Failed to zap osds

pass 3458030 2019-01-13 05:02:19 2019-01-13 05:02:20 2019-01-13 06:32:21 1:30:01 0:31:52 0:58:09 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_fsstress.yaml} 3
fail 3458031 2019-01-13 05:02:20 2019-01-13 05:02:21 2019-01-13 06:08:21 1:06:00 0:09:54 0:56:06 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_iozone.yaml} 3
Failure Reason:

{'ovh009.front.sepia.ceph.com': {'_ansible_parsed': True, 'stderr_lines': ['E: Could not get lock /var/lib/dpkg/lock-frontend - open (11: Resource temporarily unavailable)', 'E: Unable to acquire the dpkg frontend lock (/var/lib/dpkg/lock-frontend), is another process using it?'], 'changed': False, '_ansible_no_log': False, 'stdout': '', 'cache_updated': False, 'invocation': {'module_args': {'autoremove': False, 'force': False, 'force_apt_get': False, 'update_cache': None, 'only_upgrade': False, 'deb': None, 'cache_valid_time': 0, 'dpkg_options': 'force-confdef,force-confold', 'upgrade': None, 'name': 'perl-doc', 'package': ['perl-doc'], 'autoclean': False, 'purge': False, 'allow_unauthenticated': False, 'state': 'present', 'default_release': None, 'install_recommends': None}}, 'stderr': 'E: Could not get lock /var/lib/dpkg/lock-frontend - open (11: Resource temporarily unavailable)\nE: Unable to acquire the dpkg frontend lock (/var/lib/dpkg/lock-frontend), is another process using it?\n', 'rc': 100, 'msg': '\'/usr/bin/apt-get -y -o "Dpkg::Options::=--force-confdef" -o "Dpkg::Options::=--force-confold" install \'perl-doc\'\' failed: E: Could not get lock /var/lib/dpkg/lock-frontend - open (11: Resource temporarily unavailable)\nE: Unable to acquire the dpkg frontend lock (/var/lib/dpkg/lock-frontend), is another process using it?\n', 'stdout_lines': [], 'cache_update_time': 1547359281}}

pass 3458032 2019-01-13 05:02:21 2019-01-13 05:02:22 2019-01-13 06:24:22 1:22:00 0:19:18 1:02:42 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/cfuse_workunit_suites_pjd.yaml} 3
pass 3458033 2019-01-13 05:02:21 2019-01-13 05:02:23 2019-01-13 06:32:23 1:30:00 0:19:06 1:10:54 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_direct_io.yaml} 3
pass 3458034 2019-01-13 05:02:22 2019-01-13 05:02:23 2019-01-13 06:42:24 1:40:01 0:46:52 0:53:09 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_dbench.yaml} 3
pass 3458035 2019-01-13 05:02:23 2019-01-13 05:02:24 2019-01-13 06:32:25 1:30:01 0:33:21 0:56:40 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_fsstress.yaml} 3
pass 3458036 2019-01-13 05:02:24 2019-01-13 05:02:25 2019-01-13 06:08:25 1:06:00 0:18:27 0:47:33 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/kclient_workunit_suites_pjd.yaml} 3
pass 3458037 2019-01-13 05:02:24 2019-01-13 05:02:25 2019-01-13 06:02:25 1:00:00 0:22:22 0:37:38 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/libcephfs_interface_tests.yaml} 3
fail 3458038 2019-01-13 05:02:25 2019-01-13 05:02:26 2019-01-13 06:12:26 1:10:00 0:10:17 0:59:43 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/mon_thrash.yaml} 3
Failure Reason:

{'ovh081.front.sepia.ceph.com': {'_ansible_parsed': True, 'stderr_lines': ['E: Could not get lock /var/lib/dpkg/lock-frontend - open (11: Resource temporarily unavailable)', 'E: Unable to acquire the dpkg frontend lock (/var/lib/dpkg/lock-frontend), is another process using it?'], 'changed': False, '_ansible_no_log': False, 'stdout': '', 'cache_updated': False, 'invocation': {'module_args': {'autoremove': False, 'force': False, 'force_apt_get': False, 'update_cache': None, 'only_upgrade': False, 'deb': None, 'cache_valid_time': 0, 'dpkg_options': 'force-confdef,force-confold', 'upgrade': None, 'name': 'ntp', 'package': ['ntp'], 'autoclean': False, 'purge': False, 'allow_unauthenticated': False, 'state': 'present', 'default_release': None, 'install_recommends': None}}, 'stderr': 'E: Could not get lock /var/lib/dpkg/lock-frontend - open (11: Resource temporarily unavailable)\nE: Unable to acquire the dpkg frontend lock (/var/lib/dpkg/lock-frontend), is another process using it?\n', 'rc': 100, 'msg': '\'/usr/bin/apt-get -y -o "Dpkg::Options::=--force-confdef" -o "Dpkg::Options::=--force-confold" install \'ntp\'\' failed: E: Could not get lock /var/lib/dpkg/lock-frontend - open (11: Resource temporarily unavailable)\nE: Unable to acquire the dpkg frontend lock (/var/lib/dpkg/lock-frontend), is another process using it?\n', 'stdout_lines': [], 'cache_update_time': 1547358850}}

pass 3458039 2019-01-13 05:02:26 2019-01-13 05:02:27 2019-01-13 06:28:27 1:26:00 0:35:52 0:50:08 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_api_tests.yaml} 3
pass 3458040 2019-01-13 05:02:26 2019-01-13 05:02:28 2019-01-13 06:46:28 1:44:00 0:56:22 0:47:38 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_bench.yaml} 3
pass 3458041 2019-01-13 05:02:27 2019-01-13 05:02:28 2019-01-13 06:48:29 1:46:01 0:59:41 0:46:20 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cache_snaps.yaml} 3
fail 3458042 2019-01-13 05:02:28 2019-01-13 05:02:29 2019-01-13 06:54:30 1:52:01 0:17:58 1:34:03 ovh master ubuntu 16.04 smoke/systemd/{clusters/{fixed-4.yaml openstack.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml tasks/systemd.yaml} 4
Failure Reason:

ceph-deploy: Failed to zap osds

pass 3458043 2019-01-13 05:02:28 2019-01-13 05:02:29 2019-01-13 06:14:30 1:12:01 0:21:37 0:50:24 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_cls_all.yaml} 3
pass 3458044 2019-01-13 05:02:29 2019-01-13 05:30:29 2019-01-13 06:56:29 1:26:00 0:28:41 0:57:19 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_ec_snaps.yaml} 3
pass 3458045 2019-01-13 05:02:30 2019-01-13 05:32:31 2019-01-13 06:32:31 1:00:00 0:21:45 0:38:15 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_python.yaml} 3
pass 3458046 2019-01-13 05:02:31 2019-01-13 05:36:29 2019-01-13 06:40:29 1:04:00 0:32:13 0:31:47 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rados_workunit_loadgen_mix.yaml} 3
pass 3458047 2019-01-13 05:02:31 2019-01-13 05:36:56 2019-01-13 07:16:57 1:40:01 0:54:35 0:45:26 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_api_tests.yaml} 3
pass 3458048 2019-01-13 05:02:32 2019-01-13 05:38:47 2019-01-13 06:46:47 1:08:00 0:18:19 0:49:41 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_cli_import_export.yaml} 3
pass 3458049 2019-01-13 05:02:33 2019-01-13 05:42:47 2019-01-13 06:54:47 1:12:00 0:21:13 0:50:47 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_fsx.yaml} 3
pass 3458050 2019-01-13 05:02:33 2019-01-13 05:42:47 2019-01-13 07:04:47 1:22:00 0:27:53 0:54:07 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_python_api_tests.yaml} 3
pass 3458051 2019-01-13 05:02:34 2019-01-13 05:44:28 2019-01-13 07:48:29 2:04:01 1:32:39 0:31:22 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rbd_workunit_suites_iozone.yaml} 3
fail 3458052 2019-01-13 05:02:35 2019-01-13 05:44:32 2019-01-13 07:08:33 1:24:01 1:05:57 0:18:04 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_ec_s3tests.yaml} 3
Failure Reason:

"2019-01-13 06:47:41.424798 mon.a (mon.0) 176 : cluster [WRN] overall HEALTH_WARN Degraded data redundancy: 539/4440 objects degraded (12.140%), 49 pgs degraded" in cluster log

pass 3458053 2019-01-13 05:02:36 2019-01-13 05:46:47 2019-01-13 07:18:48 1:32:01 1:07:56 0:24:05 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_s3tests.yaml} 3
pass 3458054 2019-01-13 05:02:36 2019-01-13 05:46:47 2019-01-13 06:32:47 0:46:00 0:21:49 0:24:11 ovh master smoke/basic/{clusters/{fixed-3-cephfs.yaml openstack.yaml} objectstore/bluestore-bitmap.yaml tasks/rgw_swift.yaml} 3
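As a rough illustration only (not part of the teuthology output), the sketch below shows how one result row above maps onto the header columns, assuming whitespace-separated fields. The names Job and parse_row are hypothetical helpers, not any official teuthology/pulpito API.

from collections import namedtuple

# Illustrative field names that mirror the header columns of this report,
# not an official teuthology/pulpito schema.
Job = namedtuple(
    "Job",
    "status job_id posted started updated runtime duration in_waiting "
    "machine teuthology_branch os_type os_version description nodes",
)

def parse_row(row):
    """Split one result row from this report into its columns."""
    t = row.split()
    posted = " ".join(t[2:4])   # date + time pairs
    started = " ".join(t[4:6])
    updated = " ".join(t[6:8])
    # OS Type / OS Version appear only on some rows (the ceph-deploy and
    # systemd jobs above); in this report the description always begins
    # with the suite path ("smoke/...").
    if t[13].startswith("smoke/"):
        os_type, os_version, desc_tokens = "", "", t[13:-1]
    else:
        os_type, os_version, desc_tokens = t[13], t[14], t[15:-1]
    # The description is re-joined because its brace groups contain spaces.
    return Job(t[0], t[1], posted, started, updated, t[8], t[9], t[10],
               t[11], t[12], os_type, os_version,
               " ".join(desc_tokens), int(t[-1]))

if __name__ == "__main__":
    sample = ("fail 3458027 2019-01-13 05:02:17 2019-01-13 05:02:19 "
              "2019-01-13 06:00:19 0:58:00 0:11:21 0:46:39 ovh master ubuntu 16.04 "
              "smoke/1node/{clusters/{fixed-1.yaml openstack.yaml} "
              "distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml "
              "tasks/ceph-deploy.yaml} 1")
    print(parse_row(sample))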