Posted | Started | Updated | Runtime | Duration | In Waiting | Machine | Teuthology Branch | OS Type | OS Version | Nodes | Status |
---|---|---|---|---|---|---|---|---|---|---|---|
2017-10-22 02:25:43 | 2017-10-22 02:25:45 | 2017-10-22 02:49:44 | 0:23:59 | 0:12:20 | 0:11:39 | ovh | master | ubuntu | 16.04 | 3 | fail |
Description: upgrade:luminous-x/stress-split/{0-cluster/{openstack.yaml start.yaml} 1-ceph-install/luminous.yaml 2-partial-upgrade/firsthalf.yaml 3-thrash/default.yaml 4-workload/{radosbench.yaml rbd-cls.yaml rbd-import-export.yaml rbd_api.yaml readwrite.yaml snaps-few-objects.yaml} 5-finish-upgrade.yaml 7-final-workload/{rbd-python.yaml rgw-swift.yaml snaps-many-objects.yaml} distros/ubuntu_latest.yaml objectstore/filestore-xfs.yaml thrashosds-health.yaml}
Sentry event: http://sentry.ceph.com/sepia/teuthology/?q=60932c4b8a9a46158fea7ed6c9ed9383
Failure Reason: Command failed on ovh026 with status 100: 'sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install ceph=13.0.0-2235-g6a35e37-1xenial ceph-mds=13.0.0-2235-g6a35e37-1xenial ceph-mgr=13.0.0-2235-g6a35e37-1xenial ceph-common=13.0.0-2235-g6a35e37-1xenial ceph-fuse=13.0.0-2235-g6a35e37-1xenial ceph-test=13.0.0-2235-g6a35e37-1xenial radosgw=13.0.0-2235-g6a35e37-1xenial python-ceph=13.0.0-2235-g6a35e37-1xenial libcephfs2=13.0.0-2235-g6a35e37-1xenial libcephfs-dev=13.0.0-2235-g6a35e37-1xenial libcephfs-java=13.0.0-2235-g6a35e37-1xenial libcephfs-jni=13.0.0-2235-g6a35e37-1xenial librados2=13.0.0-2235-g6a35e37-1xenial librbd1=13.0.0-2235-g6a35e37-1xenial rbd-fuse=13.0.0-2235-g6a35e37-1xenial'