
Open Access

Real-Time Monocular Depth Estimation Using Synthetic Data with Domain Adaptation via Image Style Transfer

Lookup NU author(s): Dr Amir Atapour Abarghouei (ORCiD)


Licence

This is the authors' accepted manuscript of a conference proceedings (inc. abstract) that has been published in its final definitive form by IEEE, 2018.

For re-use rights please refer to the publisher's terms and conditions.


Abstract

Monocular depth estimation using learning-based approaches has shown considerable promise in recent years. However, most monocular depth estimators either rely on large quantities of ground truth depth data, which is extremely expensive and difficult to obtain, or predict disparity as an intermediate step using a secondary supervisory signal, which leads to blurring and other artefacts. Training a depth estimation model on pixel-perfect synthetic data can resolve most of these issues, but introduces the problem of domain bias: the inability to apply a model trained on synthetic data to real-world scenarios. Exploiting recent advances in image style transfer and its connection to domain adaptation (via Maximum Mean Discrepancy), we combine style transfer with adversarial training to predict pixel-perfect depth from a single real-world color image, based on training over a large corpus of synthetic environment data. Experimental results demonstrate the efficacy of our approach compared to contemporary state-of-the-art techniques.
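
The parenthetical mention of Maximum Mean Discrepancy (MMD) refers to the known result that matching deep feature statistics between two image sets, as neural style transfer does, amounts to minimising an MMD between the two domains. For samples X = {x_1, ..., x_m} and Y = {y_1, ..., y_n} and a kernel k, the biased squared empirical MMD estimate is, in LaTeX notation:

\mathrm{MMD}^2(X, Y) = \frac{1}{m^2}\sum_{i,j} k(x_i, x_j) + \frac{1}{n^2}\sum_{i,j} k(y_i, y_j) - \frac{2}{mn}\sum_{i,j} k(x_i, y_j)

As a rough sketch only (this is not the authors' released implementation; the architectures and names below are hypothetical placeholders), the inference pipeline the abstract describes can be illustrated in PyTorch as two stages: a style-transfer generator maps a real-world RGB image into the synthetic domain, and a depth network trained purely on synthetic image/depth pairs then predicts a dense depth map.

    import torch
    import torch.nn as nn

    class StyleTransferNet(nn.Module):
        # Placeholder generator: maps real-domain RGB into the synthetic domain.
        def __init__(self):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(64, 3, 3, padding=1), nn.Tanh(),
            )

        def forward(self, x):
            return self.body(x)

    class DepthNet(nn.Module):
        # Placeholder estimator: trained on synthetic image/depth pairs only.
        def __init__(self):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(64, 1, 3, padding=1),  # single-channel depth map
            )

        def forward(self, x):
            return self.body(x)

    @torch.no_grad()
    def estimate_depth(real_rgb: torch.Tensor) -> torch.Tensor:
        # Two-stage inference: domain-adapt the input, then predict depth.
        g, d = StyleTransferNet().eval(), DepthNet().eval()
        synthetic_style = g(real_rgb)   # real domain -> synthetic domain
        return d(synthetic_style)       # dense depth at the input resolution

    if __name__ == "__main__":
        frame = torch.rand(1, 3, 256, 512)  # stand-in for a real RGB frame
        print(estimate_depth(frame).shape)  # torch.Size([1, 1, 256, 512])

Because the depth network only ever sees synthetic-style inputs, the style-transfer stage is what closes the synthetic-to-real domain gap at test time.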


Publication metadata

Author(s): Atapour-Abarghouei A, Breckon TP

Publication type: Conference Proceedings (inc. Abstract)

Publication status: Published

Conference Name: 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition

Year of Conference: 2018

Pages: 2800-2810

Online publication date: 17/12/2018

Acceptance date: 19/02/2018

Date deposited: 06/02/2021

ISSN: 2575-7075

Publisher: IEEE

URL: https://doi.org/10.1109/CVPR.2018.00296

DOI: 10.1109/CVPR.2018.00296

ISBN: 9781538664209

