Contribution Details
Type | Journal Article |
Scope | Discipline-based scholarship |
Title | Learning Depth With Very Sparse Supervision |
Organization Unit | |
Authors | |
Item Subtype | Original Work |
Refereed | Yes |
Status | Published in final form |
Language | |
Journal Title | IEEE Robotics and Automation Letters |
Publisher | Institute of Electrical and Electronics Engineers |
Geographical Reach | International |
ISSN | 2377-3766 |
Volume | 5 |
Number | 4 |
Page Range | 5542 - 5549 |
Date | 2020 |
Abstract Text | Motivated by the astonishing capabilities of natural intelligent agents and inspired by theories from psychology, this paper explores the idea that perception gets coupled to 3D properties of the world via interaction with the environment. Existing works for depth estimation require either massive amounts of annotated training data or some form of hard-coded geometrical constraint. This paper explores a new approach to learning depth perception requiring neither of those. Specifically, we propose a novel global-local network architecture that can be trained with the data observed by a robot exploring an environment: images and extremely sparse depth measurements, down to even a single pixel per image. From a pair of consecutive images, the proposed network outputs a latent representation of the camera's and scene's parameters, and a dense depth map. Experiments on several datasets show that, when ground truth is available even for just one of the image pixels, the proposed network can learn monocular dense depth estimation up to 22.5% more accurately than state-of-the-art approaches. We believe that this work, in addition to its scientific interest, lays the foundations to learn depth with extremely sparse supervision, which can be valuable to all robotic systems acting under severe bandwidth or sensing constraints. |
Related URLs | |
Digital Object Identifier | 10.1109/LRA.2020.3009067 |
Other Identification Number | merlin-id:20319 |
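The abstract describes training a dense depth network from extremely sparse depth measurements, down to a single supervised pixel per image. The record does not reproduce the paper's loss or architecture; the following is only a minimal sketch of what a sparse-supervision term could look like, assuming a masked L1 (mean absolute error) objective restricted to the measured pixels. The function name, array shapes, and the choice of L1 are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sparse_depth_loss(pred, gt, mask):
    """Mean absolute depth error, computed only at supervised pixels.

    pred: (H, W) predicted dense depth map
    gt:   (H, W) ground-truth depth, valid only where mask is True
    mask: (H, W) boolean mask of measured pixels (possibly a single one)
    """
    assert mask.any(), "need at least one supervised pixel"
    return float(np.abs(pred[mask] - gt[mask]).mean())

# Extreme case from the abstract: one measured depth value per image.
pred = np.full((4, 4), 2.0)   # hypothetical network output: constant 2 m
gt = np.zeros((4, 4))
mask = np.zeros((4, 4), dtype=bool)
gt[1, 2] = 3.5                # single sparse depth measurement
mask[1, 2] = True
print(sparse_depth_loss(pred, gt, mask))  # 1.5
```

In this toy example the loss is simply the absolute error at the one measured pixel; the dense prediction elsewhere receives no direct supervision, which is why the paper pairs such a signal with consecutive-image inputs and a learned latent representation of the camera and scene.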