
Contribution Details

Type Conference or Workshop Paper
Scope Discipline-based scholarship
Published in Proceedings Yes
Title Event-based, Direct Camera Tracking from a Photometric 3D Map using Nonlinear Optimization
Organization Unit
Authors
  • Samuel Bryner
  • Guillermo Gallego
  • Henri Rebecq
  • Davide Scaramuzza
Presentation Type paper
Item Subtype Original Work
Refereed Yes
Status Published in final form
Language
  • English
ISBN 978-1-5386-6027-0
Page Range 325 - 331
Event Title 2019 International Conference on Robotics and Automation (ICRA)
Event Type conference
Event Location Montreal, QC, Canada
Event Start Date May 20, 2019
Event End Date May 24, 2019
Publisher IEEE
Abstract Text Event cameras are novel bio-inspired vision sensors that output pixel-level intensity changes, called “events”, instead of traditional video images. These asynchronous sensors naturally respond to motion in the scene with very low latency (microseconds) and have a very high dynamic range. These features, together with very low power consumption, make event cameras an ideal sensor for fast robot localization and wearable applications, such as AR/VR and gaming. With these applications in mind, we present a method to track the 6-DOF pose of an event camera in a known environment, which we assume is described by a photometric 3D map (i.e., intensity plus depth information) built via classic dense 3D reconstruction algorithms. Our approach uses the raw events directly, without intermediate features, within a maximum-likelihood framework to estimate the camera motion that best explains the events via a generative model. We successfully evaluate the method using both simulated and real data, and show improved results over the state of the art. We release the datasets to the public to foster reproducibility and research on this topic.
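The generative model sketched in the abstract can be illustrated with a toy example: an event camera fires an event whenever the log-intensity seen at a pixel changes by a contrast threshold C since the last event, and the camera motion is recovered by minimizing the mismatch between the brightness change reconstructed from the events and the change predicted from the known map under a candidate motion. The one-dimensional "map", the threshold value, and the brute-force search below are illustrative assumptions only, not the paper's implementation (which uses a full photometric 3D map, 6-DOF poses, and nonlinear optimization).

```python
import numpy as np

# Toy 1D "photometric map": known log-intensity as a function of position.
def log_intensity(x):
    return np.sin(x)

C = 0.1  # contrast threshold of the event camera (assumed value)

def simulate_events(x0, velocity, duration, dt=1e-3):
    """Generate events for a sensor sliding over the map at `velocity`.

    An event (timestamp, polarity) fires whenever the log-intensity at
    the sensor's position has changed by the threshold C since the last
    event -- the standard event-camera generative model.
    """
    events = []
    ref = log_intensity(x0)
    t = 0.0
    while t < duration:
        t += dt
        cur = log_intensity(x0 + velocity * t)
        if abs(cur - ref) >= C:
            events.append((t, np.sign(cur - ref)))
            ref = cur
    return events

def neg_log_likelihood(velocity, events, x0):
    """Score a candidate velocity against the observed events.

    The events reconstruct the brightness change as a running sum of
    polarity * C; the candidate motion predicts the change from the map.
    Their squared mismatch is a crude surrogate for the paper's
    maximum-likelihood objective.
    """
    acc = 0.0    # event-based reconstruction of the log-intensity change
    resid = 0.0
    for t, p in events:
        acc += p * C
        pred = log_intensity(x0 + velocity * t) - log_intensity(x0)
        resid += (acc - pred) ** 2
    return resid

# "Observed" events from the true motion, then a brute-force search over
# candidate velocities (the paper instead uses nonlinear optimization).
true_v = 2.0
obs = simulate_events(0.0, true_v, duration=1.0)
candidates = np.linspace(0.5, 4.0, 36)
best = min(candidates, key=lambda v: neg_log_likelihood(v, obs, 0.0))
print(best)
```

The search recovers the candidate closest to the true velocity because only the correct motion makes the map-predicted brightness changes agree with the event stream at every event timestamp; gradient-based nonlinear optimization of the same objective replaces the grid search in the actual method.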
Digital Object Identifier 10.1109/icra.2019.8794255
Other Identification Number merlin-id:20285