Contribution Details

Type Journal Article
Scope Discipline-based scholarship
Title Event-based, 6-DOF Camera Tracking from Photometric Depth Maps
Organization Unit
Authors
  • Guillermo Gallego
  • Jon E A Lund
  • Elias Müggler
  • Henri Rebecq
  • Tobi Delbruck
  • Davide Scaramuzza
Item Subtype Original Work
Refereed Yes
Status Published in final form
Language
  • English
Journal Title IEEE Transactions on Pattern Analysis and Machine Intelligence
Publisher Institute of Electrical and Electronics Engineers
Geographical Reach international
ISSN 0162-8828
Volume 1
Number 1
Page Range 1 - 10
Date 2017
Abstract Text Event cameras are bio-inspired vision sensors that output pixel-level brightness changes instead of standard intensity frames. These cameras do not suffer from motion blur and have a very high dynamic range, which enables them to provide reliable visual information during high-speed motions or in scenes characterized by high dynamic range. These features, along with a very low power consumption, make event cameras an ideal complement to standard cameras for VR/AR and video game applications. With these applications in mind, this paper tackles the problem of accurate, low-latency tracking of an event camera from an existing photometric depth map (i.e., intensity plus depth information) built via classic dense reconstruction pipelines. Our approach tracks the 6-DOF pose of the event camera upon the arrival of each event, thus virtually eliminating latency. We successfully evaluate the method in both indoor and outdoor scenes and show that—because of the technological advantages of the event camera—our pipeline works in scenes characterized by high-speed motion, which are still inaccessible to standard cameras.
Free access at Official URL
Official URL http://rpg.ifi.uzh.ch/docs/PAMI17_Gallego.pdf
Digital Object Identifier 10.1109/TPAMI.2017.2769655
Other Identification Number merlin-id:16261