Contribution Details

Type Journal Article
Scope Discipline-based scholarship
Title Continuous-Time Visual-Inertial Odometry for Event Cameras
Authors
  • Elias Mueggler
  • Guillermo Gallego
  • Henri Rebecq
  • Davide Scaramuzza
Item Subtype Original Work
Refereed Yes
Status Published in final form
Language
  • English
Journal Title IEEE Transactions on Robotics
Publisher Institute of Electrical and Electronics Engineers
Geographical Reach international
ISSN 1552-3098
Volume 34
Number 6
Page Range 1425–1440
Date 2019
Abstract Text Event cameras are bio-inspired vision sensors that output pixel-level brightness changes instead of standard intensity frames. They offer significant advantages over standard cameras, namely a very high dynamic range, no motion blur, and a latency on the order of microseconds. However, due to the fundamentally different structure of the sensor's output, new algorithms that exploit the high temporal resolution and the asynchronous nature of the sensor are required. Recent work has shown that a continuous-time representation of the event camera pose can deal with the high temporal resolution and asynchronous nature of this sensor in a principled way. In this paper, we leverage such a continuous-time representation to perform visual-inertial odometry with an event camera. This representation allows direct integration of the asynchronous events with microsecond accuracy and of the inertial measurements at high frequency. The event camera trajectory is approximated by a smooth curve in the space of rigid-body motions using cubic splines. This formulation significantly reduces the number of variables in trajectory estimation problems. We evaluate our method on real data from several scenes and compare the results against ground truth from a motion-capture system. We show that our method provides improved accuracy over a state-of-the-art visual odometry method for event cameras. We also show that both the map orientation and scale can be recovered accurately by fusing events and inertial data. To the best of our knowledge, this is the first work on visual-inertial fusion with event cameras using a continuous-time framework.
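The abstract's core idea, approximating the camera trajectory by a smooth cubic spline over rigid-body motions, is commonly realized with a cumulative cubic B-spline over pose control points. Below is a minimal, hedged sketch of that interpolation scheme, using the split rotation/translation (SO(3) × R³) parameterization for simplicity rather than the paper's exact formulation; all function names are illustrative, not from the paper's code.

```python
import numpy as np

def hat(w):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def so3_exp(w):
    """Rodrigues' formula: axis-angle vector -> rotation matrix."""
    th = np.linalg.norm(w)
    if th < 1e-10:
        return np.eye(3) + hat(w)
    K = hat(w / th)
    return np.eye(3) + np.sin(th) * K + (1.0 - np.cos(th)) * (K @ K)

def so3_log(R):
    """Inverse of so3_exp: rotation matrix -> axis-angle vector."""
    th = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if th < 1e-10:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return th * w / (2.0 * np.sin(th))

# Cumulative basis matrix for a uniform cubic B-spline
# (rows give B~_0..B~_3 as polynomials in [1, u, u^2, u^3]).
C = np.array([[6, 0, 0, 0],
              [5, 3, -3, 1],
              [1, 3, 3, -2],
              [0, 0, 0, 1]], dtype=float) / 6.0

def cumulative_basis(u):
    return C @ np.array([1.0, u, u * u, u ** 3])

def spline_pose(Rs, ts, u):
    """Interpolated pose at fraction u in [0,1) of the segment
    defined by four consecutive control poses (Rs[i], ts[i])."""
    B = cumulative_basis(u)          # B[0] is always 1
    R, t = Rs[0].copy(), ts[0].copy()
    for k in range(1, 4):
        # Blend the incremental motion between consecutive control poses.
        R = R @ so3_exp(B[k] * so3_log(Rs[k - 1].T @ Rs[k]))
        t = t + B[k] * (ts[k] - ts[k - 1])
    return R, t
```

With control poses sampled from a constant-velocity motion, the interpolated pose advances linearly in the spline parameter, which is one easy sanity check of the blending. In a full visual-inertial pipeline, this smooth parameterization is what lets each asynchronous event and each IMU sample be evaluated at its exact timestamp with a small, fixed number of control-pose variables.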
Official URL http://rpg.ifi.uzh.ch/docs/TRO18_Mueggler.pdf
Digital Object Identifier 10.1109/TRO.2018.2858287
Other Identification Number merlin-id:18689
Additional Information © 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.