Contribution Details

Type Journal Article
Scope Discipline-based scholarship
Title Bridging the Gap Between Events and Frames Through Unsupervised Domain Adaptation
Authors
  • Nico Messikommer
  • Daniel Gehrig
  • Mathias Gehrig
  • Davide Scaramuzza
Item Subtype Original Work
Refereed Yes
Status Published in final form
Language English
Journal Title IEEE Robotics and Automation Letters
Publisher Institute of Electrical and Electronics Engineers
Geographical Reach international
ISSN 2377-3766
Volume 7
Number 2
Page Range 3515–3522
Date 2022
Abstract Text Reliable perception during fast motion maneuvers or in high-dynamic-range environments is crucial for robotic systems. Since event cameras are robust to these challenging conditions, they have great potential to increase the reliability of robot vision. However, event-based vision has been held back by the shortage of labeled datasets due to the novelty of event cameras. To overcome this drawback, we propose a task transfer method to train models directly with labeled images and unlabeled event data. Compared to previous approaches, (i) our method transfers from single images to events instead of from high-frame-rate videos, and (ii) it does not rely on paired sensor data. To achieve this, we leverage the generative event model to split event features into content and motion features. This split enables efficient matching between the latent spaces of events and images, which is crucial for successful task transfer. Thus, our approach unlocks the vast amount of existing image datasets for the training of event-based neural networks. Our task transfer method consistently outperforms methods targeting Unsupervised Domain Adaptation, improving object detection by 0.26 mAP (a 93% increase) and classification accuracy by 2.7%.
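The abstract's central mechanism, splitting event features into content and motion components and matching only the content features to the image latent space, can be illustrated with a short sketch. The following PyTorch snippet is purely conceptual: the architectures, dimensions, and the simple moment-matching alignment loss are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class EventEncoder(nn.Module):
    # Hypothetical encoder that splits event features into a content part
    # (to be matched with image features) and a motion part (event-specific).
    def __init__(self, in_ch=5, feat_dim=128):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.content_head = nn.Linear(64, feat_dim)  # shared with image domain
        self.motion_head = nn.Linear(64, feat_dim)   # event-specific dynamics

    def forward(self, x):
        h = self.backbone(x)
        return self.content_head(h), self.motion_head(h)

# Image encoder producing content features only (illustrative architecture).
image_encoder = nn.Sequential(
    nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 128),
)

event_encoder = EventEncoder()
task_head = nn.Linear(128, 10)  # e.g. a 10-way classifier

images = torch.randn(4, 3, 64, 64)   # labeled images
labels = torch.randint(0, 10, (4,))  # image-domain labels
events = torch.randn(4, 5, 64, 64)   # unlabeled event voxel grids

content_img = image_encoder(images)
content_evt, motion_evt = event_encoder(events)

# The supervised task loss uses the labeled image domain only.
task_loss = F.cross_entropy(task_head(content_img), labels)

# Stand-in alignment loss pulling the two content distributions together;
# the paper's actual matching objective differs, this is only a placeholder.
align_loss = (content_evt.mean(0) - content_img.mean(0)).pow(2).sum()

(task_loss + 0.1 * align_loss).backward()

At test time, the task head runs on event content features, which is how labels from the image domain transfer to events without paired or labeled event data.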
Digital Object Identifier 10.1109/LRA.2022.3145053
Other Identification Number merlin-id:22182
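The record's fields suffice to assemble a BibTeX entry; the citation key below is illustrative:

@article{Messikommer2022RAL,
  author  = {Messikommer, Nico and Gehrig, Daniel and Gehrig, Mathias and Scaramuzza, Davide},
  title   = {Bridging the Gap Between Events and Frames Through Unsupervised Domain Adaptation},
  journal = {IEEE Robotics and Automation Letters},
  year    = {2022},
  volume  = {7},
  number  = {2},
  pages   = {3515--3522},
  issn    = {2377-3766},
  doi     = {10.1109/LRA.2022.3145053},
}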