
Contribution Details

Type Conference or Workshop Paper
Scope Discipline-based scholarship
Published in Proceedings Yes
Title Event-Based Angular Velocity Regression with Spiking Networks
Organization Unit
Authors
  • Mathias Gehrig
  • Sumit Bam Shrestha
  • Daniel Mouritzen
  • Davide Scaramuzza
Presentation Type paper
Item Subtype Original Work
Refereed Yes
Status Published in final form
Language
  • English
ISBN 978-1-7281-7395-5
Page Range 4195 - 4202
Event Title 2020 IEEE International Conference on Robotics and Automation (ICRA)
Event Type conference
Event Location Paris, France
Event Start Date July 1, 2020
Event End Date October 1, 2020
Publisher IEEE
Abstract Text Spiking Neural Networks (SNNs) are bio-inspired networks that process information conveyed as temporal spikes rather than numeric values. An example of a sensor providing such data is the event camera. It only produces an event when a pixel reports a significant brightness change. Similarly, the spiking neuron of an SNN only produces a spike whenever a significant number of spikes occur within a short period of time. Due to their spike-based computational model, SNNs can process output from event-based, asynchronous sensors without any pre-processing at extremely low power, unlike standard artificial neural networks. This is possible due to specialized neuromorphic hardware that implements the highly parallelizable concept of SNNs in silicon. Yet, SNNs have not enjoyed the same rise in popularity as artificial neural networks. This stems not only from the fact that their input format is rather unconventional but also from the challenges in training spiking networks. Despite their temporal nature and recent algorithmic advances, they have been mostly evaluated on classification problems. We propose, for the first time, a temporal regression problem of numerical values given events from an event camera. We specifically investigate the prediction of the 3-DOF angular velocity of a rotating event camera with an SNN. The difficulty of this problem arises from the prediction of angular velocities continuously in time directly from irregular, asynchronous event-based input. Directly utilising the output of event cameras without any pre-processing ensures that we inherit all the benefits that they provide over conventional cameras, that is, high temporal resolution, high dynamic range, and no motion blur. To assess the performance of SNNs on this task, we introduce a synthetic event-camera dataset generated from real-world panoramic images and show that we can successfully train an SNN to perform angular velocity regression.
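The spiking-neuron behaviour the abstract describes (a neuron fires only when enough input spikes arrive within a short window) can be sketched with a simple leaky integrate-and-fire model. This is an illustrative toy, not the network or training method used in the paper; the function name and all parameter values (`tau`, `w`, `v_th`) are hypothetical choices for the sketch:

```python
import numpy as np

def lif_simulate(spike_times, t_end=1.0, dt=1e-3, tau=0.02, w=0.3, v_th=1.0):
    """Simulate one leaky integrate-and-fire neuron driven by input spikes.

    The membrane potential decays exponentially toward zero with time
    constant `tau` and jumps by weight `w` on each input spike; an output
    spike is emitted (and the potential reset) when it crosses `v_th`.
    """
    n_steps = int(round(t_end / dt))
    # Bin the asynchronous input spikes onto the simulation grid.
    input_counts = np.zeros(n_steps)
    for t in spike_times:
        idx = int(round(t / dt))
        if 0 <= idx < n_steps:
            input_counts[idx] += 1
    decay = np.exp(-dt / tau)  # per-step membrane leak
    v = 0.0
    out_spikes = []
    for i in range(n_steps):
        v = v * decay + w * input_counts[i]
        if v >= v_th:
            out_spikes.append(i * dt)
            v = 0.0  # reset after firing
    return out_spikes

# A burst of closely spaced input spikes drives the neuron above threshold,
# while sparse, isolated spikes leak away without producing any output.
burst = [0.100, 0.101, 0.102, 0.103]
sparse = [0.5, 0.7, 0.9]
print(lif_simulate(burst + sparse))
```

With these settings the burst produces a single output spike and the sparse inputs produce none, mirroring the "significant number of spikes within a short period" condition described above.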
Related URLs
Digital Object Identifier 10.1109/icra40945.2020.9197133
Other Identification Number merlin-id:20310