Contribution Details

Type Journal Article
Scope Discipline-based scholarship
Title Dense Continuous-Time Optical Flow from Event Cameras
Organization Unit
Authors
  • Mathias Gehrig
  • Manasi Muglikar
  • Davide Scaramuzza
Item Subtype Original Work
Refereed Yes
Status Published in final form
Language
  • English
Journal Title IEEE Transactions on Pattern Analysis and Machine Intelligence
Publisher Institute of Electrical and Electronics Engineers
Geographical Reach international
ISSN 0162-8828
Page Range 1 - 12
Date 2024
Abstract Text We present a method for estimating dense continuous-time optical flow from event data. Traditional dense optical flow methods compute the pixel displacement between two images. Due to missing information, these approaches cannot recover the pixel trajectories in the blind time between two images. In this work, we show that it is possible to compute per-pixel, continuous-time optical flow using events from an event camera. Events provide temporally fine-grained information about movement in pixel space due to their asynchronous nature and microsecond response time. We leverage these benefits to predict pixel trajectories densely in continuous time via parameterized Bézier curves. To achieve this, we build a neural network with strong inductive biases for this task: First, we build multiple sequential correlation volumes in time using event data. Second, we use Bézier curves to index these correlation volumes at multiple timestamps along the trajectory. Third, we use the retrieved correlation to update the Bézier curve representations iteratively. Our method can optionally include image pairs to boost performance further. To the best of our knowledge, our model is the first method that can regress dense pixel trajectories from event data. To train and evaluate our model, we introduce a synthetic dataset (MultiFlow) that features moving objects and ground truth trajectories for every pixel. Our quantitative experiments not only suggest that our method successfully predicts pixel trajectories in continuous time but also that it is competitive in the traditional two-view pixel displacement metric on MultiFlow and DSEC-Flow. Open source code and datasets are released to the public.
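Illustration (not part of the record): the abstract describes representing each pixel's continuous-time trajectory as a parameterized Bézier curve that can be queried at arbitrary timestamps. The snippet below is a minimal sketch of that idea only, assuming Bernstein-form evaluation of per-pixel control points; the function name, array shapes, and curve degree are illustrative assumptions, not the authors' released implementation.

    import numpy as np
    from math import comb

    def bezier_trajectory(control_points, timestamps):
        """Evaluate per-pixel Bezier trajectories at given normalized times.

        control_points: array (N+1, H, W, 2), the N+1 control offsets (pixels)
            of a degree-N Bezier curve for every pixel location (assumed layout).
        timestamps: 1-D sequence of times t in [0, 1].
        Returns: array (T, H, W, 2) with each pixel's displacement at each time.
        """
        n = control_points.shape[0] - 1                     # curve degree
        t = np.asarray(timestamps, dtype=np.float32)[:, None, None, None]
        # Bernstein basis: B_i^n(t) = C(n, i) * t^i * (1 - t)^(n - i)
        basis = [comb(n, i) * t**i * (1.0 - t)**(n - i) for i in range(n + 1)]
        return sum(b * cp for b, cp in zip(basis, control_points))

    # Usage sketch: cubic curves on a 4x5 pixel grid, queried at three times.
    cps = np.random.randn(4, 4, 5, 2).astype(np.float32)   # hypothetical control points
    flow = bezier_trajectory(cps, timestamps=[0.25, 0.5, 1.0])
    print(flow.shape)  # (3, 4, 5, 2)

In the paper's pipeline these curve parameters are refined iteratively by sampling event-based correlation volumes at multiple timestamps along the trajectory; the sketch covers only the curve evaluation step.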
Digital Object Identifier 10.1109/TPAMI.2024.3361671