Voldor: Visual odometry from log-logistic dense optical flow residuals

Zhixiang Min, Yiding Yang, Enrique Dunn

Research output: Contribution to journal › Conference article › peer-review

37 Scopus citations

Abstract

We propose a dense indirect visual odometry method that takes externally estimated optical flow fields as input instead of hand-crafted feature correspondences. We define our problem as a probabilistic model and develop a generalized-EM formulation for the joint inference of camera motion, pixel depth, and motion-track confidence. Contrary to traditional methods assuming Gaussian-distributed observation errors, we supervise our inference framework under an (empirically validated) adaptive log-logistic distribution model. Moreover, the log-logistic residual model generalizes well to different state-of-the-art optical flow methods, making our approach modular and agnostic to the choice of optical flow estimators. Our method achieved top-ranking results on both the TUM RGB-D and KITTI odometry benchmarks. Our open-source implementation is inherently GPU-friendly with only linear computational and storage growth.
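The abstract's central modeling choice is that dense optical flow residual magnitudes follow an adaptive log-logistic (Fisk) distribution rather than a Gaussian. The sketch below is only an illustration of how such a density could turn per-pixel residuals into soft track confidences in an EM-style update; the function names, parameter values, and the uniform-outlier mixture are assumptions for demonstration, not the paper's exact formulation.

```python
import numpy as np

def log_logistic_pdf(r, alpha, beta):
    """Log-logistic (Fisk) density for non-negative residual magnitudes r.

    alpha > 0 is the scale parameter, beta > 0 the shape parameter.
    """
    z = (r / alpha) ** beta
    return (beta / alpha) * (r / alpha) ** (beta - 1) / (1.0 + z) ** 2

def track_confidence(residuals, alpha, beta, outlier_density=1e-3):
    """Illustrative E-step-style weight: posterior probability that each flow
    residual is an inlier under the log-logistic model, mixed with a uniform
    outlier density (a hypothetical mixture, not the paper's exact rule)."""
    inlier = log_logistic_pdf(residuals, alpha, beta)
    return inlier / (inlier + outlier_density)

# Example: residual magnitudes (in pixels) between observed flow and the
# rigid flow predicted from the current camera-motion and depth estimates.
residuals = np.array([0.2, 0.5, 1.0, 3.0, 10.0])
print(track_confidence(residuals, alpha=0.8, beta=2.0))
```

In this toy setting, small residuals receive confidences near one while large residuals are downweighted, which is the qualitative behavior the abstract attributes to supervising inference with a heavy-tailed residual model instead of a Gaussian.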

Original language: English
Article number: 9156715
Pages (from-to): 4897-4908
Number of pages: 12
Journal: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
DOIs
State: Published - 2020
Event: 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2020 - Virtual, Online, United States
Duration: 14 Jun 2020 - 19 Jun 2020
