Temporally consistent multi-class video-object segmentation with the video graph-shifts algorithm

Albert Y.C. Chen, Jason J. Corso

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

24 Scopus citations

Abstract

We present the Video Graph-Shifts (VGS) approach for efficiently incorporating temporal consistency into MRF energy minimization for multi-class video object segmentation. In contrast to previous methods, our dynamic temporal links avoid the computational overhead of a fully connected spatiotemporal MRF while still handling the uncertainty of exact inter-frame pixel correspondence. The dynamic temporal links are initialized flexibly to balance speed against accuracy, and are automatically revised whenever a label change (shift) occurs during energy minimization. On the benchmark CamVid database and our own wintry driving dataset, we show that VGS effectively mitigates temporally inconsistent segmentation, with improvements of up to 5% to 10% for semantic classes with high intra-class variance. Furthermore, VGS processes each frame at pixel resolution in about one second, providing a practical way to model complex probabilistic relationships in videos and solve them in near real time.
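To make the idea concrete, the sketch below shows a minimal greedy label-shift minimization of an MRF energy with a per-pixel temporal link into the previous frame that is revised whenever a shift occurs. It is an illustrative stand-in only: the function name, the Potts-style penalties, the greedy sweep, and the 3x3 link-revision window are assumptions for this example, not the authors' hierarchical graph-shifts implementation.

```python
import numpy as np

def segment_frame(unary, prev_labels, links, lam_s=1.0, lam_t=0.5, iters=5):
    """Greedy label-shift minimization of a simple MRF energy with
    dynamic temporal links (illustrative sketch, not the VGS algorithm
    itself).

    unary:       (H, W, K) per-pixel label costs for the current frame
    prev_labels: (H, W) labels of the previous frame
    links:       (H, W, 2) temporal link targets (row, col) into the
                 previous frame, e.g. the identity correspondence
    """
    H, W, K = unary.shape
    labels = unary.argmin(axis=2)  # independent per-pixel initialization

    def local_cost(r, c, l):
        cost = unary[r, c, l]
        # spatial smoothness: Potts penalty over the 4-neighbourhood
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < H and 0 <= cc < W and labels[rr, cc] != l:
                cost += lam_s
        # temporal consistency via the (dynamic) link into the prev frame
        lr, lc = links[r, c]
        if prev_labels[lr, lc] != l:
            cost += lam_t
        return cost

    for _ in range(iters):
        changed = False
        for r in range(H):
            for c in range(W):
                best = min(range(K), key=lambda l: local_cost(r, c, l))
                if best != labels[r, c]:
                    labels[r, c] = best  # a "shift"
                    changed = True
                    # revise the temporal link: re-point it at a nearby
                    # previous-frame pixel (3x3 window) agreeing with the
                    # new label, if one exists
                    for dr in (-1, 0, 1):
                        for dc in (-1, 0, 1):
                            rr, cc = r + dr, c + dc
                            if (0 <= rr < H and 0 <= cc < W
                                    and prev_labels[rr, cc] == best):
                                links[r, c] = (rr, cc)
                                break
                        else:
                            continue
                        break
        if not changed:
            break
    return labels
```

Because each pixel keeps a single revisable link rather than edges to every candidate correspondence, the temporal term costs O(1) per pixel per sweep, which is the efficiency argument the abstract makes against a fully connected spatiotemporal MRF.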

Original language: English
Title of host publication: 2011 IEEE Workshop on Applications of Computer Vision, WACV 2011
Pages: 614-621
Number of pages: 8
DOIs
State: Published - 2011
Event: 2011 IEEE Workshop on Applications of Computer Vision, WACV 2011 - Kona, HI, United States
Duration: 5 Jan 2011 - 7 Jan 2011

Publication series

Name: 2011 IEEE Workshop on Applications of Computer Vision, WACV 2011

Conference

Conference: 2011 IEEE Workshop on Applications of Computer Vision, WACV 2011
Country/Territory: United States
City: Kona, HI
Period: 5/01/11 - 7/01/11
