Action detection by implicit intentional motion clustering

Wei Chen, Jason J. Corso

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

37 Scopus citations

Abstract

Explicitly using human detection and pose estimation has found limited success in action recognition problems. This may be due to the complexity of the articulated motion humans exhibit. Yet, we know that action requires an actor and intention. This paper hence seeks to understand the spatiotemporal properties of intentional movement and how to capture such intentional movement without relying on challenging human detection and tracking. We conduct a quantitative analysis of intentional movement, and our findings motivate a new approach for implicit intentional movement extraction based on spatiotemporal trajectory clustering that leverages the properties of intentional movement. The intentional movement clusters are then used as action proposals for detection. Our results on three action detection benchmarks indicate the relevance of focusing on intentional movement for action detection; our method significantly outperforms the state of the art on the challenging MSR-II multi-action video benchmark.

Original language: English
Title of host publication: 2015 International Conference on Computer Vision, ICCV 2015
Pages: 3298-3306
Number of pages: 9
ISBN (Electronic): 9781467383912
DOIs
State: Published - 17 Feb 2015
Event: 15th IEEE International Conference on Computer Vision, ICCV 2015 - Santiago, Chile
Duration: 11 Dec 2015 - 18 Dec 2015

Publication series

Name: Proceedings of the IEEE International Conference on Computer Vision
Volume: 2015 International Conference on Computer Vision, ICCV 2015
ISSN (Print): 1550-5499

Conference

Conference: 15th IEEE International Conference on Computer Vision, ICCV 2015
Country/Territory: Chile
City: Santiago
Period: 11/12/15 - 18/12/15
