TY - JOUR
T1 - Gesture recognition using 3D appearance and motion features
AU - Ye, Guangqi
AU - Corso, Jason J.
AU - Hager, Gregory D.
N1 - Publisher Copyright:
© 2004 IEEE.
PY - 2004
Y1 - 2004
N2 - We present a novel 3D gesture recognition scheme that combines the 3D appearance of the hand and the motion dynamics of the gesture to classify manipulative and controlling gestures. Our method does not directly track the hand. Instead, we take an object-centered approach that efficiently computes the 3D appearance using a region-based coarse stereo matching algorithm in a volume around the hand. The motion cue is captured by differentiating the appearance feature. An unsupervised learning scheme is employed to capture the cluster structure of these feature volumes. The image sequence of a gesture is then converted to a series of symbols indicating the cluster identity of each image pair. Two schemes (forward HMMs and neural networks) are used to model the dynamics of the gestures. We implemented a real-time system and performed numerous gesture recognition experiments to analyze the performance with different combinations of the appearance and motion features. The system achieves a recognition accuracy of over 96% when both the proposed appearance and motion cues are used.
AB - We present a novel 3D gesture recognition scheme that combines the 3D appearance of the hand and the motion dynamics of the gesture to classify manipulative and controlling gestures. Our method does not directly track the hand. Instead, we take an object-centered approach that efficiently computes the 3D appearance using a region-based coarse stereo matching algorithm in a volume around the hand. The motion cue is captured by differentiating the appearance feature. An unsupervised learning scheme is employed to capture the cluster structure of these feature volumes. The image sequence of a gesture is then converted to a series of symbols indicating the cluster identity of each image pair. Two schemes (forward HMMs and neural networks) are used to model the dynamics of the gestures. We implemented a real-time system and performed numerous gesture recognition experiments to analyze the performance with different combinations of the appearance and motion features. The system achieves a recognition accuracy of over 96% when both the proposed appearance and motion cues are used.
UR - http://www.scopus.com/inward/record.url?scp=84932602493&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84932602493&partnerID=8YFLogxK
U2 - 10.1109/CVPR.2004.356
DO - 10.1109/CVPR.2004.356
M3 - Conference article
AN - SCOPUS:84932602493
SN - 2160-7508
VL - 2004-January
JO - IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
JF - IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
IS - January
M1 - 1384958
T2 - 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2004
Y2 - 27 June 2004 through 2 July 2004
ER -