TY - GEN
T1 - Rotation estimation from cloud tracking
AU - Cho, Sangwoo
AU - Dunn, Enrique
AU - Frahm, Jan-Michael
PY - 2014
Y1 - 2014
AB - We address the problem of online relative orientation estimation from streaming video captured by a sky-facing camera on a mobile device. Namely, we rely on the detection and tracking of visual features attained from cloud structures. Our proposed method achieves robust and efficient operation by combining real-time visual odometry modules, learning-based feature classification, and Kalman filtering within a robustness-driven data management framework, while achieving frame-rate processing on a mobile device. The relatively large 3D distance between the camera and the observed cloud features is leveraged to simplify our processing pipeline. First, as an efficiency-driven optimization, we adopt a homography-based motion model and focus on estimating relative rotations across adjacent keyframes. To this end, we rely on efficient feature extraction, KLT tracking, and RANSAC-based model fitting. Second, to ensure the validity of our simplified motion model, we segregate detected cloud features from scene features through SVM classification. Finally, we employ predictive Kalman filtering to enable feature persistence through temporary occlusions and manage feature spatial distribution to foster tracking robustness. Results exemplify the accuracy and robustness of the proposed approach and highlight its potential as a passive orientation sensor.
UR - http://www.scopus.com/inward/record.url?scp=84904699359&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84904699359&partnerID=8YFLogxK
U2 - 10.1109/WACV.2014.6836006
DO - 10.1109/WACV.2014.6836006
M3 - Conference contribution
AN - SCOPUS:84904699359
SN - 9781479949854
T3 - 2014 IEEE Winter Conference on Applications of Computer Vision, WACV 2014
SP - 917
EP - 924
BT - 2014 IEEE Winter Conference on Applications of Computer Vision, WACV 2014
T2 - 2014 IEEE Winter Conference on Applications of Computer Vision, WACV 2014
Y2 - 24 March 2014 through 26 March 2014
ER -