TY - GEN
T1 - Mutual localization
T2 - 2013 26th IEEE/RSJ International Conference on Intelligent Robots and Systems: New Horizon, IROS 2013
AU - Dhiman, Vikas
AU - Ryde, Julian
AU - Corso, Jason J.
PY - 2013
Y1 - 2013
N2 - Concurrently estimating the 6-DOF pose of multiple cameras or robots - cooperative localization - is a core problem in contemporary robotics. Current works focus on a set of mutually observable world landmarks and often require inbuilt egomotion estimates; situations in which both assumptions are violated often arise, for example, robots with erroneous, low-quality odometry and IMUs exploring an unknown environment. In contrast to these existing works in cooperative localization, we propose a cooperative localization method, which we call mutual localization, that uses reciprocal observations of camera-fiducials to obviate the need for egomotion estimates and mutually observable world landmarks. We formulate and solve an algebraic system for the pose of the two-camera mutual localization setup under these assumptions. Our experiments demonstrate the capabilities of our proposed egomotion-free cooperative localization method: for example, it achieves 2 cm and 0.7 degree accuracy for 6-DOF pose at a 2 m sensing range. To demonstrate the applicability of the proposed work, we deploy our method on TurtleBots and compare our results with ARToolKit [1] and Bundler [2], over which our method achieves a tenfold improvement in translation estimation accuracy.
AB - Concurrently estimating the 6-DOF pose of multiple cameras or robots - cooperative localization - is a core problem in contemporary robotics. Current works focus on a set of mutually observable world landmarks and often require inbuilt egomotion estimates; situations in which both assumptions are violated often arise, for example, robots with erroneous, low-quality odometry and IMUs exploring an unknown environment. In contrast to these existing works in cooperative localization, we propose a cooperative localization method, which we call mutual localization, that uses reciprocal observations of camera-fiducials to obviate the need for egomotion estimates and mutually observable world landmarks. We formulate and solve an algebraic system for the pose of the two-camera mutual localization setup under these assumptions. Our experiments demonstrate the capabilities of our proposed egomotion-free cooperative localization method: for example, it achieves 2 cm and 0.7 degree accuracy for 6-DOF pose at a 2 m sensing range. To demonstrate the applicability of the proposed work, we deploy our method on TurtleBots and compare our results with ARToolKit [1] and Bundler [2], over which our method achieves a tenfold improvement in translation estimation accuracy.
UR - http://www.scopus.com/inward/record.url?scp=84893731604&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84893731604&partnerID=8YFLogxK
U2 - 10.1109/IROS.2013.6696524
DO - 10.1109/IROS.2013.6696524
M3 - Conference contribution
AN - SCOPUS:84893731604
SN - 9781467363587
T3 - IEEE International Conference on Intelligent Robots and Systems
SP - 1347
EP - 1354
BT - IROS 2013
Y2 - 3 November 2013 through 8 November 2013
ER -