TY - GEN
T1 - A collaborative visual localization scheme for a low-cost heterogeneous robotic team with non-overlapping perspectives
AU - Abruzzo, Benjamin
AU - Cappelleri, David
AU - Mordohai, Philippos
N1 - Publisher Copyright:
Copyright © 2019 ASME.
PY - 2019
Y1 - 2019
N2 - This paper presents and evaluates a relative localization scheme for a heterogeneous team of low-cost mobile robots. An error-state, complementary Kalman filter was developed to fuse the analytically derived uncertainty of stereoscopic pose measurements of an aerial robot, made by a ground robot, with the inertial/visual proprioceptive measurements of both robots. Results show that the sources of error (image quantization, asynchronous sensors, and a non-stationary bias) were sufficiently modeled to estimate the pose of the aerial robot. In both simulation and experiments, we demonstrate the proposed methodology with a heterogeneous robot team consisting of a UAV and a UGV tasked with collaboratively localizing themselves while avoiding obstacles in an unknown environment. The team is able to identify a goal location and obstacles in the environment and to plan a path for the UGV to the goal location. The results demonstrate localization accuracies of 2 cm to 4 cm, on average, while the robots operate at distances of 1 m to 4 m from each other.
AB - This paper presents and evaluates a relative localization scheme for a heterogeneous team of low-cost mobile robots. An error-state, complementary Kalman filter was developed to fuse the analytically derived uncertainty of stereoscopic pose measurements of an aerial robot, made by a ground robot, with the inertial/visual proprioceptive measurements of both robots. Results show that the sources of error (image quantization, asynchronous sensors, and a non-stationary bias) were sufficiently modeled to estimate the pose of the aerial robot. In both simulation and experiments, we demonstrate the proposed methodology with a heterogeneous robot team consisting of a UAV and a UGV tasked with collaboratively localizing themselves while avoiding obstacles in an unknown environment. The team is able to identify a goal location and obstacles in the environment and to plan a path for the UGV to the goal location. The results demonstrate localization accuracies of 2 cm to 4 cm, on average, while the robots operate at distances of 1 m to 4 m from each other.
UR - http://www.scopus.com/inward/record.url?scp=85076468118&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85076468118&partnerID=8YFLogxK
U2 - 10.1115/DETC2019-97377
DO - 10.1115/DETC2019-97377
M3 - Conference contribution
AN - SCOPUS:85076468118
T3 - Proceedings of the ASME Design Engineering Technical Conference
BT - 43rd Mechanisms and Robotics Conference
T2 - ASME 2019 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, IDETC-CIE 2019
Y2 - 18 August 2019 through 21 August 2019
ER -