TY - GEN
T1 - An egocentric computer vision based co-robot wheelchair
AU - Li, Haoxiang
AU - Kutbi, Mohammed
AU - Li, Xin
AU - Cai, Changjiang
AU - Mordohai, Philippos
AU - Hua, Gang
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/11/28
Y1 - 2016/11/28
N2 - Motivated by the emerging need to improve the quality of life of elderly and disabled individuals who rely on wheelchairs for mobility, and who may have limited or no hand function, we propose an egocentric-computer-vision-based co-robot wheelchair that enhances their mobility without hand usage. The co-robot wheelchair is built upon a typical commercial power wheelchair. Through the egocentric-vision-based control we developed, the user can access a full 360 degrees of motion direction as well as a continuous range of speed, all without the use of the hands. The user wears an egocentric camera and collaborates with the robotic wheelchair by conveying motion commands through head motions. Compared with previous sip-and-puff, chin-control, and tongue-operated solutions for hands-free mobility, this egocentric-vision-based control system provides a more natural human-robot interface. Our experiments show that the design offers higher usability and that users quickly learn to control and operate the wheelchair. Beyond its convenience for manual navigation, the egocentric camera also supports novel user-robot interaction modes by enabling autonomous navigation toward a detected person or object of interest. User studies demonstrate the usability and efficiency of the proposed egocentric-computer-vision co-robot wheelchair.
AB - Motivated by the emerging need to improve the quality of life of elderly and disabled individuals who rely on wheelchairs for mobility, and who may have limited or no hand function, we propose an egocentric-computer-vision-based co-robot wheelchair that enhances their mobility without hand usage. The co-robot wheelchair is built upon a typical commercial power wheelchair. Through the egocentric-vision-based control we developed, the user can access a full 360 degrees of motion direction as well as a continuous range of speed, all without the use of the hands. The user wears an egocentric camera and collaborates with the robotic wheelchair by conveying motion commands through head motions. Compared with previous sip-and-puff, chin-control, and tongue-operated solutions for hands-free mobility, this egocentric-vision-based control system provides a more natural human-robot interface. Our experiments show that the design offers higher usability and that users quickly learn to control and operate the wheelchair. Beyond its convenience for manual navigation, the egocentric camera also supports novel user-robot interaction modes by enabling autonomous navigation toward a detected person or object of interest. User studies demonstrate the usability and efficiency of the proposed egocentric-computer-vision co-robot wheelchair.
UR - http://www.scopus.com/inward/record.url?scp=85006474653&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85006474653&partnerID=8YFLogxK
U2 - 10.1109/IROS.2016.7759291
DO - 10.1109/IROS.2016.7759291
M3 - Conference contribution
AN - SCOPUS:85006474653
T3 - IEEE International Conference on Intelligent Robots and Systems
SP - 1829
EP - 1836
BT - IROS 2016 - 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems
T2 - 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2016
Y2 - 9 October 2016 through 14 October 2016
ER -