Egocentric Computer Vision for Hands-Free Robotic Wheelchair Navigation

Mohammed Kutbi, Haoxiang Li, Yizhe Chang, Bo Sun, Xin Li, Changjiang Cai, Nikolaos Agadakos, Gang Hua, Philippos Mordohai

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

In this paper, we present an approach for navigating a robotic wheelchair that provides users with multiple levels of autonomy and navigation capabilities to fit their individual needs and preferences. We focus on three main aspects: (i) egocentric computer vision based motion control to provide a natural human-robot interface to wheelchair users with impaired hand usage; (ii) techniques that enable the user to initiate autonomous navigation to a location, object, or person without use of the hands; and (iii) a framework that learns to navigate the wheelchair according to its user's, often subjective, criteria and preferences. These contributions are evaluated qualitatively and quantitatively in user studies with several subjects, demonstrating their effectiveness. These studies have been conducted with healthy subjects, but they still indicate that clinical tests of the proposed technology can be initiated.

Original language: English
Article number: 10
Journal: Journal of Intelligent and Robotic Systems: Theory and Applications
Volume: 107
Issue number: 1
DOIs
State: Published - Jan 2023

Keywords

  • Assistive technology
  • Egocentric camera
  • Robotic wheelchair

