Abstract
In this paper, we present an approach for navigating a robotic wheelchair that provides users with multiple levels of autonomy and navigation capabilities to fit their individual needs and preferences. We focus on three main aspects: (i) egocentric, computer-vision-based motion control that provides a natural human-robot interface for wheelchair users with impaired hand usage; (ii) techniques that enable the user to initiate autonomous navigation to a location, object, or person without using the hands; and (iii) a framework that learns to navigate the wheelchair according to its user's often subjective criteria and preferences. These contributions are evaluated qualitatively and quantitatively in user studies with several subjects, demonstrating their effectiveness. Although these studies were conducted with healthy subjects, they indicate that clinical tests of the proposed technology can be initiated.
| Original language | English |
|---|---|
| Article number | 10 |
| Journal | Journal of Intelligent and Robotic Systems: Theory and Applications |
| Volume | 107 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 2023 |
UN SDGs
This output contributes to the following UN Sustainable Development Goals (SDGs):
- SDG 3: Good Health and Well-being
Keywords
- Assistive technology
- Egocentric camera
- Robotic wheelchair
Article title: Egocentric Computer Vision for Hands-Free Robotic Wheelchair Navigation