Enabling access through real-time sign language communication over cell phones

Jaehong Chon, Neva Cherniavsky, Eve A. Riskin, Richard E. Ladner

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The primary challenge to enabling real-time two-way video conferencing on a cell phone is overcoming limited bandwidth, computation, and power. The goal of the MobileASL project is to enable access for people who use American Sign Language (ASL) to an off-the-shelf mobile phone through the implementation of real-time mobile video communication. The enhancement of processor, bandwidth, and power efficiency is investigated through SIMD optimization; region-of-interest encoding based on skin detection; video resolution selection (used to determine the best trade-off between frame rate and spatial resolution); and variable frame rates based on activity recognition. Our prototype system is able to compress, transmit, and decode 12-15 frames per second in real time and produce intelligible ASL at 30 kbps. Furthermore, we can achieve up to 23 extra minutes of talk time, or an 8% gain over the battery life of the phone, through our frame dropping technique.
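
As a rough illustration of the region-of-interest encoding mentioned above, the sketch below flags 16x16 macroblocks as ROI when enough of their chrominance samples fall inside a simple skin-tone box in the Cb/Cr plane. The 4:2:0 layout, the chrominance thresholds, and the coverage fraction are illustrative assumptions, not the detector used in MobileASL; an encoder could then lower the quantization parameter (QP) for ROI macroblocks and raise it elsewhere, spending bits on the face and hands.

/*
 * Minimal ROI-from-skin-detection sketch.  Input is the Cb and Cr planes of
 * a YCbCr 4:2:0 frame, where one 16x16 luma macroblock corresponds to an
 * 8x8 chroma block.  The skin box (Cb in [77,127], Cr in [133,173]) is a
 * common heuristic assumed here, not the paper's detector.
 */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

static int is_skin(uint8_t cb, uint8_t cr)
{
    return cb >= 77 && cb <= 127 && cr >= 133 && cr <= 173;
}

/* Flag a macroblock as ROI when more than skin_frac of its chroma samples look like skin. */
static void select_roi(const uint8_t *cb_plane, const uint8_t *cr_plane,
                       int chroma_w, int chroma_h, double skin_frac,
                       uint8_t *roi_map)   /* (chroma_w/8) * (chroma_h/8) flags */
{
    int mbs_x = chroma_w / 8, mbs_y = chroma_h / 8;
    for (int my = 0; my < mbs_y; my++) {
        for (int mx = 0; mx < mbs_x; mx++) {
            int skin = 0;
            for (int y = 0; y < 8; y++)
                for (int x = 0; x < 8; x++) {
                    int idx = (my * 8 + y) * chroma_w + (mx * 8 + x);
                    skin += is_skin(cb_plane[idx], cr_plane[idx]);
                }
            roi_map[my * mbs_x + mx] = (skin > skin_frac * 64) ? 1 : 0;
        }
    }
}

int main(void)
{
    int cw = 176 / 2, ch = 144 / 2;                  /* QCIF chroma plane size */
    uint8_t *cb = malloc(cw * ch), *cr = malloc(cw * ch);
    uint8_t *roi = malloc((cw / 8) * (ch / 8));
    for (int i = 0; i < cw * ch; i++) { cb[i] = 100; cr[i] = 150; }  /* flat skin-like color */
    select_roi(cb, cr, cw, ch, 0.5, roi);
    printf("first macroblock ROI flag: %d\n", roi[0]);
    free(cb); free(cr); free(roi);
    return 0;
}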

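The variable frame rate and frame dropping techniques hinge on deciding, per frame, whether the user is actively signing. A minimal sketch of that decision follows, using the mean absolute luma difference between consecutive frames as the activity measure; both the metric and the threshold are assumptions for illustration, not the activity recognizer evaluated in the paper.

/*
 * Sketch of activity-based frame dropping: when consecutive frames differ
 * very little (the signer is listening rather than signing), skip encoding
 * to save cycles and battery.  The metric and threshold are illustrative
 * assumptions.
 */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Mean absolute difference between two luma planes of n samples. */
static double mean_abs_diff(const uint8_t *cur, const uint8_t *prev, size_t n)
{
    uint64_t sum = 0;
    for (size_t i = 0; i < n; i++)
        sum += (uint64_t)abs((int)cur[i] - (int)prev[i]);
    return (double)sum / (double)n;
}

/* Return 1 if the frame should be encoded, 0 if it can be dropped. */
static int should_encode(const uint8_t *cur, const uint8_t *prev, size_t n,
                         double activity_threshold)   /* assumed value, e.g. 2.0 */
{
    return mean_abs_diff(cur, prev, n) >= activity_threshold;
}

int main(void)
{
    uint8_t prev[64] = {0}, cur[64] = {0};
    cur[0] = 10;                                      /* nearly static scene */
    printf("encode? %d\n", should_encode(cur, prev, 64, 2.0));
    return 0;
}

As a rough sanity check on the reported savings, 23 extra minutes amounting to an 8% gain implies a baseline talk time of about 23 / 0.08 ≈ 290 minutes.
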
Original language: English
Title of host publication: Conference Record - 43rd Asilomar Conference on Signals, Systems and Computers
Pages: 588-592
Number of pages: 5
DOIs
State: Published - 2009
Event: 43rd Asilomar Conference on Signals, Systems and Computers - Pacific Grove, CA, United States
Duration: 1 Nov 2009 - 4 Nov 2009

Publication series

Name: Conference Record - Asilomar Conference on Signals, Systems and Computers
ISSN (Print): 1058-6393

Conference

Conference: 43rd Asilomar Conference on Signals, Systems and Computers
Country/Territory: United States
City: Pacific Grove, CA
Period: 1/11/09 - 4/11/09
