Automated estimation of food type and amount consumed from body-worn audio and motion sensors

Mark Mirtchouk, Christopher Merck, Samantha Kleinberg

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

82 Scopus citations

Abstract

Determining when an individual is eating can be useful for tracking behavior and identifying patterns, but to create nutrition logs automatically or provide real-time feedback to people with chronic disease, we need to identify both what they are consuming and in what quantity. However, food type and amount have mainly been estimated using image data (requiring user involvement) or acoustic sensors (tested with a restricted set of foods rather than representative meals). As a result, there is not yet a highly accurate automated nutrition monitoring method that can be used with a variety of foods. We propose that multi-modal sensing (in-ear audio plus head and wrist motion) can be used to more accurately classify food type, as audio and motion features provide complementary information. Further, we propose that knowing food type is critical for estimating amount consumed in combination with sensor data. To test this we use data from people wearing audio and motion sensors, with ground truth annotated from video and continuous scale data. With data from 40 unique foods we achieve a classification accuracy of 82.7% with a combination of sensors (versus 67.8% for audio alone and 76.2% for head and wrist motion). Weight estimation error was reduced from a baseline of 127.3% to 35.4% absolute relative error. Ultimately, our estimates of food type and amount can be linked to food databases to provide automated calorie estimates from continuously-collected data.
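The abstract's central claim is that early fusion of in-ear audio features with head and wrist motion features yields better food-type classification than either modality alone. A minimal sketch of that fusion idea, with entirely synthetic data and invented feature names (the paper's actual features and classifier are not reproduced here):

```python
import numpy as np

# Hypothetical illustration of multi-modal early fusion: per-bite feature
# vectors from audio and from motion are concatenated into one vector,
# then classified. All data and dimensions here are invented.
rng = np.random.default_rng(0)

def make_bites(n, audio_mu, motion_mu):
    """Simulate per-bite feature vectors for one food type."""
    audio = rng.normal(audio_mu, 1.0, size=(n, 4))    # e.g. spectral features
    motion = rng.normal(motion_mu, 1.0, size=(n, 6))  # e.g. IMU statistics
    return np.hstack([audio, motion])                 # early fusion: concatenate

# Two toy "foods" that differ in both modalities
X = np.vstack([make_bites(20, 0.0, 0.0), make_bites(20, 3.0, 3.0)])
y = np.array([0] * 20 + [1] * 20)

# Simple nearest-centroid classification on the fused feature vectors
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def classify(x):
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

preds = np.array([classify(x) for x in X])
accuracy = (preds == y).mean()
```

In the paper, fusing modalities raised accuracy to 82.7% versus 67.8% (audio alone) and 76.2% (motion alone); this toy example only shows the mechanics of concatenating complementary feature sets before classification.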

Original language: English
Title of host publication: UbiComp 2016 - Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing
Pages: 451-462
Number of pages: 12
ISBN (Electronic): 9781450344616
DOIs
State: Published - 12 Sep 2016
Event: 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, UbiComp 2016 - Heidelberg, Germany
Duration: 12 Sep 2016 - 16 Sep 2016

Publication series

Name: UbiComp 2016 - Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing

Conference

Conference: 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, UbiComp 2016
Country/Territory: Germany
City: Heidelberg
Period: 12/09/16 - 16/09/16

Keywords

  • Acoustic and motion sensing
  • Eating recognition
  • Nutrition
