Publication

Analysis of chewing sounds for dietary monitoring

Oliver Amft, Mathias Stäger, Paul Lukowicz, Gerhard Tröster

In: Proceedings of the 7th International Conference on Ubiquitous Computing (UbiComp 2005), September 11-14, Tokyo, Japan, pages 56-72. Lecture Notes in Computer Science (LNCS) 3660, ISBN 3-540-28760-4, 978-3-540-28760-5. Springer-Verlag, 2005.

Abstract

The paper reports the results of the first stage of our work on an automatic dietary monitoring system. The work is part of a large European project on using ubiquitous systems to support a healthy lifestyle and cardiovascular disease prevention. We demonstrate that sound from the user's mouth can be used to detect that he or she is eating. The paper also shows how different kinds of food can be recognized by analyzing chewing sounds. The sounds are acquired with a microphone located inside the ear canal. This is an unobtrusive location widely accepted in other applications (hearing aids, headsets). To validate our method we present experimental results containing 3500 seconds of chewing data from four subjects on four different food types typically found in a meal. Up to 99% accuracy is achieved on eating recognition and between 80% and 100% on food type classification.
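As a rough illustration of the kind of processing the abstract describes (detecting eating from ear-canal audio), the sketch below frames a 1-D audio signal and flags frames whose short-time energy exceeds a noise threshold. This is a minimal toy example on synthetic data, not the paper's actual method: the feature (frame energy), the threshold value, and all function names are assumptions for illustration only.

```python
import numpy as np

def frame_signal(x, frame_len=256, hop=128):
    """Split a 1-D signal into overlapping frames (toy framing step)."""
    n = 1 + max(0, (len(x) - frame_len) // hop)
    return np.stack([x[i * hop : i * hop + frame_len] for i in range(n)])

def short_time_energy(frames):
    """Mean squared amplitude per frame."""
    return (frames ** 2).mean(axis=1)

def detect_eating(x, threshold=0.01):
    """Flag frames whose energy exceeds a fixed noise threshold.

    Hypothetical detector; the paper's features and classifiers differ.
    """
    return short_time_energy(frame_signal(x)) > threshold

# Synthetic demo: quiet background followed by a louder "chewing" burst.
rng = np.random.default_rng(0)
quiet = 0.01 * rng.standard_normal(2048)
chewing = 0.5 * rng.standard_normal(2048)
flags = detect_eating(np.concatenate([quiet, chewing]))
```

In the demo, frames falling entirely in the quiet segment stay below the threshold, while frames overlapping the louder burst are flagged; a real system would replace the energy threshold with trained per-food-type classifiers.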
