With the development of small inexpensive motion sensors, it has become relatively easy to include the sensors in a wide variety of wearable devices to estimate calories burned. However, calories burned are only part of the energy balance equation. The other half is caloric intake or, even better, information about the specific types and amounts of food consumed. Although there has been excitement about some crowdfunding efforts to measure caloric intake and some progress has been made, automated monitoring of caloric intake remains challenging. This article briefly reviews the progress toward wearable technology to help automate caloric intake and/or food consumption monitoring.
We have come a long way from tracking calories with a pencil and paper. Applications such as MyFitnessPal make it much easier to manually enter information on the types and amounts of food one consumes. Also, there has been progress in automated recognition of food types and amounts based on pictures of food. For example, the recently announced Im2Calories research at Google uses artificial intelligence to identify food types and amounts from a picture of food. However, most current approaches still require a person to perform an action when they eat. For example, a person has to manually enter information or manually position an imaging device to take a picture of a meal or snack. If the person does not take the action, then a caloric intake monitoring system is oblivious to a meal or snack. Compliance with diets has historically been problematic, and any caloric intake monitoring system that relies on such manual actions inherits the same compliance problem.
During the past couple of years, there have been a number of efforts to help automate caloric intake monitoring. Some of these efforts may have gotten a bit too enthusiastic about what they can do. Sometimes crowd-funding and/or marketing videos, even with good intentions, can get ahead of engineering and empirical results. One crowd-funded project issued refunds. The jury is still out on data from other projects. Some caloric intake monitoring claims have sparked controversy, and peer-reviewed experimental data are sparse. Nonetheless, there is a lot of interest and a lot of work underway, so one or more of these efforts may produce a helpful wearable caloric intake monitor.
Let's now look at some of the current contenders. I should disclose that my company Medibotics is working in this area, so I have a "horse in the race." However, I will try to be even-handed, even when reviewing potential competitors. With this disclosure, let's take a look at the contenders:
AIRO Watch: Launched on Indiegogo in 2013 by AIRO Health, the AIRO watch was intended to be a wristband that uses spectroscopic analysis of body tissue to measure caloric intake. AIRO Health issued refunds to backers, and news about the project has been sparse since.
Healbe GoBe: Launched on Indiegogo in 2014 by Healbe, this is a wristband that reportedly measures the impedance of body tissue in order to estimate caloric intake. Their Indiegogo campaign raised over a million dollars and also generated controversy online. Healbe has released data from experiments with the device, but there do not yet appear to be results in a peer-reviewed journal. The current version of the device appears to require the user to press the device before eating in order to estimate calories for a meal or snack. I am trying to keep an open mind and think that the jury is still out on the performance of this device.
TellSpec: Launched on Indiegogo in 2014, TellSpec was intended to be a hand-held device which uses spectroscopy to measure the nutrient composition of food. TellSpec raised over $350,000. Since it would be a hand-held device, it is not technically a wearable, but it could be used to help automate caloric intake monitoring and/or in conjunction with a wearable device. Some people did not realize that their Indiegogo video was a conceptual video rather than footage of an actual device in operation, but conceptual videos are common on Indiegogo. Even if the device can automatically measure the nutritional composition of food, it is not clear whether it would automatically measure the amount of food. There is some indication that they are changing the type of technology which they plan to use to measure food composition.
SCiO: SCiO is a hand-held molecular sensor by Consumer Physics which uses near-infrared spectroscopy to analyze the composition of nearby objects. Although it is not targeted exclusively for measuring food composition, it appears to be useful for this purpose. Since it is a hand-held device, it is not technically a wearable, but it can be used to help automate caloric intake monitoring and/or in conjunction with a wearable device. It is not clear whether it would automatically measure the amount of food. There appears to be scientific evidence concerning the operation of this device for determining the molecular composition of nearby objects.
BitBite: Launched on Indiegogo in 2014-2015, BitBite is an ear-worn device which is intended to automatically measure caloric intake by analyzing the sounds of eating. It appears that the quantity of food consumed will be measured automatically based on eating sounds (such as chewing or swallowing). It appears that the device will also have a speech recognition capability which allows the wearer to provide additional information concerning the specific types of food being consumed.
WearSens: Developed by researchers at UCLA and announced in 2015, WearSens is intended to be a necklace which automatically measures caloric intake by analyzing eating sounds and vibrations.
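As an aside for readers curious about how acoustic approaches like BitBite and WearSens might work at the signal level, here is a minimal, purely illustrative sketch. It assumes (my assumption, not anything disclosed by either project) that chews show up as short bursts of energy in a throat or ear microphone signal, so it counts onsets where short-time energy crosses a threshold. The real devices are surely far more sophisticated.

```python
# Illustrative sketch only: counting chew-like bursts in an audio signal
# by thresholding short-time energy. This is NOT the BitBite or WearSens
# algorithm; the frame size and threshold are arbitrary assumptions.

def count_chew_bursts(samples, frame_size=160, threshold=0.01):
    """Count burst onsets: frames whose mean energy rises above the
    threshold after a below-threshold frame."""
    bursts = 0
    prev_active = False
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[start:start + frame_size]
        energy = sum(s * s for s in frame) / frame_size
        active = energy > threshold
        if active and not prev_active:
            bursts += 1  # a new burst begins on this frame
        prev_active = active
    return bursts
```

A real system would also need to distinguish chewing from talking, walking, and ambient noise, which is where most of the engineering difficulty lies.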
Bite Technologies: The bite counter developed by researchers at Clemson University is a wrist band which measures caloric intake by recognizing arm motions associated with eating. It appears that this device currently focuses on the quantity (number of bites) of food consumed. Motion sensing alone does not appear to provide information on the type of food consumed.
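To illustrate the motion-sensing idea in the simplest possible terms, here is a hypothetical sketch of bite counting from wrist rotation. It is not the Clemson algorithm; it only assumes (my assumption) that a bite involves the wrist rolling one way to bring food to the mouth and then rolling back, with a minimum gap between bites to avoid double counting.

```python
# Hypothetical sketch: counting bites from wrist-roll gyroscope data.
# NOT the actual Bite Technologies algorithm; thresholds and timing
# values are arbitrary assumptions for illustration.

def count_bites(roll_velocity, sample_rate_hz=15,
                threshold=10.0, min_gap_s=2.0):
    """Count bite-like events in a stream of wrist roll velocities
    (degrees/second)."""
    bites = 0
    waiting_for_return = False
    min_gap_samples = min_gap_s * sample_rate_hz
    last_bite_sample = -min_gap_samples
    for i, v in enumerate(roll_velocity):
        if not waiting_for_return:
            # Wrist rolls past the threshold (food moving toward mouth)
            if v > threshold and (i - last_bite_sample) >= min_gap_samples:
                waiting_for_return = True
        else:
            # Wrist rolls back the other way: count one bite
            if v < -threshold:
                bites += 1
                last_bite_sample = i
                waiting_for_return = False
    return bites
```

Even this toy version makes the limitation in the text concrete: the output is a bite count, with no information at all about what kind of food was on the fork.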
Willpower Watch: Still under development by Medibotics, the Willpower Watch is intended to be a wrist-worn device which automatically takes pictures of food (including food-hand interactions and food-mouth interactions) in order to measure the types and amounts of food consumed. Picture taking can be triggered when a motion sensor indicates that a person is probably eating so that the device does not intrude on privacy by taking pictures all the time.
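The motion-gating idea described above can be sketched in a few lines. This is a generic illustration of motion-triggered capture as a privacy measure, not Medibotics code; the detector and camera interfaces here are hypothetical placeholders.

```python
# Illustrative sketch of motion-gated picture taking: the camera fires
# only while an eating-motion detector reports probable eating, and no
# more often than a set interval. The interfaces are hypothetical.

import time

class MotionGatedCamera:
    def __init__(self, eating_detector, camera, interval_s=5.0):
        self.eating_detector = eating_detector  # callable returning bool
        self.camera = camera                    # object with a .capture()
        self.interval_s = interval_s            # minimum seconds between shots
        self._last_shot = float("-inf")

    def tick(self, now=None):
        """Call periodically; captures a picture only when eating is
        detected and the per-shot interval has elapsed."""
        now = time.monotonic() if now is None else now
        if self.eating_detector() and now - self._last_shot >= self.interval_s:
            self.camera.capture()
            self._last_shot = now
            return True
        return False
```

The design point is that the privacy property lives entirely in the gate: when the detector returns False, no image is ever captured, so the camera never records outside probable eating episodes.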
eButton: Developed by researchers at the University of Pittsburgh, the eButton currently comprises a wearable camera and a process for reviewing pictures from that camera in order to estimate the types and amounts of food consumed by the wearer.
Technology Assisted Dietary Assessment (TADA): Developed by researchers at Purdue University, the Technology Assisted Dietary Assessment (TADA) project developed methods to automatically analyze pictures in order to estimate the types and amounts of food in a picture.
Google/Im2Calories: Developed by researchers at Google, Im2Calories uses artificial intelligence to analyze pictures in order to estimate the types and amounts of food in a picture.
To recap, there are a number of different approaches being pursued in efforts to develop a wearable device to help automate monitoring of the types and amounts of food consumed. These approaches include: spectroscopic analysis of body tissue; spectroscopic analysis of food; impedance analysis of body tissue; analysis of sounds and vibrations associated with eating; analysis of arm motions associated with eating; and analysis of food pictures.
It is too early to tell whether one of these approaches, or a combination of them, will result in a useful caloric intake or food consumption monitor. However, there is a high level of interest in this topic and there are probably even more researchers working on it than those included in the above list. With this high level of interest and development work underway, it is likely that a working caloric intake or food consumption monitor will become a reality in the next two years. This could even become a "killer app" advantage for wearable devices over hand-held devices.
Robert A. Connor, Ph.D. is the CEO of Medibotics which is developing products at the convergence of medical devices and wearable technology, including the Willpower Watch (TM) for measuring food consumption and Motion Recognition Clothing (TM) for mobile full-body motion capture.