Consumer-friendly “smart glasses” that are currently being developed by major technology companies will be an important new element of digital health research, argue three Duke scientists in a commentary piece published March 12 in npj Digital Medicine.
The authors, all from the Department of Psychiatry and Behavioral Sciences, are Matthew M. Engelhard, MD, PhD, senior research associate; Jason A. Oliver, PhD, assistant professor; and F. Joseph McClernon, PhD, professor.
These technologies, paired with machine learning models that can interpret the images they capture, can help scientists better understand how a person’s environment contributes to health behaviors and outcomes.
For years now, wearable technologies like smart watches, body monitors, and fitness trackers have allowed researchers to collect information about a person’s health and everyday behaviors, from sleep and device use to cardiovascular and neurological events.
What often goes unmeasured, however, is the person’s surrounding environment—a piece that is critical to understanding health challenges and designing appropriate interventions. When a person has a panic attack, their heart rate and perspiration can be assessed with current wearables, but these devices do not provide additional information about the social or environmental conditions that triggered the attack. When studying obesity and cardiovascular health, current devices monitor sedentary behavior and weight gain, but not the work, home or neighborhood environments that discourage physical activity and healthy eating.
Engelhard said that the team has already developed a prototype that uses smartphone cameras to identify daily environments posing a high or low risk of smoking.
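The commentary does not describe the prototype’s internals, but the general paradigm of pairing camera images with a learned classifier can be sketched in a few lines. The example below is purely illustrative, not the Duke team’s actual model: it uses hypothetical synthetic images, simple color-histogram features, and a plain-NumPy logistic regression to label scenes as high- or low-risk.

```python
# Illustrative sketch only: classify environment images as high- vs. low-risk.
# All data and features here are synthetic stand-ins, not the published method.
import numpy as np

rng = np.random.default_rng(0)

def color_histogram(image, bins=8):
    """Summarize an RGB image as normalized per-channel intensity histograms."""
    feats = []
    for ch in range(3):
        hist, _ = np.histogram(image[..., ch], bins=bins, range=(0, 256))
        feats.append(hist / hist.sum())
    return np.concatenate(feats)

def train_logistic(X, y, lr=0.5, steps=500):
    """Logistic regression fit by batch gradient descent (NumPy only)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
        grad = p - y                            # gradient of log loss
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return w, b

# Toy data: pretend "high-risk" scenes skew darker and "low-risk" brighter.
high = [rng.integers(0, 128, size=(32, 32, 3)) for _ in range(20)]
low = [rng.integers(128, 256, size=(32, 32, 3)) for _ in range(20)]
X = np.array([color_histogram(img) for img in high + low])
y = np.array([1] * 20 + [0] * 20)

w, b = train_logistic(X, y)
preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
accuracy = (preds == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

A real system of the kind the authors envision would replace the hand-built histogram features with a deep image model and the synthetic labels with annotated images of a person’s actual daily environments.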
“The paradigm could be applied to a wide range of behaviors and symptoms and should be given more attention and included in digital phenotyping initiatives,” he said. Examples of these initiatives include Project Baseline and the All of Us program from the National Institutes of Health.