
Study Shows Smartphone App Can Identify Autism Symptoms in Toddlers

Study shows the ubiquitous technology holds potential as an early autism screening device


Contact

Stephanie Lopez
Manager
919-724-5934

DURHAM, N.C. – A digital app successfully detected one of the telltale characteristics of autism in young children, suggesting the technology could one day become an inexpensive and scalable early screening tool, researchers at Duke University report.

The research team created the app to assess the eye gaze patterns of children while they watched short, strategically designed movies on an iPhone or iPad, then applied computer vision and machine learning to determine whether the child was looking more often at the human in the video, or objects.
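
To make this concrete, the sketch below shows one way such a summary could be computed once per-frame gaze estimates exist. It is a minimal, hypothetical illustration in Python: the GazeSample type, the social_attention_index function, the confidence threshold, and the normalized screen coordinates are all assumptions made for exposition, not the study's published pipeline, which uses computer vision and machine learning to estimate gaze from the front-facing camera.

from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float           # horizontal gaze position, normalized to [0, 1]
    confidence: float  # tracker confidence for this frame, [0, 1]

def social_attention_index(samples, person_on_left=True, min_confidence=0.5):
    """Fraction of confidently tracked frames spent on the person's side.

    Returns None if too few frames are usable to form a stable estimate.
    """
    usable = [s for s in samples if s.confidence >= min_confidence]
    if len(usable) < 30:  # arbitrary floor: about one second at 30 fps
        return None
    midline = 0.5
    on_person = [s for s in usable if (s.x < midline) == person_on_left]
    return len(on_person) / len(usable)

# Example: a toddler whose gaze mostly stays on the toy's (right) side
# of a movie in which the person appears on the left.
frames = [GazeSample(x=0.8, confidence=0.9)] * 80 + \
         [GazeSample(x=0.2, confidence=0.9)] * 20
print(social_attention_index(frames, person_on_left=True))  # -> 0.2

A score near 0.5 would mean attention split evenly across the screen; in the study, gaze patterns observed across several such movies, rather than any single index, informed the assessment.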

“We know that babies who have autism pay attention to the environment differently and are not paying as much attention to people,” said Geraldine Dawson, Ph.D., director of the Duke Center for Autism and Brain Development, and co-senior author of a study appearing online April 26 in JAMA Pediatrics.

“We can track eye gaze patterns in toddlers to assess risk for autism,” Dawson said. “This is the first time that we’ve been able to provide this type of assessment using only a smartphone or tablet. This study served as a proof of concept, and we’re very encouraged.”

Dawson began collaborating several years ago with colleagues, including lead author Zhuoqing Chang, Ph.D., a postdoctoral associate in Duke’s Department of Electrical and Computer Engineering, to develop the app. In this latest version, the researchers strategically designed movies that would enable them to assess a young child’s preference for looking at objects more than at people.

One movie, for example, shows a cheerful woman playing with a top. She dominates one side of the screen while the top she is spinning is on the other side. Toddlers without autism scanned the entire screen throughout the video, focusing more often on the woman. Toddlers who were later diagnosed with autism, however, more often focused on the side of the screen with the toy. Another movie was similarly designed and showed a man blowing bubbles. Differences in eye gaze patterns for toddlers with autism were observed across several movies in the app.

Eye-tracking has been used previously to assess gaze patterns in people with autism; however, it has required special equipment and expertise to analyze the gaze patterns. This app, which takes less than 10 minutes to administer and uses the front-facing camera to record the child’s behavior, requires only an iPhone or iPad, making it readily accessible to primary care clinics and usable in home settings.

“This was a technical achievement many years in the making,” Chang said. “It required our research team to design the movies in a specific way to elicit and measure the gaze patterns of attention using only a hand-held device.

“It’s amazing how far we’ve come to achieve this ability to assess eye gaze without specialized equipment, using a common device many have in their pocket,” Chang said.

To test the app, the researchers enrolled 993 toddlers ages 16 to 38 months; the average age was 21 months, which is when autism spectrum disorder (ASD) is often identified. Forty of the toddlers were diagnosed with ASD using gold-standard diagnostic methods.

Dawson said validation studies are ongoing. Additional studies with infants as young as 6 months are investigating whether the app-based assessment can identify, during the first year of life, differences in children who are later diagnosed with autism or other neurodevelopmental disorders.

“We hope that this technology will eventually provide greater access to autism screening, which is an essential first step to intervention. Our long-term goal is to have a well-validated, easy-to-use app that providers and caregivers can download and use, either in a regular clinic or home setting,” Dawson said. “We have additional steps to go, but this study suggests it might one day be possible.”

In addition to Dawson and Chang, study authors include J. Matias Di Martino, Rachel Aiello, Jeffrey Baker, Kimberly Carpenter, Scott Compton, Naomi Davis, Brian Eichner, Steven Espinosa, Jacqueline Flowers, Lauren Franz, Martha Gagliano, Adrianne Harris, Jill Howard, Sam Perochon, Eliana M. Perrin, Pradeep Raj, Marina Spanos, Connor Sullivan, Barbara K. Walter, Scott H. Kollins and Guillermo Sapiro.

This work was primarily supported by the National Institutes of Health, Autism Centers of Excellence Award (P50HD093074) and the National Institute of Mental Health (R01MH121329, R01MH120093). Additional support came from The Marcus Foundation, the Simons Foundation, the National Science Foundation (NSF-1712867), the Office of Naval Research (N00014-18-1-2143, N00014-20-1-233), the National Geospatial-Intelligence Agency (HM04761912010), Apple, Inc., Microsoft, Inc., Amazon Web Services and Google, Inc. The funders/sponsors had no role in the design and conduct of the study.

Authors Dawson, Chang, Sapiro, Baker, Carpenter, Espinosa and Harris developed technology related to the app that has been licensed to Apple, Inc., and both they and Duke University have benefited financially. Additional conflicts are disclosed in the study.

 

NOTE FOR MEDIA: Still photographs and a short video of the app are available upon request.

 
