Stanford - UPI
Stanford University on Monday begins the second phase of trials in an experiment to help kids with autism use Google Glass to better interpret social cues.
Because children with autism can struggle to recognize facial emotions, Stanford researchers are combining wearable glasses with artificial intelligence that reads facial expressions and helps the child identify the emotion being shown. For example, the child would see images of a woman smiling and learn that this means she is feeling happy.
A successful pilot study involved 40 people. The second phase will consist of at-home tests with 80 autistic children and 20 typically developing children, ages 6 to 16. Their progress in recognizing emotions will be monitored over a four-month period.
The study will require the children to wear a Google Glass device three times a day for 20 minutes each time.
Results from the sessions will help researchers better understand the role visual engagement plays in the emotional learning process.
Stanford faculty from fields such as pediatrics, computer science and psychiatry are collaborating on the project, which is funded by the David & Lucile Packard Foundation, Google and Stanford Medicine.
The first phase initially had only one Google Glass set to use, but Google later donated 35 devices. The Packard Foundation also donated $379,408 in June.
The search giant's Glass project, consumer sales of which were halted in 2015, is due for an upgrade aimed at enterprise users that will feature an enhanced display, an Intel Atom CPU and an option for an external battery pack.