Pamela Rollins

Dr. Pamela Rollins is working with autism experts and robotics designers to create a program that uses an artificially intelligent robot with a full range of facial expressions to interact with children who have Autism Spectrum Disorder.

Robots and humans socialize frequently in pop fiction — think of Wall-E and Star Trek: The Next Generation. Now, a UT Dallas researcher is giving the fantasy of robotic friends a practical edge with a robot that teaches social skills to children with Autism Spectrum Disorder (ASD).

Dr. Pamela Rollins, associate professor in the School of Behavioral and Brain Sciences, explained that individuals with ASD often have social anxiety. Learning social interactions via a less threatening interface — a robot — may help patients better identify emotions and use specific social skills with humans, like holding a conversation.

“Some preliminary data has shown that individuals with autism start talking to the robots when they don’t talk to other people,” Rollins said.

Rollins, who conducts research at the Callier Center for Communication Disorders, is working with a team of autism experts and robotics designers at the company Robokind to create Robots4Autism. This program uses an artificially intelligent robot with a full range of facial expressions to interact with children who have ASD.

According to the autism advocacy group Autism Speaks, children with ASD may have varying degrees of difficulty engaging in everyday social situations as they develop. Although they may feel a connection with a person, such as a parent, they often don't show the typical behaviors that demonstrate this affection, like hugging or smiling. This difficulty with accepted social norms can continue into adulthood.

When used in conjunction with traditional therapies, Robots4Autism may improve social behaviors and interactions for children with ASD.

“It’s not to replace therapy with humans, but you can deliver a social skills lesson in a less threatening way, and the robot can deliver the same lesson multiple times,” Rollins said.

During a lesson, the robot explains a social situation to the child with ASD. The two then watch a video of the described social situation together, with the robot commenting on the appropriate behaviors displayed by the actors, reinforcing the earlier explanation. As a final check, the child watches short videos showing either the correctly modeled behavior or a version with errors, and then discusses what they saw.

The robot can sense when a child begins to get frustrated or agitated and can react accordingly. There is even a module designed to teach children how to calm themselves down when they become agitated. The robot can also advance children through lessons as they master modules focusing on different social situations, such as how to greet someone or how to interact at a birthday party.

Rollins said the next step is to begin testing the effectiveness of Robots4Autism after the final programming is finished in June.

UT Dallas alumna and speech-language pathologist Michelle N. McFarlin and Dr. Carolyn Garver, program director of the Autism Treatment Center, are working with Rollins to develop the curriculum for Robots4Autism. Rollins is a gubernatorial appointee to the Texas Council on Autism and Pervasive Developmental Disorders.