Your Next Computer May Know How You Feel
Myriad Applications Seen for Accurate Emotion-Recognition Software

Friends, loved ones and pets can sense your mood almost instantly – and one day your computer may be able to do so nearly as fast.

UT Dallas computer scientist Yang Liu has received a three-year, $350,000 grant from the highly competitive Air Force Office of Scientific Research’s Young Investigator Research Program to explore emotion recognition and modeling in speech processing.

“The next-generation human-computer interaction interfaces will be more human-centered and socially intelligent,” Liu said. “They’ll have the ability to detect changes in the user’s affective behavior and thus initiate interactions accordingly. Automatic recognition of emotion plays an important role in developing future intelligent systems.”

Emotion is associated with various physical indicators, including facial expression, posture, tone of voice, word usage and movement. Liu and a team of graduate students will focus primarily on emotion recognition and modeling in speech.

They’ll study features such as pitch, intonation patterns and word usage, then associate those with emotions such as anger, sadness, happiness, surprise and frustration. Other efforts to gauge emotion from speech have achieved accuracy rates of 60 to 80 percent, and Liu hopes to improve on those numbers.
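
The article doesn’t detail Liu’s methods, but the general recipe she describes – extract prosodic and lexical features from each utterance, then train a classifier to map them onto emotion labels – can be sketched briefly. The Python example below is a minimal illustration using scikit-learn and synthetic data; the specific features, labels and classifier are assumptions for demonstration, not the team’s actual system.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Hypothetical per-utterance features: mean pitch (Hz), pitch range (Hz),
    # speaking rate (words/sec) and a lexical sentiment score. A real system
    # would extract these from audio and transcripts; here we use toy data.
    EMOTIONS = ["anger", "sadness", "happiness", "surprise", "frustration"]
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))                 # 500 toy utterances, 4 features
    y = rng.integers(0, len(EMOTIONS), size=500)  # toy emotion labels

    # Train a standard multiclass classifier and report held-out accuracy.
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(f"toy accuracy: {clf.score(X_test, y_test):.2f}")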

“Automatic recognition of emotions with high accuracy still remains an elusive goal,” she said.

But her research adds a cultural component.

“We’re interested in studying the cross-lingual aspects of emotion in English and other languages, such as Chinese,” Liu said. “This way we can look for the influence of culture and language in emotions.”

She’s doing the research in collaboration with several other UT Dallas faculty who are working in similar areas.

The research could drive a wide range of applications. A tutoring system, for example, could detect frustration or boredom in a student – a sure sign the student is not learning and a different approach is needed – and slow down the lesson or load a different one. An interactive voice-response system that detects anger or frustration in a customer might transfer that person to a human operator. An emotion component could be added to the polygraph, or lie-detector, systems used by law enforcement. And such technology could assist in non-pharmacological treatment of social anxiety disorders.
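
To make the call-center example concrete, a recognizer’s output could feed a simple routing rule like the one sketched below. Everything here is hypothetical: the emotion labels, the confidence threshold and the routing targets are assumptions for illustration, not part of Liu’s work.

    # Hypothetical routing rule for an interactive voice-response system:
    # escalate to a human operator when the recognizer reports anger or
    # frustration with enough confidence. Labels and threshold are illustrative.
    ESCALATE = {"anger", "frustration"}

    def route_call(emotion: str, confidence: float, threshold: float = 0.7) -> str:
        """Return 'human_operator' for confident negative emotions, else 'self_service'."""
        if emotion in ESCALATE and confidence >= threshold:
            return "human_operator"
        return "self_service"

    print(route_call("frustration", 0.85))  # human_operator
    print(route_call("happiness", 0.90))    # self_service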

Liu first became interested in speech and language processing as an electrical engineering undergraduate at Tsinghua University in Beijing. She received her PhD in electrical and computer engineering from Purdue University in 2004, after conducting most of her doctoral research at the International Computer Science Institute in Berkeley, Calif., where she also completed postdoctoral work. She joined UT Dallas in 2005 as an assistant professor in the Erik Jonsson School of Engineering and Computer Science. Her other research interests include speech summarization of meetings, spoken dialogue systems, natural language processing, and machine learning and data mining.

The Air Force Office of Scientific Research’s Young Investigator Research Program supports scientists and engineers who have received a PhD or equivalent degree in the last five years and show exceptional ability and promise for conducting basic research.

Media Contact: The Office of Media Relations, UT Dallas, (972) 883-2155, [email protected].
