Robots Reading Feelings
April 26, 2019 | Case Western Reserve University
Researchers boast 98% accuracy for robots recognizing facial cues; could improve video gaming today, health care tomorrow.
Robots are getting smarter—and faster—at knowing what humans are feeling and thinking just by “looking” into their faces, a development that might one day allow more emotionally perceptive machines to detect changes in a person’s health or mental state.
Researchers at Case Western Reserve University say they're improving the artificial intelligence (AI) that now powers interactive video games and will soon enhance the next generation of personalized robots likely to coexist alongside humans.
And the Case Western Reserve robots are doing it in real time.
New machines developed by Kiju Lee, the Nord Distinguished Assistant Professor in mechanical and aerospace engineering at the Case School of Engineering, and graduate student Xiao Liu are correctly identifying human emotions from facial expressions 98% of the time—almost instantly. Other researchers had previously achieved similar accuracy, but their robots often responded too slowly.
“Even a three-second pause can be awkward,” Lee said. “It’s hard enough for humans—and even harder for robots—to figure out what someone feels based solely on their facial expressions or body language. All the layers and layers of technology—including video capture—to do this also unfortunately slows down the response.”
Lee and Liu accelerated the response time by adding two pre-processing video filters to another pair of existing programs, helping the robot classify emotions based on more than 3,500 variations in human facial expression.
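The article does not name the specific filters Lee and Liu used, but the general pattern—chaining lightweight pre-processing filters ahead of a classifier to cut latency—can be sketched in plain Python. The two filter choices below (min-max normalization and a 3-tap smoothing pass) are illustrative assumptions, not the researchers' actual pipeline:

```python
# Sketch of a two-stage pre-processing chain ahead of emotion
# classification. The specific filters here (min-max normalization,
# 3-tap moving-average smoothing) are illustrative assumptions --
# the article does not name the filters Lee and Liu used.

def normalize(frame):
    """Filter 1: rescale raw pixel intensities into [0, 1]."""
    lo, hi = min(frame), max(frame)
    if hi == lo:
        return [0.0] * len(frame)
    return [(p - lo) / (hi - lo) for p in frame]

def smooth(frame):
    """Filter 2: 3-tap moving average to suppress sensor noise."""
    out = []
    for i in range(len(frame)):
        window = frame[max(0, i - 1):i + 2]
        out.append(sum(window) / len(window))
    return out

def preprocess(frame):
    """Chain both filters before handing the frame to a classifier."""
    return smooth(normalize(frame))

# A raw grayscale scanline (0-255) reduced to classifier-ready values.
processed = preprocess([0, 128, 255, 255, 128, 0])
```

In a real system the same chain would run per frame on full 2-D images (typically via a vectorized library such as OpenCV), which is where keeping each filter cheap pays off in response time.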
But that’s hardly the extent of our facial variation: Humans can register more than 10,000 expressions, and each person has a unique way of revealing many of those emotions, Lee said.
But “deep-learning” computers can process vast amounts of information once those data are entered into the software and classified.
And, thankfully, the most common expressive features among humans are easily divided into seven emotions: neutral, happiness, anger, sadness, disgust, surprise and fear—even accounting for variations among different backgrounds and cultures.
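A deep-learning classifier for this task typically ends in a seven-way output layer, one score per emotion category. A minimal sketch of that final step—converting raw network scores into a predicted label via softmax—might look like the following; the scores are made up for illustration, and the network that would produce them is not shown:

```python
import math

# The seven emotion categories named in the article.
EMOTIONS = ["neutral", "happiness", "anger", "sadness",
            "disgust", "surprise", "fear"]

def softmax(scores):
    """Turn raw network scores into a probability distribution."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def predict(scores):
    """Return the most probable emotion and its probability."""
    probs = softmax(scores)
    best = max(range(len(probs)), key=probs.__getitem__)
    return EMOTIONS[best], probs[best]

# Hypothetical scores from a network's final layer: index 1
# ("happiness") dominates, so that label is predicted.
label, prob = predict([0.1, 4.2, 0.3, 0.2, 0.1, 1.1, 0.4])
```

Fixing the output space to seven shared categories is what makes the problem tractable across backgrounds and cultures: the network only has to rank seven options per frame, not distinguish all 10,000-plus expressions.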
Lee presents to a crowd at the opening of the new Smart Living Lab at Ohio Living Breckenridge Village in Willoughby, Ohio.
This recent work by Lee and Liu, unveiled at the 2018 IEEE Games, Entertainment, and Media Conference, could lead to a host of applications when combined with advances by dozens of other researchers in the AI field, Lee said.
The two are also now working on another machine-learning-based approach for facial emotion recognition, which has so far achieved over 99% accuracy with even higher computational efficiency.
Someday, a personal robot may be able to accurately notice significant changes in a person through daily interaction—even to the point of detecting early signs of depression, for example.
“The robot could be programmed to catch it early and help with simple interventions, like music and video, for people in need of social therapies,” Lee said. “This could be very helpful for older adults who might be suffering from depression or personality changes associated with aging.”
Lee is planning to explore the potential use of social robots for social and emotional intervention in older adults through collaboration with Ohio Living Breckenridge Village. Senior residents there are expected to interact with a user-friendly, socially interactive robot and help test accuracy and reliability of the embedded algorithms.
Another Future Possibility
A social robot that learns the more subtle facial changes of someone on the autism spectrum—and that helps “teach” humans to accurately recognize emotions in each other.
“These social robots will take some time to catch on in the U.S.,” Lee said. “But in places like Japan, where there is a strong culture around robots, this is already beginning to happen. In any case, our future will be side-by-side with emotionally intelligent robots.”