Emotion Reading Technology on the Horizon

⯀ Emotion recognition technology is about much more than machine learning looking at an image of your face. We can learn a great deal about a person’s internal state by pairing all kinds of sensors with machine learning.

Poppy Crum posed a provocative question at the recent TED Conference: Do we actually possess control over what other people see, know and understand about us?

For many it is a scary thought that machines may start knowing how you’re feeling without you wanting them to.

Crum, the chief scientist at Dolby Laboratories, tells of a fast-approaching time when technology will see right through people no matter how hard they try to hide their feelings.

Sensors combined with artificial intelligence can reveal whether someone is lying, infatuated or poised for violence, Crum detailed onstage.

"It is the end of the poker face. We broadcast our emotions. We will know more about each other than we ever have."
“It is the end of the poker face,” Crum said. “We broadcast our emotions. We will know more about each other than we ever have.”

For instance, eye dilation reveals how hard a brain is working, and heat radiating from our skin signals whether we are stressed or even having romantic notions.




The amount of carbon dioxide exhaled can signal how riled up someone, or a crowd, is getting. Microexpressions and chemicals in breath reveal feelings.

“Today’s technology is starting to make it really easy to see the signals and tells that give us away,” she says. In her work, she’s found that we can learn a great deal about a person’s internal state by pairing sensors with machine learning.
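As a rough illustration of that pairing, and not Crum's actual system, the sketch below trains an off-the-shelf classifier on synthetic "sensor" features (pupil dilation, skin temperature, exhaled CO2) to separate a calm state from a stressed one. Every feature, value and label here is a made-up placeholder.

```python
# Minimal sketch: pairing simple physiological sensor features with a
# classifier to estimate an internal state ("calm" vs. "stressed").
# All features and data are synthetic illustrations, not a real pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500

# Hypothetical features per sample: pupil dilation (mm), skin temperature (C),
# exhaled CO2 above baseline (ppm).
calm = np.column_stack([
    rng.normal(3.0, 0.3, n),   # smaller pupils
    rng.normal(33.5, 0.4, n),  # cooler skin
    rng.normal(20, 10, n),     # near-baseline CO2
])
stressed = np.column_stack([
    rng.normal(4.2, 0.4, n),   # dilated pupils
    rng.normal(34.8, 0.5, n),  # warmer skin
    rng.normal(80, 25, n),     # elevated CO2
])

X = np.vstack([calm, stressed])
y = np.array([0] * n + [1] * n)  # 0 = calm, 1 = stressed

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

The point of the sketch is only that weak, noisy signals from several sensors can be combined by a learned model into a single, fairly confident read on a person's state.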

For example, infrared thermal imaging can reveal changes in our stress level, how hard our brain is working, and whether we’re fully engaged in what we’re doing. To make her point, she gave TED attendees a quick fright by showing a startling clip from the horror film The Woman in Black. Using tubes embedded in the theater, she captured the CO2 in the room and showed a real-time data visualization of the CO2 levels that pinpointed the moment the audience collectively jumped.
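To show what such a visualization might compute, the snippet below simulates a room-level CO2 trace and locates the collective exhalation by finding the sharpest rise above the slow background drift. The data, sampling rate and scare timing are synthetic assumptions, not the actual theater measurements.

```python
# Minimal sketch: pinpointing a collective "jump" moment from a room CO2
# time series by finding the peak rate of change. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)

# Simulate 10 minutes of readings at 1 Hz: slow drift plus sensor noise,
# with a sharp exhalation burst around t = 400 s (the assumed scare moment).
t = np.arange(600)
co2 = 450 + 0.05 * t + rng.normal(0, 2, t.size)
co2[400:420] += np.linspace(0, 60, 20)  # collective gasp/exhale

# Smooth the first difference over a short window; the peak marks the event.
window = 5
rate = np.convolve(np.diff(co2), np.ones(window) / window, mode="same")
scare_moment = int(np.argmax(rate))

print(f"estimated scare moment: t = {scare_moment} s")
```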



Brain waves can indicate that someone’s attention is elsewhere in the room, even if their gaze is locked on the person in front of them.

Technology already exists to read such cues, and combined with artificial intelligence that can analyze patterns and factor in context, it can magnify empathy if used for good, or lead to abuse if used to oppress or manipulate, said Crum.

Dolby Laboratories chief scientist Poppy Crum. (Image: AFP-JIJI)

“It is really scary on one level, but on another level it is really powerful,” Crum said. “We can bridge the emotional divide.”

She gave examples of a high school counselor being able to tell whether a seemingly cheery student is having a hard time, or police quickly knowing if someone acting bizarrely has a health condition or is criminally violent.

Instead of scanning profiles on dating apps, one could scan people for genuine interest, and artists may be able to see the emotional reactions people have to their creations.

“I realize a lot of people are having a hard time with people sharing our data, or knowing something we didn’t want to share,” Crum said. “I am not looking to create a world where our inner lives are ripped open, but I am looking to create a world where we can care about each other more effectively.”

With emotion-reading rooms, smart speakers and other accessories on their way, Crum is keen to see rules in place to make sure benefits are equally available to all while malicious uses are prevented.

“It is something people need to realize is here and is going to happen; so let’s make it happen in a way we have control over,” Crum said. “We will be able to know more about each other than we ever have. Let’s use that for the right reasons rather than the wrong ones.”


Source: Japan Times

By 33rd Square



