How machines are learning to read your mood

Can artificial intelligence be emotionally intelligent? In Boston, researchers have programmed BB-8, the little droid from “Star Wars: The Force Awakens,” to detect facial expressions and determine how people are feeling. And that technology is being adapted for marketing, video games, even therapy for children diagnosed with autism. The NewsHour’s April Brown reports.

  • GWEN IFILL:

    Now: developing technology that can better identify your own emotions.

    At a time when people are concerned about what data can be tracked and how it can be sold, it is an advance that clearly raises concerns. But it may also yield some important benefits.

    The "NewsHour"'s April Brown takes a look, part of our weekly series on the Leading Edge of science and technology.

    DAN MCDUFF, Director of Research, Affectiva: You can control the movements of BB-8, the little droid, based on how your facial expressions are changing.

  • APRIL BROWN:

    If this little droid looks familiar, you may have seen him in the most recent Star Wars film. BB-8 moved on his own on the fictional planet Jakku, but here in Boston, Dan McDuff is in charge.

  • DAN MCDUFF:

    It's going to detect my face, and then start to be able to control the robot by making different facial expressions.

  • APRIL BROWN:

    He's taken off. He's not pleased with you.

    The software that allows him to control BB-8 is called Affdex, and McDuff is the director of research for Affectiva, the company that created it.

    But Affdex can do much more than make robots move by making faces. It can also detect expressions to help determine how people are feeling.

  • DAN MCDUFF:

    So, we're using the camera feed to detect where your face is, track the feature points, and identify texture changes on your skin. And when you furrow your brow or smile, there are these distinctive patterns that the computer can recognize.

    And it's actually been trained on hundreds of thousands of people, so it's seen examples of smiles and brow furrows and frowns from many different people.
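
McDuff is describing a classic computer-vision pipeline: find the face in each camera frame, track it, and feed the cropped region to a classifier trained on many labeled examples. Here is a minimal sketch of that structure in Python with OpenCV; the classify_expression function is a hypothetical placeholder for a trained model, not Affdex's actual (proprietary) algorithm.

```python
# Minimal sketch of the pipeline described above: capture frames,
# detect the face, and hand the cropped face to an expression model.
# classify_expression is a placeholder, not Affectiva's software.
import cv2

# Haar-cascade face detector bundled with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_expression(face_img):
    """Stand-in for a model trained on many labeled faces
    (smiles, brow furrows, frowns, ...)."""
    return "neutral"

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        label = classify_expression(gray[y:y + h, x:x + w])
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 8),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    cv2.imshow("expressions", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```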

  • RANA EL KALIOUBY, CEO, Affectiva:

    So, so far, we have collected over 50 billion emotion data points.

  • APRIL BROWN:

    Rana El Kaliouby is Affectiva's CEO and the brains behind the artificial intelligence software.

  • RANA EL KALIOUBY:

    I realized that we were spending so much time with our devices and our technology, yet it had absolutely no idea how we felt or what our mental state was.

  • APRIL BROWN:

    How do you know that this is accurate?

  • RANA EL KALIOUBY:

    Validation is something we take very seriously at Affectiva. We have a team of labelers based in Cairo, and they are certified facial action coders. And so we compare the accuracy of our machine-learning algorithms to the accuracy of these human experts.

    On some of these emotional states, we're approaching expert human accuracy very, very fast.

  • APRIL BROWN:

    Which ones?

  • RANA EL KALIOUBY:

    So for example, with the smiles, with the eyebrow furrows and the eyebrow raises, it's extremely accurate. It gets a little trickier when you look at, you know, lip puckers or lip purses or, you know, squints.

    Emotional intelligence is completely missing from our digital world. And so now there is a lot more understanding of why emotions are important, and a lot more understanding of how this type of technology can disrupt and transform a lot of industries.
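
The validation El Kaliouby describes, scoring the model against labels from certified human facial-action coders, comes down to a per-expression agreement check. A hedged sketch with invented labels:

```python
# Sketch of validating a model against expert human coders: compare
# the model's prediction for each frame to the expert's label, then
# report agreement per expression. These labels are invented.
from collections import Counter

expert = ["smile", "smile", "brow_furrow", "neutral", "smile", "lip_pucker"]
model  = ["smile", "smile", "brow_furrow", "smile",   "smile", "neutral"]

totals = Counter(expert)
hits = Counter(e for e, m in zip(expert, model) if e == m)

for expression in totals:
    print(f"{expression}: {hits[expression]}/{totals[expression]} "
          f"agreement ({hits[expression] / totals[expression]:.0%})")
```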

  • APRIL BROWN:

    The media and advertising industries quickly saw the value.

  • RANA EL KALIOUBY:

    We have partnered with 1,400 brands. We test ads in 75 countries around the world.

    You want to know if your ad is resonating with people before it goes live and before you spend millions and millions of dollars. With our software, you can get a moment-by-moment readout of a viewer's emotional journey.

  • PRESIDENT BARACK OBAMA:

    The nature of our military has changed.

  • APRIL BROWN:

    That's what Affectiva itself did, testing how well the candidates were connecting with viewers during one of the debates for the 2012 presidential election.

    Many other companies have found their own applications for emotion recognition technology.

  • ERIN REYNOLDS, CEO, Flying Mollusk:

    We call Nevermind a biofeedback-enhanced adventure thriller game. Say that five times fast.

  • APRIL BROWN:

    Erin Reynolds is CEO of Flying Mollusk, a company that created the video game Nevermind.

  • ERIN REYNOLDS:

    Nevermind is a very creepy game, a very dark, surreal game. Often, the player will be a little stressed, a little scared, a little anxious when they are playing it. So you as a player have to learn how to stay calm under pressure.

    The more scared or stressed you get, the faster it goes, so it becomes a lot harder to grab the hand.
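
The mechanic Reynolds describes is a feedback loop: a stress estimate from the player feeds back into the game's difficulty. A minimal sketch, where read_stress is a hypothetical stand-in for whatever biofeedback signal the game actually uses:

```python
# Sketch of a biofeedback difficulty loop: the more stressed the
# player reads, the faster the target moves, so staying calm keeps
# the game manageable. read_stress() is a hypothetical sensor hook.
import random

def read_stress():
    """Stand-in for a real signal (heart rate, facial expression)."""
    return random.uniform(0.0, 1.0)  # 0 = calm, 1 = very stressed

BASE_SPEED = 1.0

def target_speed(stress, gain=2.0):
    # Difficulty scales with measured stress.
    return BASE_SPEED * (1.0 + gain * stress)

for frame in range(5):
    s = read_stress()
    print(f"frame {frame}: stress={s:.2f} -> target speed {target_speed(s):.2f}x")
```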

  • APRIL BROWN:

    While Nevermind is purely for entertainment right now, Reynolds hopes the game may eventually help people with post-traumatic stress disorder or anxiety. But the Affdex software is already being used in the medical field to try to improve lives.

  • NED SAHIN, CEO, Brain Power:

    What do you see on screen?

    MATTHEW KRIEGER, 8 Years Old: Mom.

  • APRIL BROWN:

    Eight-year-old Matthew Krieger has been diagnosed with autism.

    LAURA KRIEGER, Matthew's mother: A lot of the trouble he gets into with other kids is, he thinks he's funny and doesn't read at all that he is not, or that they are annoyed or angry.

  • APRIL BROWN:

    Matthew's mother, Laura, signed him up for a clinical trial being conducted by Ned Sahin.

  • NED SAHIN:

    I want to know what's going on inside the brain of someone with autism. And it turns out parents want to know that, too.

    You get points for looking for a while, and then even for looking away and then looking back.

  • APRIL BROWN:

    Sahin's company, Brain Power, uses Affectiva's software in programs Matthew sees through Google Glass. These games are designed to help him understand how facial expressions correspond to emotions and to learn social cues.

  • NED SAHIN:

    One of the key life skills is understanding the emotions of others. And another is looking in their direction when they are speaking.

    You're looking at your mom, and while it's green, you're getting points. When it starts to get orange and red, you slow down with the points.
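
The point system Sahin describes, full points while the indicator is green and fewer as it shifts to orange and red the longer a gaze is held, can be sketched as a small state machine. The thresholds and point values here are invented for illustration:

```python
# Sketch of the gaze-scoring idea: points accrue while the wearer
# looks at the other person, and the indicator shifts green ->
# orange -> red as a continuous gaze stretches on, nudging the
# wearer toward natural look-away-and-back patterns. Thresholds
# and point values are invented.
def indicator(seconds_looking):
    if seconds_looking < 4.0:
        return "green"   # full points
    if seconds_looking < 7.0:
        return "orange"  # reduced points
    return "red"         # no points until the wearer looks away

POINTS = {"green": 2, "orange": 1, "red": 0}

score = 0
# Simulated trace: seconds of continuous gaze at each tick
# (0.0 means the wearer looked away, resetting the timer).
for t in [1.0, 2.0, 3.0, 5.0, 6.0, 8.0, 0.0, 1.0]:
    state = indicator(t)
    score += POINTS[state]
    print(f"gaze {t:>4.1f}s -> {state:<6} score={score}")
```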

  • MATTHEW KRIEGER:

    Am I looking at you?

  • LAURA KRIEGER:

    You are looking at me. Are you looking at me? Do you think you are?

  • MATTHEW KRIEGER:

    Yes.

  • LAURA KRIEGER:

    Yes.

  • MATTHEW KRIEGER:

    Well, I don't know if I am.

  • LAURA KRIEGER:

    You don't know?

  • MATTHEW KRIEGER:

    Because I found out that, if I look this way, it doesn't count. But if I look through — if I look at the screen at you, I can still see your eyes, and it gives me points.

  • LAURA KRIEGER:

    So — but if you tilt your head up and look under the screen, you feel like you are looking at me?

  • MATTHEW KRIEGER:

    Well, I feel like I am both.

  • APRIL BROWN:

    Just a few minutes later, the difference in Matthew's gaze overwhelmed his mother.

  • LAURA KRIEGER:

    I'm going to cry.

  • MATTHEW KRIEGER:

    Why?

  • LAURA KRIEGER:

    When you look at me, it makes me think you haven't really before, because you're looking at me differently.

  • NED SAHIN:

    The brain learns very well by feedback. We don't know for sure yet, but we're going to find out if the kind of feedback we are giving can help people teach themselves these skills.

  • APRIL BROWN:

    After the testing, Laura Krieger reflected on what happened.

  • LAURA KRIEGER:

    It's such a difference. And I have heard him even tell the psychologist, when she works on eye contact, he says: "Well, I have a trick. I look at your forehead."

    And that's probably what people see in passing; it counts for them as eye contact. And it did for me. But this was really different. This was looking at me.

  • APRIL BROWN:

    Allowing a computer to capture and read your emotions for research is one thing, but Affectiva's Rana El Kaliouby says her company is concerned about how this sensitive material is used in all applications.

  • RANA EL KALIOUBY:

    We recognize that your emotional information is extremely personal. And so we have veered away from all use cases where that data is being collected without your consent.

  • APRIL BROWN:

    Is there not a slippery slope? If you are giving the computer the ability to recognize your emotions, is that not one step further towards something that could potentially do something dangerous?

  • RANA EL KALIOUBY:

    I think these conversations are very important. I personally believe we are a long way away from that scenario.

  • APRIL BROWN:

    Still, El Kaliouby sees her artificial intelligence work becoming ubiquitous in the years to come.

  • RANA EL KALIOUBY:

    Fast-forward three to five years. We think our devices and our technologies will all have an emotion chip, pretty much like our devices have GPS or location-enabled apps today.

    Apparently, we check our phones an average of 15 times an hour, and so you can imagine each of those being an emotion data point check-in. And you can track a person's mood.

  • APRIL BROWN:

    And eventually, she hopes, it will help our devices convey more emotion than we can with the current technology.

    For the "PBS NewsHour," I'm April Brown in Boston.
