Sep 23, 2015

The Emotional Era of Artificial Intelligence

Posted in categories: innovation, robotics/AI

Have you hugged someone, or told them that you love them, today? Maybe it wasn’t a someone at all, but your smartphone, which got an extra squeeze or an extra pat as you slipped it into your pocket. Humans have become increasingly invested in their devices, and a new era of emotional attachment to our gadgets and other AI seems to be upon us. But how does this play out on the other end: will, or even could, AI ever respond to humans in an emotional fashion?

Communication Sparks Emotional Response

AI is broad, and clearly not all AI are meant to give and receive in an emotional capacity. Humans seem prone to respond to features similar to those of their own species, or to entities they can relate to in some communicative way. Most “emotional” or responsive algorithm-based capabilities have been programmed into robots that take a humanoid, or at least mammal-like, form.

Think of androids in customer-service, entertainment, or companion-type roles. There are also robots like PARO, a robotic baby harbor seal used for therapeutic interaction in assisted-living and hospital environments.

In a 2003 paper published in the International Journal of Human-Computer Studies, Cynthia Breazeal cites a study by Reeves and Nass (1996), whose research shows that humans (whether computer experts, lay people, or computer critics) generally treat computers as they might treat other people.

Breazeal goes on to state that humanoid robots (and animated software agents) are particularly relevant, as a similar morphology promotes an intuitive bond based on similar communication modes, such as facial expression, body posture, gesture, gaze direction, and voice.

An Emotional Model for AI

This in and of itself may not be a complete revelation, but how you get a robot to produce such emotional responses is far more complicated. When the Hanson Robotics team programs responses, a key objective is to build robots that are expressive and lifelike, so that people can interact with them and feel comfortable with the emotional responses they receive.

In the realm of emotions, there is a difference between robot ‘responses’ and robot ‘propensities’. Stephan Vladimir Bugaj, Creative Director at Hanson Robotics, separated the two during an interview with TechEmergence. “Propensities are much more interesting and are definitely more of the direction we’re going in the immediate long-term,” he says.

“An emotional model for a robot would be more along the lines of weighted sets of possible response spaces that the robot can go into based on a stimulus and choose a means of expression within that emotional space based on a bunch of factors.” In other words, a robot with propensities would consider a set of questions, such as “What do I think of the person? How did it act in the last minute? How am I feeling today?”. This is how most humans function through reason, though the process happens so habitually and quickly in the subconscious that we are hardly aware of it.
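
As a loose illustration of Bugaj’s description, here is a minimal Python sketch of weighted response spaces. It is not Hanson Robotics’ actual code; the emotional spaces, factor names, and weights below are all invented for illustration.

```python
import random

# Hypothetical emotional "spaces": each holds possible expressions the robot
# could choose from once that space is selected.
RESPONSE_SPACES = {
    "warm":    ["smile", "lean_in", "greet_by_name"],
    "wary":    ["step_back", "neutral_expression", "short_reply"],
    "playful": ["grin", "tell_joke", "mimic_gesture"],
}

def choose_response(rapport, recent_behavior, mood):
    """Weight each emotional space by propensity-style factors, then pick an
    expression within the chosen space.

    rapport         -- "What do I think of this person?"      (0 to 1)
    recent_behavior -- "How did they act in the last minute?" (-1 to 1)
    mood            -- "How am I feeling today?"              (-1 to 1)
    """
    weights = {
        "warm":    rapport + max(mood, 0),
        "wary":    max(-recent_behavior, 0) + max(-mood, 0),
        "playful": rapport * max(recent_behavior, 0),
    }
    # Select the highest-weighted space (a real system might sample
    # probabilistically), then pick a concrete expression within it.
    space = max(weights, key=weights.get)
    return space, random.choice(RESPONSE_SPACES[space])

print(choose_response(rapport=0.8, recent_behavior=0.5, mood=0.2))
```

The point is the structure rather than the numbers: propensity factors weight the emotional spaces, and the concrete expression is chosen only after a space has been selected.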

The context of the immediate stimulus would provide an emotional frame, allowing a robot to have a more complex response to each stimulus. The use of short-term memory would help the robot build a longer-term emotional model. “You think of it as layers, you can think of it as interconnected networks of weighted responses…as collections of neurons, there’s a lot of different ways of looking at it, but it basically comes down to stages of filtering and considering stimuli, starting with the input filter at the perceptual level.”
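
To make those “stages of filtering” concrete, a rough sketch along these lines might look like the following, with all thresholds and decay rates invented for illustration:

```python
from collections import deque

class EmotionalModel:
    """Stages of filtering: perceptual input filter -> contextual framing ->
    short-term memory -> slowly drifting longer-term mood. Thresholds and
    decay rates are illustrative only."""

    def __init__(self):
        self.short_term = deque(maxlen=20)  # last few appraised stimuli
        self.long_term_mood = 0.0           # drifts slowly across interactions

    def perceptual_filter(self, stimulus):
        # The input filter at the perceptual level: drop stimuli too faint to matter.
        return stimulus if stimulus["intensity"] > 0.1 else None

    def contextual_frame(self, stimulus):
        # Immediate context frames the stimulus: a shout during play reads
        # differently from a shout during an argument.
        framed = dict(stimulus)
        if framed.get("context") == "play":
            framed["valence"] = abs(framed["valence"])
        return framed

    def update(self, stimulus):
        filtered = self.perceptual_filter(stimulus)
        if filtered is None:
            return self.long_term_mood
        framed = self.contextual_frame(filtered)
        self.short_term.append(framed["valence"] * framed["intensity"])
        # Short-term memory nudges the longer-term emotional model a little at a time.
        recent = sum(self.short_term) / len(self.short_term)
        self.long_term_mood = 0.95 * self.long_term_mood + 0.05 * recent
        return self.long_term_mood

model = EmotionalModel()
model.update({"valence": -0.6, "intensity": 0.8, "context": "argument"})
print(model.update({"valence": -0.6, "intensity": 0.8, "context": "play"}))
```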

Similar to a human being, robots could have more than one response to a stimulus. An initial reaction or reflex might quickly give way to a more “considered response,” informed by stored and shared information in a neural-like network. Stephan describes a hypothetical scene in which a friend enters a room and begins taking swings at his or her friend. At first, the friend on the defense might react by immediately assuming a fighting stance; however, it might take only a few seconds for him or her to realize that the other person is actually just “horsing around” and being a bit of an antagonist for sport.
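
A two-stage reaction like the one Stephan describes could be sketched, very roughly, as a fast reflex followed by a slower re-appraisal that consults stored knowledge about the other person. The names and labels below are hypothetical:

```python
def reflex(stimulus):
    # Fast, low-level reaction fired before any context is taken into account.
    return "assume_fighting_stance" if stimulus["threatening"] else "stay_relaxed"

def considered_response(stimulus, memory):
    # Slower re-appraisal that consults stored knowledge about the other person.
    if stimulus["threatening"] and memory.get(stimulus["who"]) == "friend_who_horses_around":
        return "relax_and_play_along"
    return "keep_guard_up" if stimulus["threatening"] else "stay_relaxed"

memory = {"alex": "friend_who_horses_around"}   # hypothetical stored information
stimulus = {"who": "alex", "threatening": True} # a friend bursts in, swinging

print(reflex(stimulus))                       # immediate: assume_fighting_stance
print(considered_response(stimulus, memory))  # moments later: relax_and_play_along
```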

This string of events provides a simple way to visualize emotional stages of reaction. Perception, context, and analysis all play a part in the responses of a complex entity, including an advanced robot. Robots with such potentially complex emotional models seem different in kind from AI entities that are merely programmed to respond to human emotions.

The Beginnings of Responsive Robots

These AI don’t necessarily need to take a human-like form (I’m thinking of the movie Her), as long as they can communicate in a language that humans understand. In the past few years, innovators have started to hit the Indiegogo market with domestic social robots such as Jibo and EmoSPARK, meant to enhance human wellbeing through intelligent response capabilities.

Patrick Levy-Rosenthal, founder of EmoShape, envisioned a device that connects to the various electronic objects in our homes and can adjust their function to positively affect our emotional state. “For the last 20 years, I believe that robotics and artificial intelligence failed humans…we still see them as a bunch of silicon… we know that they don’t understand what we feel.”

Rosenthal set out to change this perception with EmoSPARK, a cube-like AI that calibrates with other objects in the user’s home, such as an MP3 music player. The device, according to Rosenthal, tracks over 180 points on a person’s face, as well as the relations between those points: if you’re smiling, your lips will be stretched and your eyes narrowed. The device also detects movement and voice tonality to read emotional cues. It can then respond to those cues with spoken prompts and suggestions for improving mood, for example by asking if its human user would like to hear a joke or a favorite song; it can also process and respond to spoken commands.
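
For a sense of how facial-cue reading might feed mood-improving suggestions, here is a toy sketch. It is not EmoSPARK’s actual pipeline; the landmark features, thresholds, and prompts are all invented, and a real system would track far more points and learn the mapping from data.

```python
def estimate_mood(landmarks, voice_pitch_variance, movement_level):
    """Toy mood estimate from facial-landmark relations plus voice and
    movement cues (feature names and thresholds are hypothetical)."""
    smiling = (landmarks["mouth_width"] > 1.2 * landmarks["neutral_mouth_width"]
               and landmarks["eye_openness"] < landmarks["neutral_eye_openness"])
    agitated = movement_level > 0.7 or voice_pitch_variance > 0.6
    if smiling:
        return "happy"
    return "stressed" if agitated else "neutral"

def suggest_action(mood):
    # Map the estimated mood to a spoken prompt meant to improve it.
    return {
        "happy":    "Keep the current playlist going.",
        "stressed": "Would you like to hear a joke or your favorite song?",
        "neutral":  "No action needed.",
    }[mood]

landmarks = {"mouth_width": 1.0, "neutral_mouth_width": 1.0,
             "eye_openness": 1.0, "neutral_eye_openness": 1.0}
print(suggest_action(estimate_mood(landmarks, voice_pitch_variance=0.8,
                                   movement_level=0.3)))
```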

While robots that respond to humans’ emotional states and requests may soon be available to the masses, robots that have their own emotional models, ones that can “laugh and cry” autonomously, so to speak, remain out of reach for the time being.
