The Future of Robotics: Artificial Empathy

This piece was commissioned in March 2016.

Artificial intelligence is advancing rapidly. Often taking cues from human methods of learning and problem-solving, programmers are making AI more efficient and flexible at designated tasks, such as beating a human being at the board game Go, treating patients with lung cancer, composing music, and even writing its own machine learning programs. But in a world where we are forming increasingly intimate relationships with technology, AI still needs to improve at one of its most crucial tasks: interacting with people.

Artificial empathy is the new frontier for machines to create more meaningful relationships with their human users. If we feel that computer programs can identify with our mental and emotional states, we will start seeing them less as tools and more as companions. To some, this may sound like a dystopian nightmare, but once AI takes on a more intuitive, emotional role in our lives, it is only a matter of time before artificially intelligent programs become our confidants, caretakers, and even our therapists.

In general, automation occurs when humans learn how something works, break down the process into its most fundamental steps, and then teach that process to machines. But what happens when we do not know how something works, as is the case with human empathy, and our imagination of what machines should be able to do surpasses what we can actually teach them?

Remarkably, the way machines are learning how to emulate one of our most human qualities is not so different from the way machines learned to automatically fill our calendars or make a cappuccino.

What is empathy?

One way of breaking down human empathy is through the following steps:

1. We sense an emotion, or recognize an emotion in another person.

2. We feel the emotion to a degree, experiencing that emotion ourselves.

3. We behave in a way that communicates to the other person that we feel and understand what they are going through.
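Purely as an illustration, the three steps above could be sketched as a toy program. Everything here — the signal format, the "mirroring" step, the canned replies — is a hypothetical simplification, not how any real system works:

```python
# Toy sketch of the three-step empathy loop described above.
# All names, signals, and responses are invented for illustration.

def recognize(signal: dict) -> str:
    """Step 1: map observed cues to an emotion label."""
    if signal.get("tears") or signal.get("voice_pitch", 0.0) > 0.8:
        return "distress"
    if signal.get("smile"):
        return "joy"
    return "neutral"

def mirror(emotion: str, intensity: float = 0.5) -> tuple:
    """Step 2: 'feel' the emotion to a degree (a damped copy)."""
    return (emotion, intensity)

def respond(felt: tuple) -> str:
    """Step 3: communicate that the feeling is understood."""
    emotion, _intensity = felt
    replies = {
        "distress": "That sounds really hard. I'm here with you.",
        "joy": "That's wonderful news!",
        "neutral": "Tell me more about how you're doing.",
    }
    return replies[emotion]

def empathize(signal: dict) -> str:
    return respond(mirror(recognize(signal)))
```

The hard part, of course, is that in humans none of these steps is a lookup table — which is exactly why the neuroscience below remains unresolved.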

Although this process develops naturally in humans during childhood, there is no consensus on the neurobiological steps that produce emotions, or on how we might recreate them in AI programs. Studies have shown that certain brain areas, such as the anterior insular cortex, become active when a person is engaged in empathic behaviors. Identifying a brain area, however, is a far cry from understanding how that area functions or how it communicates with the rest of the brain to produce a behavior. While research continues to shed light on how the brain produces emotion, our most human psychological processes remain mysterious emergent properties of our most complex organ.

“I can hear it in your voice”

Robotics developers are not waiting around for neuroscientists to crack the code of human emotion; they are tackling the problem themselves.

Software that can recognize human emotions has attracted a lot of attention because of its potentially enormous impact. In the marketing industry, AI that recognizes your emotional response to products can also identify the ones you are more likely to purchase. In the health industry, where the emotional burden on caretakers is disproportionately high, robots could help provide companionship to the elderly or disabled. In addition, programs such as Amazon's Alexa and Apple's Siri could vastly improve their integration into our lives if they served more as companions than as personal assistants that are often frustrating and, in the worst sense of the word, robotic. With more emotional intelligence, these programs could even identify and counsel those suffering from mental health problems, potentially saving lives.

Using machine learning, artificial intelligence programs can track our facial expressions and vocal cues to pick up on our emotional state. Unsurprisingly, the ability to engage with users' moods is spawning a slew of new applications and start-ups. For example, Ellie, a program created by the University of Southern California's Institute for Creative Technologies, provides a helpful starting point for evaluating veterans returning from war for post-traumatic stress disorder, especially in cases where individuals are more likely to open up to a computer program than to an actual person. An app called Cogito Companion similarly estimates mood from the tone, tension, and pace of your voice, as well as how socially and physically active you are by monitoring calls, texts, and general movement, providing data to accompany therapy and improve mental health. In the domestic gadget sphere, a device called MoodBox recognizes your mood and responds by creating the appropriate ambiance, selecting music and lighting to match.
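At its simplest, mood classification of this kind boils down to comparing measured features against learned examples. The sketch below is a hypothetical, heavily simplified version: the features (pitch, tension, pace), the mood labels, and the centroid values are all invented for illustration — real systems learn such representations from large labeled datasets:

```python
import math

# Hypothetical sketch: classify mood from vocal features by finding
# the nearest learned "centroid". Features are normalized to 0..1.
# These centroids are invented; a real system would learn them.
CENTROIDS = {
    "calm":     (0.3, 0.2, 0.4),   # (pitch, tension, pace)
    "stressed": (0.7, 0.8, 0.8),
    "sad":      (0.2, 0.5, 0.2),
}

def classify_mood(pitch: float, tension: float, pace: float) -> str:
    """Return the mood whose centroid is closest to the sample."""
    sample = (pitch, tension, pace)
    return min(
        CENTROIDS,
        key=lambda mood: math.dist(sample, CENTROIDS[mood]),
    )
```

A high-pitched, tense, rapid voice sample would land nearest the "stressed" centroid; a low, slow one nearest "sad". The engineering challenge lies almost entirely in extracting reliable features from raw audio, not in this final comparison step.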

Going beyond “I’m sorry for your loss”

In order to create the perception that users' emotions are being understood, programs can mimic empathic behavioral responses. Drawing on one of AI's most classic ways of emulating human intelligence, the chatbot Koko catalogs human-generated empathic responses. When people approach Koko in a state of emotional discomfort, Koko redirects the user's input anonymously to a volunteer member of the Koko team, and if the volunteer's response receives a "thank you", Koko adds it to a database of successful, empathic responses that can be reused later in similar scenarios. According to its developers, only 1 in 10 of Koko's responses is currently human-moderated, coming remarkably close to a machine having unique, empathic conversations with human beings.
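The human-in-the-loop mechanism described above can be sketched in a few lines. This is a hypothetical reconstruction of the idea, not Koko's actual implementation: a catalog reuses a past reply when an incoming message resembles one it has seen before, and a volunteer's reply is cataloged only if the user's feedback contains a "thank you":

```python
import difflib

# Hypothetical sketch of a Koko-style response catalog. Class and
# method names are invented; routing to a volunteer is left out.
class EmpathyCatalog:
    def __init__(self, match_threshold: float = 0.6):
        self.responses = {}  # past message -> successful empathic reply
        self.match_threshold = match_threshold

    def lookup(self, message: str):
        """Return a cataloged reply for a similar past message, if any."""
        matches = difflib.get_close_matches(
            message, self.responses, n=1, cutoff=self.match_threshold
        )
        return self.responses[matches[0]] if matches else None

    def catalog(self, message: str, reply: str, user_feedback: str):
        """Keep a volunteer's reply only if the user was thankful."""
        if "thank you" in user_feedback.lower():
            self.responses[message] = reply
```

As the catalog grows, more incoming messages find a close enough match, and fewer need to be routed to a human — which is how a mostly automated system can still speak with a human voice.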

In 2016, Koko raised $2.5 million in funding, and will likely be employed in Alexa and Siri in the near future. In conjunction with emotion recognition software, your car or your smartphone will soon be able to sense when something is wrong, ask you how you are feeling, respond empathically and try to cheer you up.

Can robots really have empathy?

Robotic imitation of human empathy raises a lot of questions. Without experiencing emotion, is robotic empathy a form of deceit? Does it matter whether robots experience "real emotions" for them to be effective companions or caretakers? Could a mechanism be developed for AI to experience its own emotions? And if machines respond to emotions just as convincingly as humans do, how can we be sure robots have not already achieved the capacity for emotion?

As artificial empathy becomes a more tangible reality, we have to consider how our emotional connection with technology will impact our society, our human relationships, and our individual psychology, for better or worse.
