The Assault on Empathy

This article is part of our special issue “Connected State of Mind,” which explores the impact of tech use on our behavior and relationships.

One morning this past November, with my daughter about to visit and holiday gifts on my mind, I opened my MIT mail and found an intriguing “call for subjects.” It described a research project that promised to use conversation with sociable robots, some of them designed to be children’s playthings (and indeed, marketed as holiday gifts), as a step towards “eliciting empathy.” There it was in black and white, the thing that has been unfolding for decades: The robot presented as an empathy machine—an object that presents itself as worthy of your empathic response, and as having an empathic reaction to you. But objects can’t do this. They have not known the arc of a human life. They cannot put themselves in our place. They can only simulate what a human being might say. They feel nothing of the human loss or love or trouble we are describing to them. Or that they are describing to us.

How did we get to this place of no meaning? In small steps. We’ve talked to Siri and Alexa and Echo about recipes and playlists and pizzas for years and have taken pleasure in extending the conversation to more companionate territory: our dates, our dreams, our troubles with friends and families. So, now, the gates are open to therapy-bots, and nanny-bots, and elder-care bots of all kinds.

Indeed, this year on the cover of Time magazine’s annual “inventions of the year” issue is Jibo, a sociable robot that responds to “Hey, Jibo,” and, according to its programmers, aspires to be a family friend. If Jibo successfully passes into the friendship realm, it also passes an emotional version of the Turing Test, which was designed to measure machine intelligence by determining whether a computer could trick us into thinking it was a person. If it could, we declared it intelligent. But now, with toys for children that declare their love and want to chat, we bring to life a longstanding fantasy that machines might be our companions, that they might seem not only to be smart but also to care about us.

But there, we run into a problem: Simulated thinking may be thinking, but simulated feelings are not feelings, and simulated love is never love. Our “success” in making robots that pretend to empathize involves deception with significant consequences. Children can learn chess from a chess-playing machine; they cannot “learn empathy” from machines that have none to give. On the contrary: they can learn something superficial and think it is true connection.

This empathic deficit is not something that robots can grow out of as they grow more advanced. Human empathy depends on human embodiment and experiencing a human life. A program doesn’t know what it feels like to need to be fed and kept warm, to have parents, to fear hurt, illness, loneliness, or death. I have been in the room when older people who were given sociable robots with “seeming empathy” went on to talk to the robots about what it felt like to lose a child. Roboticists celebrate such moments. But we forget that in these “conversations,” no one is listening. We diminish who we are as people by accepting as empathy what the machine is able to give.

The healing that follows from an empathic encounter comes from the promise of trying to understand and of following through on the responsibilities of that understanding. Simulation can never deliver on this promise. When we offer simulated companions to our elders, we break a generational compact that we will be there for each other. When we offer sociable toys and digital pets to our children, we embark on an experiment in which our children are the human subjects. Will we be honest enough to confront the emotional downside of living out our robot dreams?

Sociable robots are only a small part of our crisis of empathy. Recently, I was asked to consult with the faculty of the Holbrooke School, a middle school in upstate New York, about what they see as a disturbance in their students’ friendship patterns. In her invitation, the dean put it this way: “Students don’t seem to be making friendships as before. They make acquaintances, but their connections seem superficial.”

The case of the superficial acquaintances in middle school was compelling. What I was hearing made sense in light of other things I was hearing in other schools, from other teachers, and from older students. And so, it was decided that I would join the Holbrooke teachers on a faculty retreat. I brought along a new notebook; after an hour, I wrote on its cover, “The Empathy Diaries.”

For that was what was on the minds of the Holbrooke teachers. Children at Holbrooke were not developing empathy in the way that years of teaching had suggested they would. Ava Reade, the dean of the school, says that she rarely intervenes in student social arrangements, but recently, she had to. A seventh grader tried to exclude a classmate from a school social event. Reade called the remiss seventh grader into her office and asked why it happened. The girl didn’t have much to say:

[The seventh grader] was almost robotic in her response. She said, “I don’t have feelings about this.” She couldn’t read the signals that the other student was hurt.

Here is what Reade said:

“These kids aren’t cruel. But they are not emotionally developed. Twelve-year-olds play on the playground like eight-year-olds. The way they exclude each other is the way eight-year-olds would play. They don’t seem able to put themselves in each other’s place. They say to other students: ‘You can’t play with us.’

“They are not developing that way of relating where they listen and learn how to look at each other and hear each other.”

The Holbrooke teachers are enthusiastic users of educational technology. But they believe that they see trouble on its way, and, more than this, indications that harm may have already occurred. It is a struggle to get children to talk to each other in class, to directly address and debate each other. It is a struggle to get them to meet with school faculty. One teacher observes: “[The students] sit in the dining hall and look at their phones. When they share things together, what they are sharing is what is on their phones.” Is this the new conversation? If so, it is not doing the work of the old conversation. As these teachers see it, the old conversation taught empathy. These students seem to understand each other less.

And indeed, studies of college students show what might be called an empathy gap. In 2010, a team at the University of Michigan led by the psychologist Sara Konrath put together the findings of 72 studies conducted over a 30-year period. They found a 40 percent decline in empathy among college students, with most of the decline taking place after 2000. The researchers linked the decline to the presence of mobile technology: young people were taking their eyes off each other and turning them toward their phones.

I was invited to Holbrooke because for many decades I have studied children’s development in technological culture. I began in the late 1970s when a few schools were experimenting with personal computers in their classrooms. I work on this question still, when many children come to school with a tablet or laptop of their own, or one their school has issued.

From the beginning, I found that children used the digital world to play with issues of identity. In the late 1970s and early 1980s, children used simple programming as an expressive medium. A thirteen-year-old who had programmed a graphical world of her own said: “When you program a computer, you put a little piece of your mind into the computer’s mind and you come to see yourself differently.” Later, when personal computers became portals to online games, children experimented with identity by building avatars. The particulars changed with new games and new computers, but something essential remained constant: Virtual space is a place to explore the self.

Also constant across all these years was the anxiety of adults around children and machines. From the beginning, teachers and parents worried that computers were too compelling. They watched, unhappy, as children became lost in games and forgot about the people around them, preferring, for long stretches, the worlds in the machine.

One sixteen-year-old described this refuge: “On computers, if things are unpredictable, it’s in a predictable way.” Programmable worlds could be made exciting, but they offered a feeling of control that some began to call “friction-free.”

From the early days, I saw that computers offered the illusion of companionship without the demands of friendship and then, as the programs got really good, the illusion of friendship without the demands of intimacy. Because, face-to-face, people ask for things that computers never do. With people, things go best if you pay close attention and know how to put yourself in someone else’s shoes. Real people demand responses to what they are feeling in that moment. And not just any response.

Time in simulation gets children ready for more time in simulation. Time with people teaches children how to be in a relationship, beginning with the ability to have a conversation. And this brings me back to the anxieties of the Holbrooke teachers. As the teachers saw it, as their students began to spend more time texting, they lost practice in face-to-face talk. That means lost practice in the empathic arts—learning to make eye contact, to listen, and to attend to others.

Mobile technology and sociable artifacts are both part of a current assault on empathy. Ironically, some hope that technology will also help restore the empathy we may have lost. Take, for instance, the author of the study suggesting that mobile technology is a major force in the suppression of empathy, since most of the decline in empathy scores came in the study’s final decade. Depressed by the result, she set herself to building empathy apps for the iPhone. It is easy to be sympathetic to this response. We want to believe that if technology has created a problem, technology will solve it. But in this case, when our thoughts turn to emotive robots or iPhone apps, we are forgetting the essential. We are the empathy app. People, not machines, talking to each other. Technology can make us forget what we know about life. It is not too late to remember, to look up, look at each other, and start the conversation.

Adapted from Reclaiming Conversation: The Power of Talk in a Digital Age. Copyright © 2015 by Sherry Turkle.