For much of her 16 years, Starr has lived a double life. At school, Starr is what she refers to as “Starr version 2.” As one of the only black students at Williamson Prep, an elite high school outside of her predominantly black neighborhood, she “doesn’t give anyone a reason to call her ghetto.” She’s not proud of it, but it helps her to fit in. On the weekends everything is different. In Garden Heights, she is able to be herself.
At a recent party in Garden Heights, she ran into Khalil, her best friend from childhood. Just as they started to catch up on old times, gunshots rang out, and everyone began to run. Khalil and Starr ran for Khalil’s car.
As they drove home, Khalil pulled over so that he could look Starr in the eyes while he explained to her the meaning of THUG LIFE (The Hate U Give Little Infants Fucks Everybody—a reference to Tupac Shakur’s tattoo of the same acronym, which he said represented the idea that when children are exposed to violence, that violence finds its way back into society when they become adults). They also reminisced about the games they played as children and the first kiss they shared; they leaned in to kiss again. They started driving, but got less than a hundred feet before police lights flashed behind them. Khalil pulled over, and within moments a white police officer was standing over them. They braced for what was to come. From a young age they had been told that getting stopped by the police is inevitable, and that when it happens, you have to behave a certain way to keep safe.
Khalil asked why he had been stopped and argued that he didn’t have to turn the music off, since he could hear the officer over it. Immediately things began to escalate. First the officer yelled at Khalil to get out of the car, threatening to do it for him if he didn’t do it himself. Then, as the officer patted Khalil down, he asked questions such as, “Where did you find her?” and “You looking for a score tonight?”
When the officer went back to his cruiser, Khalil leaned into the car to check on Starr. That’s when it happened—the officer shot him. Starr immediately jumped out of the car to check on her friend, but the officer placed her in handcuffs and began radioing for backup, referring to both Starr and Khalil as suspects. He thought he’d seen a gun in Khalil’s hand. Frantically, he began to look for it. The object in question was a hairbrush.
This is a scene out of the 2018 film The Hate U Give, based on the novel of the same name by Angie Thomas. But, as the cases of Philando Castile, Michael Brown, and Sandra Bland show, it might just as easily have happened in real life. According to the organization Mapping Police Violence, in 2015 unarmed blacks were killed by police at five times the rate of unarmed whites.
What was it about Khalil that made the police assume that he was looking for drugs? Or that the thing in his hand was a gun? Was it the color of his skin? More broadly, why is it that when we see certain people, we automatically form expectations about them?
According to Stanford University professor of psychology and MacArthur Fellow Jennifer Eberhardt, it comes down to implicit bias, which she defines as “the beliefs and the feelings we have about social groups that can influence our decision making and our actions, even when we’re not aware of it.”
Drawing on her nearly three decades of work on bias as a researcher and advisor to police forces across the country, Eberhardt has penned a new book, Biased: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do.
I recently had the opportunity to speak with Eberhardt about her work. We discussed how implicit biases are formed, why shows like Grey’s Anatomy and CSI can increase negative biases, how a popular neighborhood app reduced racial profiling by over 75 percent, and why body-worn cameras may actually increase the divide between police officers and the communities they work in.
She also stressed that “you don’t have to be a bigot to be biased. You don’t have to be a bad person. You can have these biases that are triggered that can have real devastating impacts despite your intentions and despite your desires.” What is important, she argued, is being conscious of the circumstances that make us vulnerable to them, and working to mitigate those factors. The following is our conversation, edited for length and clarity.
Ilaria Schlitz: In your book you write about a German man who was working in the United States as a police officer. He came to speak to you after one of your trainings and told you about an interesting and unsettling experience. What does his story tell us about how biases are formed?
Jennifer Eberhardt: He had been in the country for only a couple of years when he decided to join the police department. He was in a city that was pretty diverse, and there was a strong association there between crime and race, and with African American men in particular. This was something he had never really thought much about in Germany. It was coming to the States that exposed him to it and forced him to confront it. He could see it really changing how he thought: who he would pay attention to, who he was fearful of, who he thought he needed to be careful around, all of those things. Even when he wasn’t intending to give additional scrutiny to African American men, he found himself checking these men out without consciously thinking about it, having simply learned the association. He would catch himself later. It was quite interesting, because he described it as though his mind had started making these associations against his will and out of his control, and he could see the power of it developing.
Is it possible to avoid forming these biases? How do we use our positive explicit values to combat the negative implicit biases that we develop just from being in the world that we live in?
There are these biases that we learn from an early age. It’s something that you can absorb fairly quickly and, in the police officer’s case, practice over and over again, to the point where it becomes more automatic and less controllable.
We’re all learning these biases, even the people who are affected by them. Black men, for example, show the same kind of implicit bias that white people show, even when the bias is about them. A lot of that learning has to do with growing up in a culture that emphasizes those associations.
You don’t have to be a bigot to be biased.
I tell the story of one of my sons, when he was a first grader and was picking up on this idea that black people were seen differently, and he didn’t quite know what it was. I asked him to think about a situation where he felt that was the case and he said, “Oh, when we were at the grocery store I saw a black man come in.” We were living in a neighborhood at the time that was mostly white, so having a black man come in was not the normal thing that would happen all the time. My son noticed how people stood a little further away from the man, and then he was telling me how when this black man got into the line to make a purchase, his line was the shortest line for a little bit.
My son saw all this as a first grader, and I asked him what he thought it meant. He had to think about it for a moment, but he figured it out fairly quickly. He looked up at me and he said, “I think it’s fear.” I didn’t have to tell him what the stereotype was; I didn’t have to express that stereotype myself. Just by watching how other people move through the world and how we treat one another, you can come to understand a person’s place in society, pick up on the associations that are out there, and even start to enact them yourself.
You’ve spent much of your career working with police departments across the country. Have you noticed a change in attitude or receptiveness over the years?
Initially, the idea of looking at unconscious or implicit bias was appealing to a lot of police departments, especially to people on the executive team, the chiefs and assistant chiefs and so forth, because it allowed them to look at an issue they thought was important, one that was affecting members of their department, without having to go through a lawsuit and without having to be formally monitored.
When the stakes are high, it puts the department in a situation where they feel more vulnerable. They were worried about the role that race could play in decision-making in policing, and so this seemed like a safe way to try to understand it and then be able to work against it. I think sometimes police officers feel like they’re being vilified, that they’re seen as racist and bad actors. Talking about implicit bias was appealing for that reason. You didn’t have to accept yourself as a villain or as a monster to actually understand and admit that you could have these implicit biases. There was a window where officers would actually listen, could hear what you were saying, and could think about it and learn from it.
More recently, the climate has changed, and things are a lot more polarized than they used to be. Even the way that people use the term implicit bias has changed. Oftentimes, they use it as though they’re talking about people who are bigoted, people who are what researchers would consider to be old-fashioned racist. And so there’s been a blurring of the term. Many officers feel like this implicit bias is just a new way of calling someone a racist, which is actually not how researchers use it. But maybe in some spaces it is starting to get used that way. Now there’s more of a reaction against it than there was before. There’s more tension when you talk about it. There’s sometimes less openness to think about the issues and how it might affect police work.
Do you think that the increased use of social media and phone cameras has affected these attitudes?
I think it’s affected attitudes in a way where we’re more aware of things that are happening on the ground than we used to be. But it also leads to more polarization. We can look at the same footage today from body-worn cameras and see different things. Police officers might see and understand why an officer took this action and not that one, whereas a community member might see the same footage and feel otherwise, horrified by what they see. In certain circumstances, the footage doesn’t bring us closer together; it can move us further apart.
I’ve actually used footage in my own work along with a number of other Stanford researchers. We’ve been using it to look at everyday, routine interactions of the police with the community, and trying to understand how those interactions unfold. We’ve done some work looking at the language that officers use in routine traffic stops with black versus white drivers. We’re finding that the officers are professional overall, but they speak less respectfully to black drivers than to white drivers. The language that you use can either inspire trust in the law and trust in you as an officer or it can lead to the opposite. And so it’s a place where people start to form opinions about the police and also pick up some understanding about how the police feel about them.
Another example you bring up in your book is research on shows, such as Grey’s Anatomy and CSI, that reveals how even positive portrayals of black individuals can lead to greater negative bias. Could you go into that study and explain why that is the case?
This is recent research that was conducted by Max Weisbuch, Kristin Pauker, and Nalini Ambady at Tufts. They were interested in how subtle bias can get communicated on TV through the behavior of other actors. They found that even on shows with positive black characters, like Grey’s Anatomy or CSI: Miami, there was still evidence of bias. The black characters were treated with more bias, in terms of the nonverbal behavior of the other actors on set, than the white characters were. It could be how the other actors stood, how they leaned in or out, or whether there was a slight grimace or a frown.
The second thing that they found was that the viewers of these shows could actually pick up these biases. There’s evidence for this bias contagion. By watching these shows you can become more biased. The more you watch the show, the higher your score on these implicit bias tests. That’s not to say that it’s a bad thing to have strong and positive role models that you can see on these popular shows. It just means that it’s not the only solution.
With all this knowledge about biases, what do we do next?
When people understand that bias can be in all of these places, like the criminal justice system, our neighborhoods, our schools, our workplaces, and even hospitals, they tend to get overwhelmed and feel like, “If it’s everywhere, and if it’s unconscious, how could I fight against it?” But really, understanding the situations that make us more or less vulnerable to bias, that’s key. Even though we’re vulnerable, we’re not always acting on it. We don’t have to be held in its grip. Understanding which situations allow us to behave and make decisions in ways that are free of bias, that’s a good thing. That’s what we would do next.
The harder thing is not just to manage the situations that are going to bring the bias alive or have it have a negative impact, but also to figure out how those associations are created in the first place and to try to address that.
The harder thing is…to figure out how those associations are created in the first place and to try to address that.
One of the associations that is most damaging to African Americans is this link to criminality. It affects people not only in the criminal justice space, but it also determines who neighbors are going to see as suspicious, who teachers are going to see as troublemakers, and who is going to look angry at work. We need to try to figure out how to address those associations head on. How do we change the narratives about who certain social groups are and what they’re capable of? It would be good to start those conversations.
In the U.S. especially, helping people to understand that there are lots of reasons for a disparity—that it’s not just, “Those people did something bad and they made bad choices.” Helping people to see that your behavior can be a function of your social environment; that the disparities that we see can be a function of the policies that have been created. Helping people to understand that would go a long way in trying to change the narrative about the negative traits that people believe certain social groups have.
Do you have examples of how people have tried to curb bias?
Let’s take the tech world. I don’t know if your readers would know about Nextdoor. It’s an online platform for bringing neighbors together, and the idea is to build neighborhood communities, places that are happier and healthier and so forth.
Nextdoor was built on this model of bringing people together, but the cofounders and others found that people were using the platform to racially profile. They would look out their window, see a black man in an otherwise white neighborhood, go to their computer, and start to communicate with their neighbors through the platform about the suspicious man, and sometimes even call the police. This sometimes led to arguments within the neighborhood about profiling.
One of the cofounders, Sarah Leary, reached out to me because she knew that I study bias. She wanted to figure out how to curb racial profiling on the platform. She reached out to others as well, and also consulted the social psychology literature on bias to try to get a handle on it. She found that slowing down is really helpful: people tend to act on biases a lot more when they’re forced to make decisions quickly. She realized that when neighbors saw a suspicious person, they felt they were in a heightened state, under threat, and that they had to act quickly to alert everyone and keep them safe. That’s exactly the kind of situation where bias can kick in. [Leary] had to contemplate how to slow people down, which was a big decision for the company. Slowing people down is typically not what you want to do in the tech industry, where products are built so you can use them quickly, and everything’s fluid and intuitive and you don’t have to think very much. But those are the very conditions under which bias gets activated.
They thought about whether they wanted to slow people down, and for them that would mean adding friction. Normally you’re trying to create this frictionless product, so they worried about that because they thought people would maybe not use the platform as much. They decided, though, that the issue was important enough to them to try to slow people down anyway.
How they slowed them down was fairly simple. It used to be that if you saw something suspicious, you would hit a tab called Crime and Safety, and then you would just have at it, alerting all the neighbors: “Suspicious black man in the neighborhood.” Now, when you hit the Crime and Safety tab, you have to go through a checklist. You have to answer some questions before you start alerting everyone. The first question is: What is it about this person’s behavior that makes them suspicious? It can’t be “black man”; it can’t be a social category that makes you suspicious; it has to be something that they’re doing. So it redirects users to identify the behavior, not the race, as suspicious.
“If you see something suspicious say something specific.”
Secondly, they’re asked to describe the person, and here again it can’t be a description based purely on a social category, which was typically “black man” when people saw someone suspicious. It has to be more detailed, so that users don’t sweep everybody into the same category, where every black man, say, becomes suspicious.
The third thing they did was to define racial profiling for people. A lot of people didn’t know the definition, and they were also told that profiling is prohibited on the site. So they gave people a strong social norm: this is not acceptable; this is not how you can behave on this platform.
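To make the mechanism concrete, here is a minimal sketch of how a friction-adding checklist along these lines might be implemented. It is an illustration only, not Nextdoor’s actual code: the SuspicionReport type, the validateReport function, and the toy social-category check are all hypothetical.

```typescript
// Hypothetical sketch of a friction-adding checklist, loosely modeled on
// the three steps described above. None of these names come from Nextdoor.

interface SuspicionReport {
  behavior: string;                     // step 1: what the person is doing, not who they are
  description: string;                  // step 2: details beyond a social category
  profilingPolicyAcknowledged: boolean; // step 3: the norm-setting acknowledgment
}

// A toy stand-in for whatever check a real platform might run to catch
// a description that is nothing but a social category.
const SOCIAL_CATEGORY_ONLY = /^(black|white|asian|hispanic)?\s*(man|woman|person|guy|kid)$/i;

function validateReport(report: SuspicionReport): string[] {
  const errors: string[] = [];
  if (report.behavior.trim().length === 0) {
    errors.push("Describe what the person is doing that seems suspicious.");
  }
  if (SOCIAL_CATEGORY_ONLY.test(report.description.trim())) {
    errors.push("A description cannot be a social category alone; add specifics such as clothing or vehicle.");
  }
  if (!report.profilingPolicyAcknowledged) {
    errors.push("You must acknowledge that racial profiling is prohibited on this platform.");
  }
  return errors; // an empty list means the report may be posted
}

// Example: this report would be blocked on two counts.
console.log(validateReport({
  behavior: "",
  description: "black man",
  profilingPolicyAcknowledged: true,
}));
```

The specific checks matter less than the friction itself: before an alert goes out, the user has to stop and articulate a behavior, which is exactly the slowing down that makes bias less likely to drive the decision.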
You know that sign you see at airports and things where it says, “If you see something, say something”?
Yes.
So they tried to modify it: “If you see something suspicious, say something specific.” They found that just by adding this checklist, they were able to decrease racial profiling on the platform by over 75 percent.
Wow.
It’s a small intervention, but it made a big difference. Another point I should make is that a lot of how we combat bias as individuals and as institutions is similar. We can decide to slow ourselves down so that we’re not acting on our biases, and our institutions can do the same, whether they’re tech companies or police departments or schools. Those institutions have way more power than we do as individuals. I think Nextdoor is in something like 95 percent of the neighborhoods across the U.S. now, so putting up that simple checklist is actually shaping millions of people at once, not just one person or one life. It can make a big difference.