Unintentional Prejudice? Understanding (and Changing) Implicit Bias
November 3, 2016
Photo Credit: "Think" by Yudis Asnar

On the HBO show Insecure, the main character, Issa, is the only black employee at her workplace, a nonprofit organization that supports at-risk youth. When her coworkers want to know what a new slang term means, they turn to Issa—assuming she will know. When Issa suggests a new program to expose the youth to the arts, her coworkers immediately assume that the students will be most interested in going to an African American museum or a hip-hop version of a Shakespeare play.

Issa’s experience with her coworkers is far from unusual: although her coworkers genuinely want to help at-risk youth, they often end up treating others based on stereotypes, perhaps even without being aware that they are doing so. Psychologists have found that many of us have this tendency: even when we want to be unprejudiced, we often have implicit biases that shape our behavior in a variety of ways. In today’s post, I’ll talk about what implicit bias is, what its consequences are, and ways that we can work to reduce implicit bias.

What is implicit bias? Social psychologists have theorized that prejudiced attitudes can fall into two categories: explicit attitudes—the attitudes we verbalize to others—and implicit attitudes, which we may not even be aware that we have. An implicit attitude can be thought of as a split-second, gut-level reaction to something or someone we encounter. Unlike explicit bias, implicit bias is something we are typically unaware of, and it’s possible to hold implicit biases even if we don’t want to be prejudiced. These implicit biases are thought to be learned from society: for example, we might pick up on certain attitudes by seeing how different groups are represented in the media. This is one of the reasons that implicit bias is so hard to change: because these attitudes develop slowly over time, people often don’t know that they hold them and can harbor implicit biases even when they are genuinely motivated to be non-prejudiced. As the psychologist and NYU professor John Jost and his colleagues write, “The fact is that many people are sincere in holding egalitarian ideals and yet harbor implicit biases.”

Researchers studying implicit attitudes have found that most people have negative attitudes towards marginalized groups, and that members of marginalized groups may hold negative implicit attitudes towards their own group (although these biases are typically less pronounced than the biases held by people outside of their group). Implicit measures have been used to study a variety of attitudes that people hold—for example, beliefs about racial and ethnic groups, attitudes towards LGBT individuals, beliefs about women’s ability in math, and attitudes towards individuals who are overweight. For the purposes of this article, I will focus primarily on studies that have measured implicit attitudes based on race and ethnicity, but many of the techniques suggested in this article for reducing implicit bias could be used to try to reduce bias towards other groups as well.

How can we measure implicit attitudes? So, how do researchers measure implicit bias if these are attitudes that people are sometimes unaware of? One way that psychologists do this is through the Implicit Association Test (IAT), which measures attitudes by looking at people’s reaction times to different words and images. For example, imagine a researcher is trying to develop an implicit test to determine whether you are a dog person or a cat person. In this study, you would be seated at a computer while different words and images appear on the screen. Some would be pictures of dogs, some would be pictures of cats, some would be positive words, and others would be negative words. In one part of the study, the experimenter would ask you to press the “A” key on your keyboard if you see a picture of a dog or a positive word, and the “B” key if you see a picture of a cat or a negative word. In the second part of the study, the pairings would be switched: now, you’re instructed to press “A” if you see a dog or a negative word, and “B” if you see a cat or a positive word. How does this measure attitudes? The IAT assumes that we are faster to make associations that reflect our implicit beliefs. So, if you were faster to respond when “dog” was paired with “positive,” you would be considered more of a dog person, and if you were faster when “cat” was paired with “positive,” more of a cat person. The IAT has been used to measure attitudes in a variety of different domains: for example, it has been used to assess automatic biases related to race, age, gender, and weight. If you’d like to try out the IAT for yourself, you can take a test here: https://implicit.harvard.edu/implicit/takeatest.html.
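The scoring logic behind the dog-vs-cat example above can be sketched in a few lines of Python. This is a toy illustration, not the official IAT scoring procedure: the reaction times below are invented, and real IAT analyses involve additional steps (trial filtering, error penalties, multiple blocks). The basic idea—comparing average response speed across the two pairing blocks, scaled by how variable responses are overall—is what the sketch shows.

```python
import statistics

def iat_style_score(congruent_rts, incongruent_rts):
    """Toy IAT-style score: difference in mean reaction time (in seconds)
    between the two pairing blocks, divided by the pooled standard
    deviation of all responses. A larger positive score means the first
    pairing felt 'easier' (faster) than the second."""
    pooled_sd = statistics.stdev(congruent_rts + incongruent_rts)
    return (statistics.mean(incongruent_rts) - statistics.mean(congruent_rts)) / pooled_sd

# Hypothetical per-trial reaction times, in seconds:
dog_with_positive = [0.62, 0.58, 0.65, 0.60, 0.63]  # block 1 pairing
cat_with_positive = [0.81, 0.77, 0.85, 0.79, 0.83]  # block 2 pairing

score = iat_style_score(dog_with_positive, cat_with_positive)
# A positive score here suggests a "dog person": this participant was
# quicker when dogs shared a response key with positive words.
print(round(score, 2))
```

In a real IAT, the block order is counterbalanced across participants and responses outside a plausible time window are discarded, but the core comparison is the same: which pairing do you respond to faster?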

Why does studying implicit bias matter? Even though people are unaware of the implicit biases they hold, these biases can affect behavior in a variety of ways. In one study, researchers had white participants take the IAT and interact with both a white experimenter and a black experimenter. The researchers found that participants who showed higher levels of bias on the IAT behaved less positively towards the black experimenter than towards the white experimenter (for example, by smiling less when interacting with the black experimenter). In other words, even if people do not want to act in a prejudiced way, unconscious biases can still affect behavior during social interactions.

These types of bias can have consequences outside of the research lab as well. Researchers have found that hiring managers’ implicit biases affect who is called for job interviews. In another study, doctors were given a hypothetical scenario of a patient experiencing chest pain and asked to say whether they would recommend treatment. The researchers found that doctors with higher implicit bias were less likely to recommend treatment for black patients.

Researchers have also conducted studies suggesting that implicit bias may play a role in decisions made by police officers. In one study, police cadets played a video game where they pressed a button to “shoot” if the person depicted in the game was holding a gun, and a different button if the person was holding an innocuous object. In this video game, the participants were instructed to respond quickly, which led them to make errors. The researchers found that participants’ levels of implicit bias affected the types of errors that were made: when the police cadets had higher levels of implicit bias, they were more likely to accidentally shoot the picture of an unarmed black person who appeared on the screen. In other words, even though we don’t intend to hold implicit biases, they can have a variety of real-world consequences—in everyday interactions, in hiring decisions, in medical settings, and in law enforcement.

How can we work to reduce implicit bias? Because we are typically unaware of implicit biases, it can be hard to change them. However, in recent years, psychologists have found that we can indeed change our biases if we are motivated to do so and are given the right set of techniques to use to reduce bias.

In one study, Patricia Devine and her colleagues developed an intervention to reduce implicit bias over the long term. According to Devine, prejudice is essentially a bad habit that people need to learn to break. The first step to changing a habit involves motivation: people need to become aware of their bias and be motivated to change. Consequently, the beginning of Devine’s intervention involves having participants take the IAT, get feedback on their results, and learn about the effects of implicit bias. After this, participants are given a series of five strategies to implement in their daily lives:

  1. Stereotype replacement: Because implicit biases can be hard to recognize, the first step involves becoming aware of the stereotypes people hold. For example, several years ago, the toy company Lego was critiqued for releasing a line of toys that showed women behaving in stereotypically feminine ways (in response, the company developed a new line of toys featuring female scientists). Recognizing the stereotypes we may unconsciously hold then allows us to replace stereotypical responses with less biased ones.
  2. Counter-stereotypic imaging: This technique involves thinking of people who don’t fit stereotypes that we might hold. For example, Mahzarin Banaji, a psychologist who has studied how counter-stereotypic images can reduce bias, describes one such image used in her studies: a picture of a woman who is dressed as a construction worker while also breastfeeding her baby. Banaji explained to NPR why she finds images like this so powerful: “[T]hat image pulls my expectations in so many different directions that it was my feeling that seeing something like that would also allow me in other contexts to perhaps have an open mind.” In other words, seeing images that challenge our expectations may help us to become more open-minded and less biased.
  3. Individuation: Individuation means seeing members of other groups as individuals, rather than judging individuals based on their group membership. A great example of individuation is the photo series Humans of New York: these photographs include a backstory that helps us to see the person photographed as a unique individual.
  4. Perspective taking: Another way to reduce prejudice is to take the perspective of someone from a group different from your own. Recently, a male engineering student wrote an essay describing this type of perspective-taking: he wrote about the obstacles his female classmates faced (such as being told not to study science or being judged based on their gender by teachers). By taking the perspective of someone else, we’re better able to understand and empathize with the obstacles they face, which is important for reducing prejudice.
  5. Contact: Researchers have found that forming connections with people who belong to different groups from us can be a powerful way to reduce prejudice. In one study, for example, white and Latina participants (who were strangers before arriving in the lab) completed a variety of tasks that were designed to increase feelings of friendship. Compared to participants who interacted with someone of their same ethnic group, participants who interacted with someone from a different ethnic group (and who were initially higher in implicit bias) were more likely to initiate social interactions with people from other ethnic groups in the months following the study. In other words, interacting with one person from a different group caused the participants to later try to meet other people who were different from them.

In Devine’s study, participants in the intervention were instructed to put all five of these strategies into practice in their daily lives. Their levels of implicit and explicit bias were then measured over eight weeks (and were compared to participants in another group who were told their IAT score but not given information about how to reduce bias). The researchers found that the intervention significantly reduced participants’ IAT scores, and that participants also became more concerned about discrimination and more aware of their own biases. The researchers also found evidence suggesting that the more we use strategies such as these, the more effective they are: participants who said they were likely to use the strategies showed the greatest reductions in bias. In other words, it appears that we can “break the prejudice habit” and work to reduce biases that we unconsciously hold.

Although many people today genuinely want to be unprejudiced, research suggests that many of us hold implicit biases that we may not even be aware of. These unconscious biases can affect behavior in a variety of domains: they can affect who gets called for a job interview, who receives medical treatment, and even how one is treated by law enforcement officers. However, research has found that, by working to become aware of the stereotypes we hold, by seeing people from other groups as unique individuals, and by taking others’ perspectives, we can indeed work to overcome these biases.

Additional Reading:

Project Implicit. Frequently Asked Questions: https://implicit.harvard.edu/implicit/faqs.html

Project Implicit. About the IAT: https://implicit.harvard.edu/implicit/iatdetails.html

Kirwan Institute for the Study of Race and Ethnicity. Understanding Implicit Bias: http://kirwaninstitute.osu.edu/research/understanding-implicit-bias/

Jost, J. T., Rudman, L. A., Blair, I. V., Carney, D. R., Dasgupta, N., Glaser, J., & Hardin, C. D. (2009). The existence of implicit bias is beyond reasonable doubt: A refutation of ideological and methodological objections and executive summary of ten studies that no manager should ignore. Research in Organizational Behavior, 29, 39-69.

McConnell, A. R., & Leibold, J. M. (2001). Relations among the implicit association test, discriminatory behavior, and explicit measures of racial attitudes. Journal of Experimental Social Psychology, 37, 435–442.

Green, A. R., Carney, D. R., Pallin, D. J., Ngo, L. H., Raymond, K. L., Iezzoni, L. I., et al. (2007). Implicit bias among physicians and its prediction of thrombolysis decisions for Black and White patients. Journal of General Internal Medicine, 22, 1231–1238.

Tuttle, K. M. K. (2009). Implicit racial attitudes and law enforcement shooting decisions. Unpublished manuscript. University of Michigan.

Devine, P. G., Forscher, P. S., Austin, A. J., & Cox, W. T. (2012). Long-term reduction in implicit race bias: A prejudice habit-breaking intervention. Journal of Experimental Social Psychology, 48(6), 1267-1278.

Vedantam, S. (2013, July 19). How To Fight Racial Bias When It’s Silent And Subtle. National Public Radio: http://www.npr.org/sections/codeswitch/2013/07/19/203306999/How-To-Fight-Racial-Bias-When-Its-Silent-And-Subtle

Page-Gould, E., Mendoza-Denton, R., & Tropp, L. R. (2008). With a little help from my cross-group friend: Reducing anxiety in intergroup contexts through cross-group friendship. Journal of Personality and Social Psychology, 95(5), 1080-1094.

About this Contributor: Elizabeth Hopper is a PhD candidate in Social Psychology at the University of California, Santa Barbara.  Prior to attending UCSB, she received her BA in Psychology and Peace & Conflict Studies from UC Berkeley and worked in a research lab at UC San Francisco studying health psychology.  Her research interests include positive emotions, close relationships, coping, and health.  Outside of the research lab, Elizabeth can often be found going to yoga class, teaching her puppy new tricks, and working on her creative writing.
