I conducted this interview with Rachel Godsil, director of research at the American Values Institute, about how implicit bias not only affects individuals but society as a whole. The American Values Institute, an Open Society Foundations grantee, is a consortium of researchers from universities across the country and social justice advocates from a wide range of groups and perspectives.
What is implicit bias?
Implicit bias occurs when someone consciously rejects stereotypes and supports anti-discrimination efforts but also unconsciously holds negative associations. Scientists have learned that we have conscious access to only about 5 percent of our brains; much of the work our brains do occurs at the unconscious level. Thus, implicit bias does not mean that people are hiding their racial prejudices. They literally do not know they have them. More than 85 percent of all Americans consider themselves to be unprejudiced, yet researchers have concluded that the majority of people in the United States hold some degree of implicit racial bias.
How does implicit bias manifest itself in our daily lives?
The areas researchers have studied show that implicit bias can affect people’s decisions and their behavior toward people of other races. For example, a doctor with implicit racial bias will be less likely to refer black patients to specialists or may recommend surgery rather than a less invasive treatment. Managers will be less likely to invite a black candidate in for a job interview or to give a positive performance evaluation. Judges have been found to impose sentences up to eight months longer on dark-skinned defendants for identical offenses.
Implicit bias also affects how people act with people of another race. In spite of their conscious feelings, white people with high levels of implicit racial bias show less warmth and welcoming behavior toward black people. They will sit farther away, and their facial expressions will be cold and withdrawn.
These same implicitly biased white people are also more apt to view black people as angry or threatening and to predict that a black partner would perform poorly on a joint academic task. White people with stronger implicit bias against black people actually perform worse on a difficult task after interacting with a black person, suggesting that, without knowing it, they were mentally taxed by the effort of appearing unbiased.
Do these research findings differ from previous studies about racial bias? What were some of your most surprising findings?
Much of this research is surprising to those working for racial justice. To begin with the positive: White people appear to want to be fair and non-discriminatory when they are aware that they may be influenced by race. The study involving doctors showed this clearly; when the doctors were told that race had been shown to influence treatment decisions, all signs of racially different treatment disappeared. Jurors, too, wanted to be fair. In a jury study, four sets of jurors were asked to recommend conviction and sentencing for an assault charge:
- In the first scenario, a black man hits his white girlfriend in a bar.
- In the second, a white man hits his black girlfriend in a bar.
- In the third, the black man says, “How dare you laugh at a black man in public,” before he hits his girlfriend.
- And in the fourth, the white man says: “How dare you laugh at a white man in public.”
White jurors recommended higher sentences for the black man than for the white man in the first scenario, but not in the fourth. In the fourth, race was an explicit issue, and the white jurors clearly wanted to be fair. In the first, race was more subtle, so their implicit biases affected their decision-making.
Our challenges: the levels of implicit bias are very high, and the research is far more developed in measuring bias than effectively changing it. We know that people are less implicitly biased if they are exposed to “counter-stereotypical” individuals, but most white people lead very segregated lives.
How does implicit bias tie into Claude Steele’s idea of stereotype threat?
Stereotype threat refers to a person’s anxiety or fear that their performance on a difficult task will confirm a negative stereotype about their group. Claude Steele was able to illustrate this phenomenon beginning in 1995 by having white and black undergraduates take a difficult verbal test. One group was told that the test was a measure of their verbal ability, while the other was told that the goal of the study was to learn how people experienced test-taking and that their score was not relevant. The students in both groups took the same difficult test, but there was a wide racial disparity in the performance of white and black students when they thought the test was “diagnostic” of their intelligence.
The students’ scores were almost identical when they thought their score was not being measured. Hundreds of other studies have been done to confirm this finding, and it applies to all sorts of groups depending on the context. Implicit bias and stereotype threat are linked because both are a result of the strength of negative stereotypes about race and gender within our culture. And both occur without the individual knowing about them.
How can those working in the field of social justice use these research findings to structure their messaging?
The most important lesson is that if our messages accuse people of being racist, they will do more harm than good to our work. Because the vast majority of people consider racism to be immoral, they will be highly resistant to any message suggesting that they or people like them are racist or biased. Some white people will experience guilt when confronted with a message suggesting that they are racist, but this group is a small minority who are likely to be our allies already. We need to appeal to people’s best selves, to encourage them to act on their conscious egalitarian values, and to create a broader coalition for social justice work.