Understanding your biases

Two WashU researchers who conduct studies on bias and its impacts, Calvin Lai and Clara Wilkins, explain the roots and consequences of bias and how we can potentially reduce it.

If there is one thing you need to know about biases, it is that you have them.

When we see the word “bias” in the news, it is usually in connection with a terrible injustice, like someone being passed over for a job, or worse, targeted by law enforcement because of their gender, race, or nationality. We tend to think of people who behave in biased ways as bad people who take extreme actions to exclude others. No one wants to admit to being biased.

According to researchers in psychological and brain sciences, however, biases are often at least partly unconscious. Despite this, they profoundly impact the way we interact with the world and tend to perpetuate much of the inequality that exists in our society.

The basics: What is bias?

If we want to decrease harmful biases, we first need to understand what bias is. Clara Wilkins, assistant professor of psychological and brain sciences, says that when she teaches bias in the classroom, she breaks it down into three components, often referred to as the “ABCs” of bias. The “A,” or affective component, is prejudice: negative feelings toward a person based on his or her group membership. The “C,” or cognitive component, is stereotypes: generalizations about a group. The “B,” or behavioral component, is discrimination: the actual actions taken against a person based on their group membership. Wilkins’ Social Perception and Intergroup Attitudes Lab studies all three components of bias.

Calvin Lai, assistant professor of psychological and brain sciences, says that although the bias we hear about in the news is usually harmful, bias itself is not always negative. He says, “The way that psychological scientists define bias is just a tendency to respond one way compared to another when making some kind of a life choice.” Sometimes these biases are completely neutral, like a bias for Coke over Pepsi, and can even be helpful in allowing you to make decisions more rapidly.

[Photo: Calvin Lai in the classroom]

Not all biases are so harmless, however. As Lai notes, “Bias can often lead us in directions that we don’t expect, that we don’t intend, and that we might even disagree with if we knew that it was nudging us in a particular way.” These are the kinds of biases that can be harmful when people allow them to impact their behavior toward certain groups, and the biases that his Diversity Science Lab is attempting to redirect through their research.

Wilkins states that most people are hesitant to see themselves as participating in bias, but that we need to be aware that we can behave in harmful ways even if we consciously support equality. She says, “Good people also exhibit bias. I think if we have this image of a racist person as a member of the KKK who does something really really violent, that is going to exclude a lot of acts that actually reinforce social inequality.” Understanding that even “good” people can be biased allows us to be more open to exploring our own biases and to taking steps to avoid acting on them.

“Bias can often lead us in directions that we don’t expect, that we don’t intend.”

Studying unconscious biases

Because so many people are reluctant to admit, and are often even unaware of, their biases, it is difficult for researchers to learn what biases the participants they are studying hold. To counter this problem, researchers have developed something called the Implicit Association Test.

Researchers first developed the Implicit Association Test in 1998 to measure people’s implicit biases by looking at how strongly they associate different concepts with different groups of people. Lai is the Director of Research at Project Implicit, a non-profit that uses online IATs both to collect research data and to inform the general public about bias. He says that the vast amount of data collected through these tests over the last two decades has allowed researchers to track biases and see how certain demographic factors, including a person’s location, age, and race, can shape them.

IATs have consistently shown that people are faster to associate white people with good things and black people with bad things than vice versa, which demonstrates how these pairings are subconsciously linked together in their memories. Researchers have also developed different IATs to test for the associations participants make on the basis of gender, religion, weight, sexuality, age, and a host of other identity categories. If you want to see what one of these tests is like, you can take one yourself at Project Implicit.
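
To make the mechanics concrete, here is a minimal sketch in Python of the logic behind IAT scoring; it is not Project Implicit’s actual code, and the function name and reaction times are hypothetical. Participants respond faster when paired concepts match an association they hold, so the score compares latencies between “compatible” and “incompatible” pairings. The published D-score algorithm (Greenwald, Nosek, and Banaji, 2003) also applies error penalties and trial filtering that are omitted here for clarity.

```python
# Minimal sketch of IAT-style scoring: faster responses in the
# "compatible" condition than the "incompatible" one indicate a
# stronger implicit association between the paired concepts.
from statistics import mean, stdev


def iat_d_score(compatible_ms: list[float], incompatible_ms: list[float]) -> float:
    """Simplified D-score: mean latency difference divided by the
    standard deviation of all trials pooled together. Positive values
    mean the participant was faster on compatible pairings."""
    pooled_sd = stdev(compatible_ms + incompatible_ms)
    return (mean(incompatible_ms) - mean(compatible_ms)) / pooled_sd


# Hypothetical reaction times (milliseconds) for one participant.
compatible = [620, 580, 650, 600, 590, 640]    # easy pairings: fast responses
incompatible = [780, 820, 760, 850, 800, 790]  # hard pairings: slow responses

print(f"D-score: {iat_d_score(compatible, incompatible):.2f}")
```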

Consequences of bias

IATs have a lot to tell us about the possible prevalence and consequences of bias. Lai states that there is a correlation between how people perform on IATs and the way they behave toward different groups: “We do find that these implicit biases do correlate with how people act, and they often do so over and above what people can report, what they can actually say about themselves.” He has worked with a number of different populations to help them better understand their biases and how those biases can lead to certain groups being treated differently in healthcare, academia, and the job market.

Biases have the potential to do the most harm when they are acted on by people in positions of relative power, whether healthcare professionals, employers, or law enforcement officers. We have all seen news stories in which bias led to deadly encounters in situations where people had to make snap judgments about the risk a person poses. In one of his current projects, Lai is collaborating with the Anti-Defamation League to help police officers better understand their biases. His research will help determine whether an educational workshop on bias can change the way law enforcement interacts with different populations.

[Photo: Clara Wilkins in the classroom]

Wilkins also studies how bias manifests in groups with power differentials. She has found that groups that believe the current hierarchy is fair tend to double down on those beliefs and behave in more discriminatory ways when they feel the hierarchy is threatened. In a recent study, Wilkins found that men who held status-legitimizing beliefs were more likely to penalize women when reviewing job resumes after being exposed to an article about men being discriminated against. This finding is particularly troubling because she has also found evidence that, in recent years, men increasingly perceive men to be victims of discrimination, which means that these reactionary behaviors against women might also be increasing.

"It is sort of ironic, your idea about fairness actually leads you to behave in an unfair way."

Wilkins explains status-legitimizing beliefs by saying, “Society is structured where some groups have better access to resources than others, so they have more income, wealth, power, etc. than other groups. Status-legitimizing ideologies are ideologies that make that inequality seem fair and legitimate.” She uses the idea of meritocracy as an example of a status-legitimizing belief: if people believe that hard work always results in success, they will likely see people who are struggling financially as simply not working hard enough and overlook structural inequalities in things like access to education and healthcare. She says, “It is sort of ironic, your idea about fairness actually leads you to behave in an unfair way.”

Wilkins says that opposition to affirmative action is an example of the way status-legitimizing beliefs can make it difficult for people to acknowledge structural inequalities, like the ones illuminated by the recent admissions scandal involving wealthy parents. “There are a lot of people who are opposed to affirmative action because they think it disadvantages people who are not racial minorities, when there are other structural things like donations or legacy admissions or other things that aren’t based just on merit that disadvantage particular groups.”

Wilkins says that some of the backlash we witnessed after Obama’s presidency is a result of perceived threats to the status quo that cause groups with power, like white men, to behave negatively toward groups they see as potentially disrupting traditional power dynamics. This reaction might be driven by zero-sum beliefs, which treat increases in rights or advantages for one group as automatically decreasing advantages for another. These beliefs are often so deeply held that people might not even consciously recognize that they have them, but they can significantly impact behavior.

Avoiding biased actions

So how do we avoid being biased? When it comes to changing your implicit, unconscious biases, like the ones the IAT tests for, research has consistently shown that it is harder than you might think. Lai says, “It does seem that trying to change implicit bias right now directly, trying to reduce these associations that swirl around in our head, that seems really hard. They are built up over a lifetime of experience and it seems that if you can change them, it requires a lot of sustained effort.” In other words, a quick diversity training, while potentially helpful in getting people to start thinking about their biases, is not going to immediately change the way their brains associate white people with good things and black people with negative things.

Wilkins similarly says that she does not believe that progress toward a less biased world is linear. As her research has shown, the societal changes that we might see as progress are often accompanied by backlash when threats to the established order cause people to double down on their biases, whether consciously or unconsciously.

In spite of these somewhat bleak findings, however, both Lai and Wilkins are optimistic that there are things that we can do to reduce biased actions, even if we can’t completely eliminate biased thoughts. Recently, Wilkins has been researching ways to reduce people’s zero-sum beliefs. She is currently working on a study in which she has Christians read a Bible verse that promotes tolerance to illustrate that acceptance of different groups, like LGBTQ individuals, is not incompatible with Christian values. So far she has found that exposing participants to this verse decreases zero-sum beliefs and increases tolerance. Although she does not yet know how permanent these changes will prove to be, she is taking it as a hopeful sign.

For readers who want to behave with less bias, Lai and Wilkins both say that being aware of your bias is the first step. Lai argues for a systematic approach to tracking our own biases: “I think the big thing is, we’re all susceptible to bias. It’s really important to just keep records, track, in your own life where it might be happening, and then design good procedures or habits so you don’t act on them.” He says that there are simple things people can do to keep their biases from influencing decisions, like blanking out names on resumes so that an applicant’s gender or racial identity can’t influence hiring decisions.
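
As a rough illustration of that kind of procedural safeguard, here is a minimal sketch of how names could be stripped from resume text before review. The function, placeholder, and sample resume are hypothetical, not a description of any specific hiring tool.

```python
# Minimal sketch of the "blank out names" safeguard: replace each
# applicant's name in their resume text with a neutral placeholder
# before a reviewer sees it, so the name cannot cue gender or racial
# associations. The applicant data below is hypothetical.
import re


def redact_name(resume_text: str, applicant_name: str) -> str:
    """Replace every occurrence of the applicant's full name, and each
    part of it (first or last name alone), with a neutral placeholder."""
    redacted = resume_text
    for part in [applicant_name] + applicant_name.split():
        redacted = re.sub(re.escape(part), "[CANDIDATE]", redacted,
                          flags=re.IGNORECASE)
    return redacted


resume = "Jordan Smith\nJordan has five years of experience..."
print(redact_name(resume, "Jordan Smith"))
# -> "[CANDIDATE]\n[CANDIDATE] has five years of experience..."
```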

Wilkins also says that we should be more aware of why we are drawn to certain people over others, and that we should go out of our way to avoid acting on these biases. “None of it is an easy solution,” she says. “I think it also requires a lot of motivation… It is not enough to not be racist or sexist. There need to be anti-racist and anti-sexist efforts because these behaviors are so entrenched in our society that it will be difficult to make real, sustained progress.”