The Neural Basis of Implicit Bias — 5/9/17

Pratik Sachdeva
Published in nbycreads
May 3, 2017

Humans are categorizing machines. Try this exercise: the next time you see a person on the street, try to guess aspects of their personality or background (gut reactions only — don’t think too hard about it).

Maybe you saw an Asian male walking on Berkeley’s campus and inferred that he’s a math major. Naturally, we’d say that your inference was rooted in the stereotype that Asians are good at math. You might defend yourself — “He was wearing a Berkeley Mathematics shirt!” — but who’s to say it was his shirt? Maybe he borrowed it from his girlfriend (or boyfriend?), and they’re the actual math major. Why did you assume he’s male, or Asian, in the first place? Didn’t you infer that about him (her?) too?

We rely on making subconscious categorizations like these all the time. It’s efficient, and we’re really good at it. Our brain seems to be fine-tuned to automatically search for associations (and thus impose categorizations) by combining perception with prior knowledge. So, in the above example, you perceived the person on Berkeley’s campus to be wearing a Berkeley Mathematics shirt. Prior knowledge tells us that most people wear their own clothes and, furthermore, that their clothes reflect what they identify with. Thus, the person had to be a Berkeley math student. The key point is that you didn’t consciously weigh the perception or the prior knowledge — you just knew.

But what happens when our prior knowledge is incomplete, or biased, or flat out incorrect? This influences our subconscious categorizations and therefore decision-making, which can affect our interactions with other people. For example,

  • An implicit association of women with administrative work might lead a manager to delegate secretarial work to a woman during a group meeting;
  • An implicit association of Asian people with unathleticism might influence how a group of people divide up teams for a pick-up game;
  • Or an implicit association of black people with criminality might lead a police officer to assume that a black person is reaching into their pocket for a gun (rather than, say, a cell phone), leading the officer to fire at them.

None of the people in these scenarios made their decisions in an effort to actively discriminate — but their implicit associations led to actions that had similar effects. Reality is hazy and complicated, while decisions are binary. Any biases in our prior knowledge can flip a decision one way or the other.

This week’s readings cover work in neuroscience that aims to identify where implicit attitudes arise in the brain and how they affect our decision-making.

All Readings

(1) First, take the Implicit Association Test (IAT) — I recommend the Race IAT. Then, read an introduction to implicit bias and the IAT in Section 1 of the linked Stanford Encyclopedia of Philosophy article.

(2) Warm-up: read the following Mother Jones article on implicit bias (disregard the terrible title).

(3) “The Neural Basis of Implicit Attitudes” by Stanley et al., the main reading for this week.

(4) Lastly, a Nature Reviews Neuroscience article on “The neuroscience of prejudice and stereotyping” by Amodio. There’s a bit of overlap between this paper and the previous one, so I recommend focusing on the figures and boxes.
