Understanding and Addressing Implicit Bias
Everyone sometimes makes choices, often unconsciously, that run contrary to their own beliefs. This is implicit bias, and it can happen all too easily unless people take steps to prevent it.
People process information quickly based on past associations, which come to mind involuntarily and play a key role in shaping thoughts and actions, said Keith Payne, professor of psychology at Carolina.
“Associations that come to mind don’t care if we like them or believe them,” he said. “We really don’t have to believe the associations are true to have them come to mind. In fact, if we fully understood the influences on – and causes of – our decisions, we would probably reject them.”
Payne was the keynote speaker for the Aug. 13 THINKposium, a free exchange of ideas on a particular topic. This year’s daylong event, held at the Stone Center, focused on implicit bias and its effect on classroom instruction and hiring practices. It was designed to help faculty and staff understand the concepts and behaviors associated with unconscious bias and develop ways to address the issue within their own areas.
Chancellor Carol L. Folt, who welcomed the more than 100 THINKposium participants, said the work they were about to undertake was vitally important.
“The UNC-Chapel Hill we want is the one you’re thinking about creating. It has the same level of intentionality we bring to our teaching and our research and the way we build community,” she said.
That requires much more than good intentions, she explained. “We’re surrounded by people who want to do the right thing, but it doesn’t just happen. It only happens when we put that intentionality into our work.”
Intentionality is key, Payne said.
Because the mind thinks in terms of established categories, it is virtually impossible to be completely open-minded, he explained. Furthermore, the processes that drive people’s conscious decisions and those that drive their reflexive behaviors do not necessarily match. That dichotomy explains why people tend to respond to surveys based on their values, but their actions can be very different – particularly when it comes to prejudice.
Some of the biggest risk factors for implicit bias are ambiguity, where the criteria are unclear or not conclusively positive or negative; rushing or being distracted while making decisions; and poor feedback about those decisions.
To overcome implicit bias, Payne offered some tips:
- Structure the decision-making approach by articulating relevant criteria ahead of time;
- Review or grade articles and papers without looking at the author’s personal information;
- Frame hiring questions in terms of “all who would be well suited” rather than “the single ideal person”; and
- Don’t be distracted. As a rule of thumb: if you wouldn’t drive, then don’t decide.
Taffye Benson Clayton, associate vice chancellor for Diversity and Multicultural Affairs, said she was glad “this hybrid think tank,” as she described THINKposium, delved into understanding implicit bias, which underlies patterns of behavior that affect issues related to diversity and inclusion.
Mariska Leunissen, Susan Coppola and Ted Shaw gave TED-style talks during the lunchtime session, and David Kiel and Patricia Parker facilitated afternoon small-group sessions.
THINKposium was hosted by Diversity and Multicultural Affairs, the College of Arts and Sciences and the Center for Faculty Excellence, and co-sponsored by the Institute for the Arts and Humanities, Office of Faculty Governance and Employee Forum.