Contributor: Mushfiqa Jamaluddin

Ask any futurist and they’ll agree that to do our work well, we need to be aware of our biases so we don’t inadvertently impose our values on our clients. In every class in my master’s program, the topic of bias has come up. We’ve regularly discussed the need to reflect on our own biases and to work in groups to ensure a diverse range of perspectives. I have been surprised, then, to find that the foresight projects I’ve worked on lacked specific practices aimed at addressing our biases.

I have only worked on a few projects, so I can’t claim this is the case across the field. Nevertheless, I’ve found myself returning to this gap between what the field agrees is an essential element of good foresight and the reality of how the work is done. I got curious about why I haven’t seen practices to identify and mitigate bias become a standard part of the foresight process.

The Cognitive Bias Codex lists 188 biases in a beautiful visual, organized into themes and including links to descriptions of each. Miguel Jimenez highlights the five biases he deems most important for futurists, Rebecca Ryan shares a table of 16 biases that impact creativity and the innovation process, Megan Crawford explores the effects of bias on the scenario planning process, and Lieve Van Woensel offers the bias wheel, a tool to help policymakers become aware of the most relevant biases, in her book A Bias Radar for Responsible Policy-Making.

Crawford does offer concrete techniques that facilitators can use to reduce the most likely biases at each stage of the scenario planning process, including specific language and questions. The remaining pieces suggest that being aware of our biases and working in diverse groups will do most of the heavy lifting.

With potentially 188 different cognitive biases to contend with, the deceptively simple recommendation to “be aware of your biases” can feel overwhelming. If our conscious minds can only hold three or four things at a time, how can we keep all of our biases top of mind while trying to do, well, anything else? Isn’t that the purpose of bias as a cognitive mechanism in the first place – to help us filter and make sense of the overwhelming amount of information coming at us every day? To address our biases, then, we need to embody attitudes that run counter to the biases we are trying to change, rather than relying on our minds to hold them continuously in conscious awareness.

The inclination to talk about biases in generalities rather than engaging with them one at a time seems to be part of the issue. In this piece, I focus on the subset of cognitive biases known as implicit bias, and specifically on racial biases against Black people in the US. Implicit biases are particularly challenging to change. If we can identify effective interventions for this subset of biases, perhaps the practices learned there could also prove useful for other cognitive biases.

In 2015, Calvin Lai ran a research contest to test which interventions were most effective at reducing implicit racial bias against Black people. Of the 18 interventions studied, the most effective was having participants vividly imagine a scenario in which a Black person rescued them from a distressing situation. Creating positive associations (e.g., pairing an image of a Black person with a positive word) and setting a goal to override a bias when it’s activated rounded out the top three. But even the effects of the top three strategies didn’t last very long.

I suspect that part of the reason we acknowledge the need to be aware of our biases yet fail to take focused action is that we don’t know how to approach it. It doesn’t help that one of the most respected researchers in the field admits we have not yet landed on any proven interventions with durable effects. But that doesn’t mean we can’t do anything. As a practitioner in the business of helping organizations shape the futures we’re all going to live in, I feel it’s my responsibility to keep experimenting with ways to meaningfully and sustainably mitigate our unconscious biases – individually and collectively.

On that front, psychologist and neuroscientist Keith Payne suggests that our implicit biases may be better understood as a cultural phenomenon than as a fixed attitude held by a single individual. Payne writes:

“From the Bias of Crowds perspective, interventions that attempt to change implicit attitudes are likely to have limited success because most of the force of implicit bias comes from situations and environments rather than individual attitudes…. To the extent that implicit bias is grounded in the culture, community, and immediate social contexts people inhabit, then solutions need to focus on structuring the social context rather than changing the beliefs or values of individuals.”

In other words, our minds are like mirrors: we reflect the world around us and gravitate towards the beliefs embedded in our social contexts. This means interventions and practices aimed at addressing implicit bias need to account for the environments we inhabit, not just the individuals within them. We wouldn’t heat a single droplet of water, drop it into a cup of cold water, and expect it to maintain its temperature. Similarly, a multi-level issue cannot be adequately addressed with individual-level interventions.

If we combine Payne’s Bias of Crowds perspective with Lai’s findings about the most effective individual-level interventions, could a regular group practice of vividly imagining scenarios that challenge specific, structurally held implicit biases lead to lasting change? That sounds like an intervention tailor-made for a community of futurists to experiment with.

A futurists’ community of practice around implicit bias is still just the seed of an idea, but if any of this resonates with you, let’s talk!