How to identify, avoid cognitive bias during a crisis
By Julie Wright
We’re in the midst of a public health emergency, a slow-moving economic disaster and a period of major social upheaval. These are ideal conditions for cognitive bias to take root. When we are stressed, our attention is scattered and emotions run high; the dangers of cognitive bias are elevated, and strategic thinking and decision making are often impaired.
It’s important for leaders to recognize their biases and take steps to minimize or eliminate them, individually and across their teams. In internal and external communications, it’s equally important to recognize that cognitive bias may be interfering with your ability to communicate with your target audiences.
Here are five ways to mitigate and avoid cognitive bias in times of crisis:
1. Research and test your messages.
At the heart of every good communications strategy and crisis plan is a messaging platform. Key messages are used to help drive the beliefs, motivations and behaviors of an organization’s target audiences and stakeholders.
Messaging can target employees, customers, regulators and communities. In all of these cases, messaging needs to be based on research that gives emotional insights into the target audience. This avoids your own bias clouding your communications and helps you identify your target audience’s cognitive bias, to which you might be able to adapt your messaging.
Framing bias can be used to set context in your messaging. For example, leading with a large number upfront, whether a product’s potential price or a worst-case infection rate, makes the actual price or infection rate shared afterward seem smaller. The reverse is, of course, true as well.
Framing is all about how information is presented to an audience and less about the actual facts. Statistical data is often framed. For instance, rather than saying 80% of dentists choose your product, say four out of five recommend it. Instead of saying your organization reduced greenhouse gas (GHG) emissions by 50 metric tons, put it in terms people understand, like saying you cut emissions equal to taking 11 passenger vehicles off the road for a year.
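If you want to sanity-check reframings like these before they go into a messaging platform, the arithmetic is simple. Below is a minimal Python sketch; the per-vehicle CO2 figure is an assumed average in the spirit of the EPA’s published equivalencies, not a number taken from this article.

```python
from fractions import Fraction

# "80% of dentists" restated as a simple ratio people can picture
share = Fraction(80, 100)  # reduces to 4/5
print(f"{share.numerator} out of {share.denominator} dentists recommend it")

# "50 metric tons of GHGs" restated as passenger-vehicle equivalents
TONS_CO2_PER_VEHICLE_PER_YEAR = 4.6  # assumed average for a typical passenger vehicle
reduction_tons = 50
vehicles = round(reduction_tons / TONS_CO2_PER_VEHICLE_PER_YEAR)
print(f"Roughly {vehicles} passenger vehicles off the road for a year")  # ~11
```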
An additional step is to evaluate the success of your messaging platform through a focus group or other exercise to validate that your messages are being received as intended. In deploying the framing technique, however, be careful that it does not overly distort the intended message and risk backfiring: a perception of “spin” can erode trust and damage the relationship with your audience.
2. Acknowledge that cognitive bias exists.
Another important step to minimize your cognitive bias is to acknowledge that it exists.
For instance, normalcy bias has compelled many leaders to minimize the threat of the coronavirus with statements like “it’s business as usual” or “it’s important to get students back to the classroom this fall.” Normalcy bias minimizes threat warnings and downplays disaster and its impacts. In a public health emergency, exhibiting normalcy bias in messaging erodes a leader’s or a brand’s credibility and potentially endangers employees, customers or students by undermining safety precautions or prudent planning.
Along the same lines, familiarity bias drives people to categorize new experiences or situations along the lines of the familiar rather than evaluating them more deeply. This is what led some leaders to compare COVID-19 to influenza, saying, “it’s no worse or different than the seasonal flu.”
Both of these biases indicate a certain level of denial, which is a common first reaction to terrible news. Avoiding or minimizing biases is critically important during periods of crisis when we are mentally taxed, juggling multiple issues and just plain tired. This is when biases are most likely to color decisions.
3. Equip yourself with tools.
Tools like a crisis plan, evaluation criteria, scoring matrices and even the tried-and-true checklist can enforce the discipline needed to make objective, reasoned decisions and avoid cognitive traps, particularly in a crisis.
Airline pilots and surgeons rely on checklists to keep bias out of their decision making. In the case of pilots, the heuristic “aviate, navigate, communicate” is taught early. When an issue occurs in the sky, the pilot knows to focus first on flying the plane, next on navigating to safety, and third on communicating with the tower or other pilots. (See “A.N.C.—It Matters Now More Than Ever.”)
Having a Disaster Recovery Plan and a Business Continuity Plan is another essential, particularly now. Normalcy bias often delays the development or timely updating of such plans.
4. Surround yourself with multiple viewpoints.
A diversity of insights and information sources helps to reduce bias.
When you are surrounded by people with different life experiences, professional expertise and beliefs or world views, your decision making will be based on more inputs and become more resistant to confirmation bias. Confirmation bias is our tendency to cherry-pick information or viewpoints that match our own expectations or experiences.
Boardroom diversity is an indicator of higher corporate performance.
A report by McKinsey studied board composition, returns on equity (ROE), and margins on earnings before interest and taxes (EBIT) of 180 publicly traded companies in four countries over two years. McKinsey found “startlingly consistent” results: “For companies ranking in the top quartile of executive-board diversity, ROEs were 53 percent higher, on average, than they were for those in the bottom quartile. At the same time, EBIT margins at the most diverse companies were 14 percent higher, on average, than those of the least diverse companies.”
Sometimes a leader must make a judgment call without the benefit of other viewpoints. In those moments, it’s important not to exhibit overconfidence bias, which leads to a false sense of skill, talent or self-belief. For leaders, it can be a side effect of their power and influence. Overconfidence bias shows up in illusions of control, timing optimism and the desirability effect (i.e., thinking that if you desire something, you can make it happen).
5. Learn to spot common cognitive biases.
So far, we’ve discussed normalcy bias, familiarity bias, confirmation bias and overconfidence bias. Some other frequent cognitive biases include:
Anchoring bias. Anchoring refers to using the first piece of information received as a reference point for all subsequent information, which can skew a decision-making process. Putting the original full price next to the markdown anchors our perception of value to the full price. Against that first piece of information, the sale price looks like a steal. But what if the item’s wholesale cost were shown first? The sale price wouldn’t look so appealing.
Self-serving bias. Self-serving bias softens the blow to the ego when we make a poor decision by letting us attribute the outcome to bad luck. When things turn out well, though, we attribute the result to skill or something else directly under our control. The downside of this bias for organizations, teams and leaders is that it does not produce a culture of accountability.
Herd mentality. As social creatures, we find it hard to fight herd mentality. When there is consensus or a growing trend or fad, our instinct is to move in the same direction as the herd. While this may feel like the path of least resistance, or the safer one, it is decision-making behavior based on emotion, not logic.
Loss aversion. This is one of my favorite principles: avoiding a loss is a greater motivator than gaining a reward. It can lead to missed opportunities driven by risk aversion. You see it on game shows when contestants settle for the cash they’ve earned rather than risking it for a much higher reward (a quick numeric sketch follows after this list). You also see it in organizational cultures where keeping one’s head down and analyzing things to death before an eventual committee decision feels safer than the perceived riskier route of decisiveness and efficiency.
Reactance bias. While you might think that members of the public who defy face-covering recommendations or requirements are exhibiting overconfidence bias, they are more likely showing reactance bias, which leads to a fear that complying with one request will end in the restriction of future choices or freedoms.
Dunning-Kruger effect. This effect describes poor performers who greatly overestimate their abilities. Put another way, it applies to people who lack the competence to evaluate their own abilities. To overcome the Dunning-Kruger effect, your reports need to recognize their own shortcomings. If you can grow their competence, they will be able to make more realistic self-evaluations.
Narrative fallacy. Like framing bias, the narrative fallacy appeals to our love of a good story. When the story is too good to resist, we get drawn in. Or, when faced with a series of unconnected events, we force them into a cause-and-effect narrative. It’s something we’ve been doing since before the ancient Greeks explained the sunrise and sunset as the god Helios pulling the sun across the sky in his golden chariot. Fight the urge to impose narratives where no real connection exists and look instead at what the data says.
Hindsight bias. Statements like “I knew it all along” indicate hindsight bias. It’s easy to feel and claim this after the fact, but the danger is that hindsight bias distorts our memories. We were unlikely to have been as confident of the prediction before the event as we appear to be after it. This can lead to overconfidence and a belief that a person can predict the outcomes of future events.
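To make the loss-aversion example above concrete, here is a minimal Python sketch with hypothetical numbers. It compares a sure payout against a gamble with a higher expected value, then applies a simple loss-aversion weight in the spirit of prospect theory, where losses are commonly estimated to loom a little more than twice as large as equivalent gains; both the payouts and the 2.25 multiplier are assumptions for illustration.

```python
# Hypothetical game-show choice: keep $50,000, or take a 50/50 gamble
# between walking away with $0 and winning $150,000.
sure_thing = 50_000
gamble = [(0.5, 0), (0.5, 150_000)]  # (probability, payout)

expected_value = sum(p * x for p, x in gamble)
print(f"Expected value of the gamble: ${expected_value:,.0f}")  # $75,000 > $50,000

# Crude loss-aversion weighting: relative to the cash already "banked",
# losses count about 2.25x as much as gains (a commonly cited estimate).
LOSS_WEIGHT = 2.25

def felt_value(outcome, reference=sure_thing, loss_weight=LOSS_WEIGHT):
    change = outcome - reference
    return change if change >= 0 else loss_weight * change

felt_gamble = sum(p * felt_value(x) for p, x in gamble)
print(f"Felt value of gambling instead of keeping the cash: {felt_gamble:,.0f}")
# Negative: the gamble feels worse than the sure thing, despite the higher expected value.
```

The exact numbers don’t matter; the point is that once losses are weighted more heavily than gains, the “safer” choice can win out even when the math favors taking the risk.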
Be mindful. To avoid cognitive bias at decision junctures and particularly in times of crisis, be sure you’re continuing to research and test your communications. Acknowledging that cognitive bias affects us all, using your available tools, engaging diverse viewpoints and information sources, and familiarizing yourself with the different ways our minds try to shortcut our decisions can help ensure a sound strategy and outcome.