DECISION MAKING & 5 tips for identifying—and avoiding—cognitive bias during a crisis. (And discover more ideas in the #GrowingThePostPandemicChurch book.)

Commentary by Dr. Whitesel. While writing my 14th book, “Growing the Post-pandemic Church,” I sought to help church leaders discover the unconscious biases that impact their decision-making. Here is a helpful article that defines the major cognitive biases and explains how you can prevent them from distorting your decisions.

5 tips for identifying—and avoiding—cognitive bias during a crisis: When facing the unknown, you might not even know everything you think you know… by Julie Wright, Public Relations Daily, Sept. 8, 2020.

It’s important for leaders to recognize their biases and take steps to minimize or eliminate them, individually and across their teams…

2. Acknowledge that cognitive bias exists.

Another important step to minimize your cognitive bias is to acknowledge that it exists.

For instance, normalcy bias has compelled many leaders to minimize the threat of the coronavirus with statements like “it’s business as usual” or “it’s important to get students back to the classroom this fall.” Normalcy bias minimizes threat warnings and downplays disaster and its impacts…

Along the same lines, familiarity bias drives people to categorize new experiences or situations along the lines of the familiar rather than evaluating them more deeply. This is what led some leaders to compare COVID-19 to influenza saying, “it’s no worse or different than the seasonal flu.”

Both of these biases indicate a certain level of denial, which is a common first reaction to terrible news. Avoiding or minimizing biases is critically important during periods of crisis, when we are mentally taxed, juggling multiple issues and just plain tired. This is when biases are most likely to color decisions.

3. Equip yourself with tools.

Tools like a crisis plan, evaluation criteria, scoring matrices and even the tried-and-true checklist can enforce the discipline needed to make objective, reasoned decisions and avoid cognitive traps, particularly in a crisis.
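A scoring matrix like the one the article mentions can be as simple as weighting each decision criterion and ranking the options by their weighted totals. This is a minimal sketch; the criteria, weights, and option names are invented for illustration and are not from the article.

```python
# Hypothetical weighted scoring matrix. The criteria and weights below are
# illustrative assumptions, not prescriptions from the article.
CRITERIA = {"cost": 0.3, "speed": 0.3, "safety": 0.4}  # weights sum to 1.0

def score_option(ratings, weights=CRITERIA):
    """Combine per-criterion ratings (e.g. 1-5) into one weighted score."""
    return sum(weights[c] * ratings[c] for c in weights)

# Made-up options and ratings for demonstration.
options = {
    "remote learning": {"cost": 4, "speed": 2, "safety": 5},
    "hybrid schedule": {"cost": 3, "speed": 3, "safety": 4},
}

# Ranking by score makes the decision criteria explicit and auditable,
# which is exactly the discipline such tools are meant to enforce.
ranked = sorted(options, key=lambda o: score_option(options[o]), reverse=True)
```

Writing the weights down before scoring is the point: it forces the group to debate priorities in the open rather than letting an unstated bias tilt the comparison.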

Airline pilots and surgeons rely on checklists to ensure bias is kept out of their decision making….

4. Surround yourself with multiple viewpoints.

A diversity of insights and information sources helps to reduce bias.

When you are surrounded by people with different life experiences, professional expertise and beliefs or world views, your decision making will be based on more inputs and become more immune to confirmation bias. Confirmation bias is our tendency to cherry pick information or viewpoints that match our own expectations or experiences…

Sometimes a leader must make a judgment call without the benefit of other viewpoints. In those moments, it’s important not to exhibit overconfidence bias. Overconfidence bias leads to a false sense of skill, talent or self-belief. For leaders, it can be a side effect of their power and influence. Overconfidence bias shows up in illusions of control, timing optimism, and the desirability effect (i.e., thinking that if you desire something, you can make it happen).

5. Learn to spot common cognitive biases…

Anchoring bias. Anchoring refers to using previous information as a reference point for all subsequent information, which can skew a decision-making process. Putting the original full price next to the markdown anchors our perception of value at the full price. Against that first piece of information, the sale price looks like a steal. But what if the wholesale cost of the item were shown first? The sale price wouldn’t look so appealing.

Self-serving bias. Self-serving cognitive bias helps soften the blow to the ego when we make a poor decision by attributing it to poor luck. When things turn out well, though, we attribute it to skill or something else that was directly under our control. The downside to this bias in organizations, teams and leaders is that it does not produce a culture of accountability.

Herd mentality. As social creatures, we find it hard to fight herd mentality. When there is consensus or a growing trend or fad, our gut instinct is to move in the same direction as the herd. While this may feel like the path of least resistance or the safer choice, it is a decision behavior based on emotion, not logic.

Loss aversion. This is one of my favorite principles: Avoiding a loss is a greater motivator than gaining a reward. This can lead to missed opportunities driven by risk aversion. You see it on game shows when contestants settle for the cash they’ve earned rather than risking it for a much higher reward. You also see it in organizational cultures where keeping one’s head down and analyzing things to death before a committee eventually decides feels safer than the perceived riskier route of decisiveness and efficiency.

Reactance bias. While you might think that members of the public who defy face-covering recommendations or requirements are exhibiting overconfidence bias, they are more likely showing reactance bias, which leads to a fear that complying with one request will end in the restriction of future choices or freedoms.

Dunning-Kruger effect. This effect describes poor performers who greatly overestimate their abilities. Put another way, it applies to people who lack the competence to evaluate their own abilities. To overcome the Dunning-Kruger effect, your reports need to recognize their own shortcomings. If you can grow their competence, they will be able to make more realistic self-evaluations.

Narrative fallacy. Like the framing bias, the narrative fallacy bias appeals to our love of a good story. When the story is too good to resist, we get drawn in. Or, when faced with a series of unconnected events, we force them into a cause and effect narrative. It’s something we’ve been doing since before the ancient Greeks explained the sunrise and sunset as the god Helios pulling the sun across the sky in his golden chariot. Fight the urge to impose narratives where no real connection exists and look instead at what the data says.

Hindsight bias. Statements like “I knew it all along” indicate hindsight bias. It’s easy to feel and claim this after the fact, but the danger is that hindsight bias distorts our memories. We were unlikely to have been as confident of the prediction before the event as we appear to be after it. This can lead to overconfidence and a belief that a person can predict the outcomes of future events.

Read more at …

PLANNING & How to Get People to Accept a Tough Decision. #HarvardBusinessReview

by David Maxfield, HBR, 4/19/18.

Every leader has to make tough decisions that have consequences for their organizations, their reputation, and their career. The first step to making these decisions is understanding what makes them so hard. Alexander George, who studied presidential decision-making, pointed to two features:

  • Uncertainty: Presidents never have the time or resources to fully understand all of the implications their decisions will have.
  • “Value Complexity”: This is George’s term to explain that even the “best” decisions will harm some people and undermine values leaders would prefer to support.

The decisions that senior leaders, middle managers, frontline employees, and parents have to make often have the same features. Uncertainty and value complexity cause us to dither, delay, and defer, when we need to act.

What steps can leaders take to deal with these factors when making decisions?

Overcoming Uncertainty

Our initial reactions to uncertainty often get us deeper into trouble. Watch out for the following four pitfalls.

  • Avoidance. It often feels like problems sneak up on us when, in reality, we’ve failed to recognize the emerging issue. Instead of dealing with problems when they begin to simmer, we avoid them — and even dismiss them — until they are at a full boil. For example, perhaps your plants have been running at near capacity for a while and there have been occasional hiccups in your supply chain. Instead of addressing these issues, you accept them as normal. Then, “suddenly,” you’re unable to fill orders.
  • Fixation. When a problem presents itself, adrenaline floods our body and we often fixate on the immediate threat. In this fight or flight mode, we’re not able to think strategically. But focusing exclusively on the obvious short-term threat often means you miss the broader context and longer-term ramifications.
  • Over-simplification. The fight-or-flight instinct also causes us to oversimplify the situation. We divide the world into “friends” and “foes” and see our options as “win” or “lose” or “option A” or “option B.” Making a successful decision often requires transcending simplifications and discovering new ways to solve the problem.
  • Isolation. At first, we may think that, if we contain the problem, it’ll be easier to solve. For example, it may feel safer to hide the problem from your boss, peers, and customers while you figure out what to do. But as a result, you may wait too long before sounding the alarm. And, by then, you’re in too deep.

To avoid these pitfalls — or to get out of them once you’ve fallen into them — it’s best to take incremental steps forward without committing to a decision too quickly. Below are five things you can do to reduce uncertainty as you evaluate your options.

Read more at …

COLLABORATIVE LEADERSHIP & You’ll Never Get a Group to Agree on a Decision. Here’s What to Do Instead

by Chris McGoff, Inc. Magazine, 6/20/17.

… Trying to get everybody to agree on something gives way too much power to the 16 percent of the people who are ninjas at disrupting agreement to draw attention to themselves. These ninjas are known as laggards, according to The Innovation Adoption Curve.

When you ask the group to come to consensus on something, you empower the laggards. They use a variety of tools like “we tried that before,” or they inject information into the process at the worst possible time. You know who they are. They suck the life out of possibility for sport.

No matter how many of their questions you answer, they always have more questions. Every time you get close to a decision, laggards bring up a new argument that will make the group hesitate. The way to avoid this pitfall is to rethink the traditional definition of consensus and start using a working definition of consensus.

Next time you have a meeting and need to make a decision, write the following three statements in a prominent place before the meeting begins. Let everyone know that a decision will be made according to the following working definition of consensus:

  1. The process we use will be explicit, rational, and fair.
  2. Each participant will be treated honorably as we go through the process.
  3. We can all “live with and commit to” the outcome.

Let’s dive into what each of those three statements means …

Read more at …

DECISION MAKING & Stanford Prof. Explains Research for Better Decision Making #Video

Commentary by Dr. Whitesel: The past 15 years have witnessed a great deal of research on how to make better decisions. Probably no one understands this more than Stanford Univ. professor Dr. Baba Shiv. Here is a video giving an overview of his research with lessons every leader needs for better decision making.

DECISION MAKING & 20 cognitive biases that mess up your decisions

by Samantha Lee and Shana Lebowitz, Business Insider Magazine, Aug. 26, 2015.

Read more at …

DECISION-MAKING & A Process for Human-Algorithm Decision Making #HarvardBusinessReview

Commentary by Dr. Whitesel: “It may be a while before algorithm decision-making comes to typically cash strapped churches, but it’s important that church leaders understand how computers can help us make better strategic and tactical decisions in the future. Read this article from the Harvard Business Review for an important introduction to human-algorithm decision-making.”

A Process for Human-Algorithm Decision Making

September 18, 2014

Think for a moment about how an organization makes a decision. First come the facts, the data that will inform the decision. Using these facts, someone formulates alternative courses of action and evaluates them according to agreed-on criteria. The decision maker then chooses the best alternative, and the organization commits itself to action.

Advanced analytics can automate parts of this sequence; it offers the prospect of faster, better-informed decisions and substantially lower costs. But unless you’re prepared to transform how people work together throughout the decision-making process, you’re likely to be disappointed.

Take a simple example: a company’s collections function. In years past, dozens of collection agents would receive hundreds of randomly allocated delinquent accounts every day, each one with a few facts about the customer. Each agent then reviewed a standard list of alternatives and decided how he or she would try to collect what was owed.

Today, an algorithm can assemble many more facts about the accounts than any human being could easily process: lengthy payment histories, extensive demographic data, and so on. Using these facts, it can separate the accounts into simple categories, say red-yellow-green.

Now the alternative courses of action are simpler. Red ones — low value, unlikely to pay — go straight to a collection agency. Green ones — high value, likely to pay — go to specially trained callers for white-glove service. The yellow ones require a careful review of alternatives and much more human intervention before a decision is reached.
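The red-yellow-green triage described above boils down to a pair of thresholds on account value and payment likelihood. Here is a minimal sketch of that logic; the field names and cutoff values are assumptions for illustration, not figures from the article.

```python
def triage(balance, pay_probability,
           low_value=500.0, high_value=5000.0,
           low_prob=0.2, high_prob=0.8):
    """Classify a delinquent account into a handling category.

    Thresholds are illustrative placeholders; a real model would learn
    them from payment histories and demographic data.
    """
    if balance < low_value and pay_probability < low_prob:
        return "red"     # low value, unlikely to pay -> collection agency
    if balance >= high_value and pay_probability >= high_prob:
        return "green"   # high value, likely to pay -> white-glove callers
    return "yellow"      # ambiguous cases need human review
```

Note that the algorithm only narrows the choices: the yellow bucket is deliberately a catch-all, preserving the human judgment the article says those accounts still require.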

Within the yellow and green categories, sophisticated test-and-learn experiments can inform the decisions that remain. Agents can discover from these experiments which channels and messages generate the greatest financial return while minimizing costs and customer dissatisfaction. They can thus optimize their choices about how to pursue delinquent accounts.
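A test-and-learn experiment of the kind described above can be summarized as comparing net return per contact across outreach channels. This sketch uses made-up trial numbers purely to show the shape of the comparison; none of the figures come from the article.

```python
# Toy test-and-learn summary. Channels and outcomes are invented data.
trials = {
    # channel: (contacts attempted, dollars recovered, cost per contact)
    "phone":  (100, 2400.0, 6.0),
    "email":  (100,  900.0, 0.5),
    "letter": (100, 1200.0, 2.0),
}

def net_return_per_contact(channel):
    """Average dollars recovered per contact, minus the contact cost."""
    n, recovered, unit_cost = trials[channel]
    return recovered / n - unit_cost

# Pick the channel with the best net return observed in the experiment.
best = max(trials, key=net_return_per_contact)
```

In practice agents would also weigh customer dissatisfaction, as the article notes, but even this bare comparison replaces gut feel with measured outcomes.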

The new way of doing things is better and more efficient. But look at how it changes the process itself — and what’s expected of the people involved…

Read more at …