DECISION MAKING & 5 tips for identifying—and avoiding—cognitive bias during a crisis. (And discover more ideas in the #GrowingThePostPandemicChurch book.)

Commentary by Dr. Whitesel. While writing my 14th book, “Growing the Post-pandemic Church,” I sought to help church leaders discover the unconscious biases that impact their decision-making. Here is a helpful article that defines the major cognitive biases and explains how you can prevent them from distorting your decisions.

5 tips for identifying—and avoiding—cognitive bias during a crisis: When facing the unknown, you might not even know everything you think you know… by Julie Wright, PR Daily, Sept. 8, 2020.

It’s important for leaders to recognize their biases and take steps to minimize or eliminate them, individually and across their teams…

2. Acknowledge that cognitive bias exists.

Another important step to minimize your cognitive bias is to acknowledge that it exists.

For instance, normalcy bias has compelled many leaders to minimize the threat of the coronavirus with statements like “it’s business as usual” or “it’s important to get students back to the classroom this fall.” Normalcy bias minimizes threat warnings and downplays disaster and its impacts…

Along the same lines, familiarity bias drives people to categorize new experiences or situations along the lines of the familiar rather than evaluating them more deeply. This is what led some leaders to compare COVID-19 to influenza saying, “it’s no worse or different than the seasonal flu.”

Both of these biases indicate a certain level of denial, which is a common first reaction to terrible news.  Avoiding or minimizing biases is critically important during periods of crisis when we are mentally taxed, juggling multiple issues and just plain tired. This is when biases are most likely to color decisions.

3. Equip yourself with tools.

Tools like a crisis plan, evaluation criteria, scoring matrices and even the tried and true checklist can enforce the discipline needed to ensure objective and reasoned decisions and avoid cognitive traps, particularly in a crisis.

Airline pilots and surgeons rely on checklists to ensure bias is kept out of their decision making….

4. Surround yourself with multiple viewpoints.

A diversity of insights and information sources helps to reduce bias.

When you are surrounded by people with different life experiences, professional expertise and beliefs or world views, your decision making will be based on more inputs and become more immune to confirmation bias. Confirmation bias is our tendency to cherry pick information or viewpoints that match our own expectations or experiences…

Sometimes a leader must make a judgment call without the benefit of other viewpoints. In those moments, it’s important not to exhibit overconfidence bias. Overconfidence bias leads to a false sense of skill, talent or self-belief. For leaders, it can be a side effect of their power and influence. Overconfidence bias shows up in illusions of control, timing optimism, and the desirability effect (i.e., thinking that if you desire something, you can make it happen).

5. Learn to spot common cognitive biases…

Anchoring bias. Anchoring refers to using previous information as a reference point for all subsequent information, which can skew a decision-making process. Putting the original full price next to the markdown anchors our perception of value to the full price. Against that first piece of information, the sale price looks like a steal. But what if the wholesale cost of the item were shown first? The sale price wouldn’t look so appealing.

Self-serving bias. Self-serving cognitive bias helps soften the blow to the ego when we make a poor decision by attributing it to poor luck. When things turn out well, though, we attribute it to skill or something else that was directly under our control. The downside to this bias in organizations, teams and leaders is that it does not produce a culture of accountability.

Herd mentality. As social creatures, it is hard to fight herd mentality. When there is consensus or a growing trend or fad, our gut is to move in the same direction as the herd. While this may feel like the path of least resistance or safer, it is a decision behavior based on emotion and not logic.

Loss aversion. This is one of my favorite principles: Avoiding a loss is a greater motivator than gaining a reward. This can lead to missed opportunities driven by risk aversion. You see it on game shows when contestants settle for the cash they’ve earned rather than risking it for a much higher reward. Or in organizational cultures where the mentality of “keeping one’s head down” and analyzing things to death before an eventual decision by a committee is treated as safer than the perceived riskier route of decisiveness and efficiency.

Reactance bias. While you might think that members of the public who defy face-covering recommendations or requirements are exhibiting overconfidence bias, they are more likely showing reactance bias, which leads to a fear that complying with one request will end in the restriction of future choices or freedoms.

Dunning-Kruger effect. This effect describes poor performers who greatly overestimate their abilities. Put another way, it applies to people who lack the competence to evaluate their own abilities. To overcome the Dunning-Kruger effect, your reports need to recognize their own shortcomings. If you can grow their competence, they will be able to make more realistic self-evaluations.

Narrative fallacy. Like the framing bias, the narrative fallacy bias appeals to our love of a good story. When the story is too good to resist, we get drawn in. Or, when faced with a series of unconnected events, we force them into a cause and effect narrative. It’s something we’ve been doing since before the ancient Greeks explained the sunrise and sunset as the god Helios pulling the sun across the sky in his golden chariot. Fight the urge to impose narratives where no real connection exists and look instead at what the data says.

Hindsight bias. Statements like “I knew it all along” indicate hindsight bias. It’s easy to feel and claim this after the fact, but the danger is that hindsight bias distorts our memories. We were unlikely to have been as confident of the prediction before the event as we appear to be after it. This can lead to overconfidence and a belief that a person can predict the outcomes of future events.

Read more at … https://www.prdaily.com/5-tips-for-identifying-and-avoiding-cognitive-bias-during-a-crisis/

BIAS & The Broken Ladder: How Inequality Affects the Way We Think, Live, and Die.

Commentary by Dr. Whitesel: UNC professor Dr. Keith Payne has written an important book for understanding how racial bias is splitting Americans. To understand how the Church can bridge this divide and bring people together, church leaders must first take a look at this well-researched book. Here are some excerpts from an article the author wrote about this important topic.

The Truth about Anti-White Discrimination

by Keith Payne, Scientific American Magazine, 6/17/19. Dr. Payne is a professor of psychology and neuroscience at UNC Chapel Hill. He is the author of The Broken Ladder: How Inequality Affects the Way We Think, Live, and Die.

…News stories are full of statistical evidence for disparities between blacks and whites, such as the fact that the average black family earns about half as much as the average white family, or that the unemployment rate for blacks is twice that for whites, or that the wealth of the average white family is ten times the wealth of the average black family. But this kind of evidence is like a political Rorschach test that looks very different to liberals and conservatives. What looks to liberals like evidence of discrimination looks to conservatives like evidence of racial disparities in hard work and responsible behavior.

The only kind of evidence that can hope to bridge this divide comes from experiments which directly measure discrimination — and these experiments have been done.

…Consider an experiment by sociologist Devah Pager, who sent pairs of experimenters—one black and one white—to apply for 340 job ads in New York City. She gave them resumes doctored to have identical qualifications. She gave them scripts so that the applicants said the same things when handing in their applications. She even dressed them alike. She found that black applicants got half the call backs that white applicants got with the same qualifications.

This study inspired experiments in lots of areas of life. One study, for example, responded to more than 14,000 online apartment rental ads but varied whether the name attached to the email implied a white applicant (e.g., Allison Bauer) or a black applicant (e.g., Ebony Washington). The black applicants were twenty-six percent less likely to be told that the apartment was available.

These kinds of experiments are not ambiguous like statistics on disparities are. There were no differences in merit. Race was the cause. Real employers and landlords discriminated against blacks and in favor of whites, by a large margin.

The key is to keep repeating the facts and their basis in reason and science, until they become part of the background that any conversation takes for granted. It is frustratingly slow work. But to even get started, we need to move the conversation about discrimination beyond evidence of disparities, and focus on the experiments and the stubborn facts they deliver.

Read more at … https://www.scientificamerican.com/article/the-truth-about-anti-white-discrimination/

ETHICS & The Role of Agape in the Ethics of Martin Luther King, Jr. and the Pursuit of Justice #FullerSemDissertation

by Jerry Ogoegbunem Nwonye, dissertation submitted to the Faculty of Fuller Theological Seminary, Pasadena, CA, 1/2009.

Available via ProQuest and at https://books.google.com/books?id=_0b6NTQGcKUC&dq=Where+Do+We+Go+from+Here:+agape&source=gbs_navlinks_s

Thesis excerpt, p. 2:

Hashtags: #WesleySeminary #DMinTL

BIAS & Why It is Hard to Grasp, When You Haven’t Historically Experienced It

Saturday Night Live, SNL, 11/12/16.

DIVERSITY & 3 Steps to Start Designing a Bias-Free Organization

Commentary by Dr. Whitesel: Subtle practices in language, hiring, promotion and programming can unintentionally embed unexpected biases in an organization. Read this seminal interview with Iris Bohnet, director of the Women and Public Policy Program at the Harvard Kennedy School, cochair of the Behavioral Insights Group and author of the book What Works, about what researchers have discovered about fostering a more bias-free organization.

Designing a Bias-Free Organization

by Gardiner Morse, Harvard Business Review, July-August 2016.

Iris Bohnet thinks firms are wasting their money on diversity training. The problem is, most programs just don’t work. Rather than run more workshops or try to eradicate the biases that cause discrimination, she says, companies need to redesign their processes to prevent biased choices in the first place.

Bohnet directs the Women and Public Policy Program at the Harvard Kennedy School and cochairs its Behavioral Insights Group. Her new book, What Works, describes how simple changes—from eliminating the practice of sharing self-evaluations to rewarding office volunteerism—can reduce the biased behaviors that undermine organizational performance. In this edited interview with HBR senior editor Gardiner Morse, Bohnet describes how behavioral design can neutralize our biases and unleash untapped talent…

HBR: Organizations put a huge amount of effort into improving diversity and equality but are still falling short. Are they doing the wrong things, not trying hard enough, or both?

Bohnet: There is some of each going on. Frankly, right now I am most concerned with companies that want to do the right thing but don’t know how to get there, or worse, throw money at the problem without its making much of a difference. Many U.S. corporations, for example, conduct diversity training programs without ever measuring whether they work. My colleague Frank Dobbin at Harvard and many others have done excellent research on the effectiveness of these programs, and unfortunately it looks like they largely don’t change attitudes, let alone behavior. (See “Why Diversity Programs Fail” by Frank Dobbin.)

I encourage anyone who thinks they have a program that works to actually evaluate and document its impact. This would be a huge service. I’m a bit on a mission to convince corporations, NGOs, and government agencies to bring the same rigor they apply to their financial decision making and marketing strategies to their people management. Marketers have been running A/B tests for a long time, measuring what works and what doesn’t. HR departments should be doing the same.

What does behavioral science tell us about what to do, aside from measuring success?

Start by accepting that our minds are stubborn beasts. It’s very hard to eliminate our biases, but we can design organizations to make it easier for our biased minds to get things right. HBR readers may know the story about how orchestras began using blind auditions in the 1970s. It’s a great example of behavioral design that makes it easier to do the unbiased thing. The issue was that fewer than 10% of players in major U.S. orchestras were women. Why was that? Not because women are worse musicians than men but because they were perceived that way by auditioners. So orchestras started having musicians audition behind a curtain, making gender invisible. My Harvard colleague Claudia Goldin and Cecilia Rouse of Princeton showed that this simple change played an important role in increasing the fraction of women in orchestras to almost 40% today. Note that this didn’t result from changing mindsets. In fact, some of the most famous orchestra directors at the time were convinced that they didn’t need curtains because they, of all people, certainly focused on the quality of the music and not whether somebody looked the part. The evidence told a different story…

What are examples of good behavioral design in organizations?

Well, let’s look at recruitment and talent management, where biases are rampant. You can’t easily put job candidates behind a curtain, but you can do a version of that with software. I am a big fan of tools such as Applied, GapJumpers, and Unitive that allow employers to blind themselves to applicants’ demographic characteristics. The software allows hiring managers to strip age, gender, educational and socioeconomic background, and other information out of résumés so they can focus on talent only.

There’s also a robust literature on how to take bias out of the interview process, which boils down to this: Stop going with your gut. Those unstructured interviews where managers think they’re getting a feel for a candidate’s fit or potential are basically a waste of time. Use structured interviews where every candidate gets the same questions in the same order, and score their answers in real time.

You should also be thinking about how your recruitment approach can skew who even applies. For instance, you should scrutinize your job ads for language that unconsciously discourages either men or women from applying. A school interested in attracting the best teachers, for instance, should avoid characterizing the ideal candidate as “nurturing” or “supportive” in the ad copy, because research shows that can discourage men from applying. Likewise, a firm that wants to attract men and women equally should avoid describing the preferred candidate as “competitive” or “assertive,” as research finds that those characterizations can discourage female applicants. The point is that if you want to attract the best candidates and access 100% of the talent pool, start by being conscious about the recruitment language you use.

What about once you’ve hired someone? How do you design around managers’ biases then?

The same principle applies: Do whatever you can to take instinct out of consideration and rely on hard data. That means, for instance, basing promotions on someone’s objectively measured performance rather than the boss’s feeling about them. That seems obvious, but it’s still surprisingly rare…

How can firms get started?

Begin by collecting data. When I was academic dean at the Harvard Kennedy School, one day I came to the office to find a group of students camped out in front of my door. They were concerned about the lack of women on the faculty. Or so I thought. Much to my surprise, I realized that it was not primarily the number of female faculty that concerned them but the lack of role models for female students. They wanted to see more female leaders—in the classroom, on panels, behind the podium, teaching, researching, and advising. It turns out we had never paid attention to—or measured—the gender breakdown of the people visiting the Kennedy School.

So we did. And our findings resembled those of most organizations that collect such data for the first time: The numbers weren’t pretty.

Here’s the good news. Once you collect and study the data, you can make changes and measure progress…

Read more at … https://hbr.org/2016/07/designing-a-bias-free-organization

PRIVILEGE & Understanding Unconscious Bias #HarvardUniversity

by Bob Whitesel, D.Min., Ph.D., 4/26/16.

Harvard University offers a helpful online test to help you see the unconscious biases that affect your opinions, language, friendships, church preference, etc. Understand that we all, everyone, have unconscious biases. These biases are not all bad, but it is important to understand that our upbringing and our choices have led us to embrace biases that we don’t even know we have.

Check out this resource at http://www.implicit.harvard.edu and take the short test. You don’t have to share the results with anyone unless you want to. The purpose is just to help you better understand yourself and how you can relate more authentically and openly with others.

From the website: Participation is voluntary. It is your choice whether or not to participate in this research. If you choose to participate, you may change your mind and leave the study at any time. Refusal to participate or stopping your participation will involve no penalty or loss of benefits to which you are otherwise entitled.

What is the purpose of this research? The purpose of this research is to examine how people evaluate others.

How long will I take part in this research? Your participation will take up to 10 minutes to complete.

What can I expect if I take part in this research? As a participant, you will complete a decision-making task, answer some questions and complete an Implicit Association Test in which you will sort words or images into categories as quickly as possible.

What are the risks and possible discomforts? If you choose to participate, the effects should be comparable to those you would experience from viewing a computer monitor for 10 minutes and using a mouse or keyboard.

Are there any benefits from being in this research study? There are no foreseeable benefits for study participants. Scientific knowledge will benefit from a greater understanding of how people perceive others.

RACIAL BIAS & A 7-minute Video That Will Startle You: A Girl Like Me #HBO

Commentary by Dr. Whitesel: “This documentary will open your eyes to what it feels like to grow up as a person of color in America. Confirming the research of Kenneth Clark in the 1940s, this 7-minute video visualizes how people of color feel when growing up in a Caucasian culture. Those of the dominant culture usually never realize the messages that are sent to people of color, and so this 7-minute video is a must-view resource for Christian leaders.”

A Girl like Me, a 2005 documentary by Kiri Davis and Reel Works Teen Filmmaking (ABC News, 10/11/06 and the YouTube channel, youtube.com/user/mediathatmatters)

https://www.youtube.com/watch?v=YWyI77Yh1Gg

CULTURAL BIAS & The Power of Asking a Different Question in Ferguson

by Larry Wilson, 11/26/15.

Seeing Things in Black and White

… As I watched St. Louis County (Ferguson, MO) Prosecuting Attorney Robert McCulloch explain the grand jury action spliced with scenes of the gathering crowd outside the police station, I was struck by the obvious difference with which white people and people of color view events like the Ferguson shooting.

McCulloch’s calm explanation of events centered on the facts of August 9. If we can understand these facts, he seemed to be saying, we can arrive at justice. Because that’s what whites are looking for in Ferguson: justice for what happened on that one day. Was this one officer guilty of a crime in the shooting death of this one black man? That’s the only question the grand jury was asked to consider, and the only question that matters to most whites, I think.

Reactions from people of color outside the room revealed that they’re asking a different question: How long must we live this nightmare? How many times will we watch this same scene unfold before someone recognizes the pattern and makes it stop?

And there’s the problem. People who ask different questions will seldom arrive at the same answer. Until we agree on the nature of the problem, there is no hope of finding a solution…

Read more at … http://www.lawrencewilson.com/the-power-of-asking-a-different-question-in-ferguson/

BIAS & Research Confirms Talking About Your Biases Can Help Reduce Them

Commentary by Dr. Whitesel: “This soon-to-be-published research in Administrative Science Quarterly found that if people are reminded that everyone stereotypes others to some degree, then they will be more open to sharing their biases and, as a result, more creative. In other words, let people know that everyone has biases and that we should not be afraid to discuss those biases. Doing so, rather than hiding our biases, fosters more creativity and problem solving.”

Study Says Creativity Can Flow From Political Correctness

“I think most people want to be unbiased, and there are ways we can try to make that happen.” – Michelle Duguid, a professor at Washington University in St. Louis.

by NPR staff, January 24, 2015, 6:14 PM ET.

Michelle Duguid, a professor at Washington University in St. Louis, and her co-authors set up an experiment to see if the notion that political correctness impedes creativity held up to scientific scrutiny.

They sat down students in groups of three to brainstorm ideas on how to use a vacant space on campus. Some of the groups were all men, some all women, others mixed. Control groups got to start right away on the brainstorming, but the test groups were primed with a script.

The research team told those groups that they were interested in gathering examples from college undergraduates of politically correct behavior on campus. They were instructed to, as a group, list examples of political correctness that they had either heard of or directly experienced on this campus.

Duguid and her colleagues started another experiment, one that looked at stereotypes. They tested whether educating people about stereotypes would in turn reduce stereotypes. What they found was that by publicizing the fact that the vast majority of people stereotype, it actually creates a norm for stereotyping.

“People feel more comfortable expressing stereotypes or acting in ways that would be seen as inappropriate because it has set up this norm where everyone does it, so I might not be punished,” she says.

Duguid and her co-author tinkered with their message. Rather than telling the group that everyone was guilty of stereotyping, they simply told them that the vast majority of people put effort into not stereotyping.

“[It] actually had great effects,” she says. “It was the same as telling people that few people stereotyped. So that actually reduced stereotyping, and it was better, significantly better, than telling them nothing at all.”

For Duguid’s study, this was good news.

“I think most people want to be unbiased, and there are ways we can try to make that happen,” she says.

Read more at … http://www.npr.org/2015/01/24/379628464/study-says-creativity-can-flow-from-political-correctness


BIAS & Everyone is biased: Harvard professor’s work reveals we barely know our own minds

“Everyone carries with them implicit biases that may change how people perceive or interact with others.”

by Carolyn Y. Johnson, Boston Globe, 2/5/13.

Mahzarin R. Banaji was starting out as an assistant professor of psychology at Yale University in the late 1980s, at a time when women professors were scarce enough that administrators eager to offer a class on the psychology of gender turned to her. Banaji had no expertise in the area; her research focused on memory. But she said she would do it, and she quickly found herself inhabiting the overlapping worlds of gender studies and psychology.

Banaji was fascinated by a memory study by psychologist Larry Jacoby. He had asked people to read a list of names from the phone book, such as “Sebastian Weisdorf”, and rate how easy they were to pronounce. A day later, those same people were handed a list of names that included famous people, others from the phone book, and some names from the list they had read the day before. Asked which were famous people, the study participants incorrectly classified Sebastian Weisdorf and others, whose names they had learned just the day before, as famous.

What, Banaji wondered, would happen if the name was Susannah Weisdorf? Would this same benefit, of becoming famous overnight, accrue to women? She did the test and found that female names were far less likely to achieve fame in the same way. When she grilled participants later, to try and figure out what could lie behind the discrepancy, she was struck by one thing: it occurred to no one that gender might be a factor.

That study was a seed, which grew into an idea in psychology that has become transformative: everyone carries with them implicit biases that may change how people perceive or interact with others. Doctors, judges, police officers, teachers—even Banaji herself—are all subject to these biases, which can lead people to inadvertently act in ways that may be discriminatory or are influenced by stereotypes that people would consciously reject…

Read more at … http://www.boston.com/news/science/blogs/science-in-mind/2013/02/05/everyone-biased-harvard-professor-work-reveals-barely-know-our-own-minds/7x5K4gvrvaT5d3vpDaXC1K/blog.html

BIAS & Is Bias Fixable?

“Recognizing bias is simply recognizing that you are not impartial — you prescreen by seeing what you expect to see.”

by Nilofer Merchant, 8/28/13, Harvard Business Review

“As a brown woman, your chances of being seen and heard in the world are next to nothing,” he said. “For your ideas to be seen, they need to be edgier.” He paused, as if to ruminate on this, before continuing. “But if you are edgy, you will be too scary to be heard.” This was the advice I got from a marketing guru when I asked for his help with titling my second book.

I was confused, as I couldn’t figure out how this answer had any relationship to my original question. I walked — somewhat dazed — to my next meeting and repeated what I’d just heard. In return, I received only blank stares. It wasn’t that these people affirmed his point of view; it’s that they stayed silent. My confusion gradually turned to fear. Was someone finally doing me a service by telling me … The Truth?

For months after hearing this “… you’ll never be seen” message, I was a mess, reading his “truth” into every missed opportunity or unexpected obstacle.

Black/white. Masculine/feminine. Rich/poor. Immigrant/native. Gay/straight. Southern/northern. Young/old. Each of us can be described in a series of overlapping identities and roles. And we could spend time talking about the biological and sociological programming that causes humans to form personal identity around group structures. But the bottom line is this: we — as a society — don’t see each other. You are not seen for who you really are, though each of us is a distinct constellation of interests, passions, histories, visions and hopes. And you do not see others.

As David Burkus recently wrote, innovation isn’t an idea problem, but rather a recognition problem: a lack of noticing the good ideas already there. To see and be seen is essential to finding solutions for all of us. Now “noticing” doesn’t seem like an especially hard thing to do, but — let’s be real — it is. That’s because of bias. Bias is shaped by broader culture — something is perceived as “true” — and thus it prevents you from seeing neutrally. Recognizing bias is simply recognizing that you are not impartial — you prescreen by seeing what you expect to see…

Read more at … https://hbr.org/2013/08/is-bias-fixable/

PRIVILEGE & What White Privilege Means by Professor Naomi Zack #UnivOfOregon #NewYorkTimes #ReMIXbook

An interview by George Yancy, New York Times, 11/5/14

“Middle-class and poor blacks in the United States do less well than whites with the same income on many measures of human well-being: educational attainment, family wealth, employment, health, longevity, infant mortality. You would think that in a democracy, people in such circumstances would vote for political representatives on all levels of government who would be their advocates. But the United States, along with other rich Western consumer societies, has lost its active electorate (for a number of reasons that I won’t go into here). So when something goes wrong, when a blatant race-related injustice occurs, people get involved in whatever political action is accessible to them…

People are now stopped by the police for suspicion of misdemeanor offenses and those encounters quickly escalate. The death of Michael Brown, like the death of Trayvon Martin before him and the death of Oscar Grant before him, may be but the tip of an iceberg…

Exactly why unarmed young black men are the target of choice, as opposed to unarmed young white women, or unarmed old black women, or even unarmed middle-aged college professors, is an expression of a long American tradition of suspicion and terrorization of members of those groups who have the lowest status in our society and have suffered the most extreme forms of oppression, for centuries. What’s happening now in Ferguson is the crystallization of our grief…

Read more at http://opinionator.blogs.nytimes.com/2014/11/05/what-white-privilege-really-means/