We are living in interesting times. All countries are struggling to control the COVID-19 pandemic. Societies are challenging their histories of racial and social injustice. Fundamental beliefs are being scrutinized. Considering how we think and understanding our cognitive biases will be essential as we adjust to this “new normal.”
What we consider to be “true” may need to be revalidated. The saying, “May you live in interesting times,” is thought to be an ancient Chinese curse. However, this is not true. It is attributed to Sir Austen Chamberlain and was popularized by Bobby Kennedy. Its origin may be the Chinese expression, “Better to be a dog in times of tranquility than a human in times of chaos” (Feng Menglong, 1627).
The human brain is both amazing and flawed. The mind is capable of deep analytic and creative thinking. However, it is also hardwired to take shortcuts because logical thinking requires effort. These shortcuts speed decision making but are a poor substitute for deep contemplation.
As our lives transform, we will need to test our existing beliefs. To do so, we must recognize that our well-worn thought patterns limit our perspective. How should we judge the legacies of our national heroes in today’s context? How do we envision a “with-COVID” future, realizing that we may never return to our pre-COVID lives?
There are over a hundred cognitive biases that influence our thinking. As we—individually and collectively—examine ourselves and our environment, we should be aware of the hidden pitfalls of blind spots, anchoring, and groupthink. Avoiding these traps will allow us to evaluate our beliefs and have more respectful and thoughtful conversations.
Blind Spots
Blind spots are gaps in our field of vision. When driving, we check them to avoid a collision. When it comes to decision-making, blind spots are things we overlook and fail to consider. The warning signs are present, but we ignore them.
Our preconceived expectations shape our performance and color our interpretation of events. If we expect something (such as an exam or big meeting) to go well, it often does. The converse is also true; dread foreshadows poor outcomes.
Optimists and pessimists perceive the outcome of the same event differently—is the glass half-empty or half-full? Optimists are more likely to see the opportunities instead of focusing on the threats. Pessimists overestimate the likelihood of unfavorable consequences and underestimate positive ones.
When confronting threats, we seek the comfort of normalcy. When facing a disaster, we commonly defer immediate action and check with others before formulating our response. I was en route to my office on 9/11. Despite seeing smoke billowing from the Pentagon, I did not immediately turn around. I kept trying to call my wife, even though the circuits were busy.
The framing or phrasing of options affects the choices we make. Studies show that college students prefer avoiding a late registration penalty over receiving a similarly sized early enrollment discount. Patients favor positively framed options even when the outcome is the same as the negative choice. Recognizing the power of framing should inform how we present information to others, as well as how we evaluate the information provided to us.
The irony of cognitive biases is that we see ourselves as less biased than others. We also identify bias in others more easily than our own. Acknowledging our blind spots creates the opportunity to approach these challenges differently.
When I managed a complex program, I recognized that I had potential blind spots. I asked a trusted and perceptive colleague to be my deputy. One of their responsibilities was to join me in critical meetings and gauge our stakeholders’ reactions. After the meeting, we would debrief and strategize.
Anchoring
A ship’s anchor keeps it in place. In an agile retrospective, we use the metaphor to uncover things that are holding us back. Anchoring biases describe our tendency to stay rooted in an established position and our reluctance to change. We naturally search for information that confirms our existing beliefs. We selectively recall data that supports our view and reject contradictory evidence.
Anchoring biases affect us personally and have broader societal implications. On an individual level, challenging the status quo is threatening. Engaging in constructive dialogue on complex issues such as racism or social justice is hard. We are reluctant to abandon discredited beliefs and tend to double down when challenged.
As a society, America is polarized and fraying because we are so anchored in our beliefs. A centrist consensus existed in Congress through the 1980s; since then, both parties have retreated into their own camps. Party affiliation now drives opinions on the coronavirus, which should be an apolitical public-health issue.
“Don’t throw good money after bad” is timeless financial advice for avoiding anchoring biases. Rather than continuing to spend on a failing project, redirect that money toward something that will be successful. We can apply the same philosophy to addressing our anchoring biases.
First, recognize that you may be biased. Next, observe yourself. If you take positions where compromise is uncomfortable, examine them. Are there alternatives you might consider? What data supports the range of possible views?
Stephen Covey’s fifth habit is “Seek first to understand, then to be understood.” Actively listening to others instead of preparing your response creates the opening to grow. Conducting your own research rather than relying on pre-packaged arguments can create powerful revelations. Quite often, the “truth” is not absolute and requires us to appreciate a range of opinions.
Groupthink
Groupthink is prevalent in all organizations and occurs when too much value is placed on harmony and conformity. Differing and controversial alternatives are shunned. Ultimately, groupthink creates a dysfunction that blinds the organization to valuable viewpoints.
Organizations suffering from groupthink see dissent as “bad.” People are labeled as “with” or “against” us. Those in the latter group are seen as a threat and marginalized along with their ideas.
Following the group is the path of least resistance. The Abilene Paradox is a lesson in group conformity in which family members join an unwanted excursion just to avoid hurting one another’s feelings. Sometimes it’s easier to jump on the bandwagon and support an idea because others do. Unfortunately, we often suspend our better judgment in favor of the majority’s desires.
Leaders set the tone for their organizations. Unwittingly, they can foster groupthink by rewarding their supporters. Leaders can minimize these effects by:
- Creating a safe space for subordinates to share their ideas;
- Encouraging others to share ideas before offering their own opinions;
- Using brainstorming and similar techniques that foster independent thought; and
- Creating teams with diverse backgrounds and perspectives.
The process of unpacking, scrutinizing, and adjusting our perspectives is difficult, but necessary. It requires a critical inspection of firmly held beliefs. Recognizing cognitive biases that inhibit this evaluation is insightful. Self-awareness can assist in this process. Revealing and understanding our biases opens the possibility of change.
© 2020, Alan Zucker; Project Management Essentials, LLC
To learn more about our training and consulting services, or to subscribe to our Newsletter, visit our website: www.pmessentials.us.
Related Project Management Essentials articles:
- Conducting a Successful Brainstorming Meeting
- Making Complex Decisions
- Project Status is Subjective: Linguistic and Cognitive Bias
- Project Status is Subjective: Status Metrics
Image courtesy of: Thought Monkey