The Science of Successful Organizational Change

This week Steven dives into Chapter 5 of Paul Gibbons’ book The Science of Successful Organizational Change.  Cognitive biases are a topic that the Software Process and Measurement Blog has explored multiple times.  Cognitive biases are important influences on decision-making.  Gibbons’ words in this chapter have helped to crystallize our thinking on cognitive biases and logical fallacies.   Remember to use the link in the essay to buy a copy of the book to support the author, the podcast, and the blog!   – Tom

Week 7 —

Week 7 of the re-read of Paul Gibbons’ book “The Science of Successful Organizational Change” (get your copy) covers “Cognitive Biases and Failed Strategies” and is packed with useful information.  This chapter also concludes Part II of the book – Change Strategy.

Chapter 5 – Cognitive Biases and Failed Strategies

Considering VUCA (Volatility, Uncertainty, Complexity, and Ambiguity) once again, Gibbons makes the point that only in volatile situations does decision-making actually need to be quick.  We are prone to quick decisions because one of our known cognitive biases (or logical fallacies) is an action bias (ready, fire, aim).  The outcome is often taking action before understanding the core of the problem – the infamous quick fix.  Gibbons’ message for uncertainty, complexity, and ambiguity is to slow down the decision process to better understand the situation and what cognitive biases may be influencing the decision.

Why is it important to know and counteract our logical fallacies?

Gibbons presents a compelling case study in which cognitive biases contributed to British Petroleum’s huge loss and environmental disaster from the explosion of the Deepwater Horizon oil rig in 2010.  Three biases played a role:

  1. Overconfidence bias
  2. Zero-risk bias
  3. Action bias

“Here we see optimism’s dark side:  denial of expertise, anti-science thinking, mistrust of models, zero risk bias, “hope” versus Prudence, and people “trusting their gut” where hundreds of billions of dollars and many lives are at stake.” (pp. 130-131)

Cognitive Biases Affecting Change Strategy

Gibbons provides a summary of the most pertinent cognitive biases affecting business decisions (table 5.1, page 125).  Fourteen biases are summarized, along with the chapter where each is discussed (most are covered in this chapter).  These fourteen cognitive biases/logical fallacies are grouped into three categories, as follows:

  1. Perception Biases
    1. Overconfidence bias
    2. Deterministic fallacy
    3. Halo effect
    4. Ostrich bias
    5. Zero-risk bias
  2. Problem-Solving Biases
    1. Egocentric bias
    2. “Is-ought” fallacy
    3. Narrative fallacy
    4. Hyperbolic discounting
    5. Availability/confirmation biases
  3. Solution-Selection Biases
    1. Action bias
    2. Institutional bias
    3. Sunk-cost bias
    4. The planning fallacy

Counteracting Logical Fallacies

Gibbons provides some sound advice on how to counteract these cognitive biases:

  1. Be aware of them; they affect you and the people you are working with
  2. Table 5.2 Killer Questions to Combat Perception Biases (p. 134)
  3. Table 5.3 Five Killer Questions to Combat Problem-Solving Biases (p. 140)
  4. Table 5.4 Four Killer Questions to Combat Solution-Selection Biases (p. 154)

The use of questions to help people recognize their biases is significantly better than trying to directly challenge someone’s closely held beliefs. Questions are a great coach’s tool. – Tom

Three Cognitive Bias Examples

Each of these 14 cognitive biases rings true for me, but some I recognized right away in my personal life.

  1. The Ostrich bias – when I am off my diet, meaning consuming more calories than I know I should, the ONE place I avoid is the bathroom scale. However, when I am on my diet, paying attention and “winning”, I do not mind weighing myself several times a week.
  2. The narrative fallacy – we love our plans, stories, and research conclusions. I found this out not only planning projects but also in preparing for fantasy football drafts.  My narrative on how the top players would perform during the upcoming football season was shockingly off.  But before the season, and after spending hours of research, I started to believe my pre-season projections were facts and had to recalibrate my thinking very early in the football season.
  3. “Is-ought” fallacy – coming from an engineering background, I love to present facts and data. But it is emotions and values that drive decisions, as every good salesman knows. Gibbons quotes the eighteenth-century Scottish philosopher David Hume – “Reason is, and ever ought to be, the slave of the passions” (p. 139). Gibbons has an excellent point about people’s values and data – “No facts in the world will persuade a naturalist to value oil over biodiversity, nor an oilman to value birds over oil” (p. 139)

The Wisdom (or Madness) of Crowds

After Gibbons goes through these 14 cognitive biases / logical fallacies, group by group, and presents the associated “killer” questions to help counteract them, there is a discussion about groups outperforming experts — “all of us are smarter than one of us,” or the wisdom of the crowd.

A previous re-read had this same theme – “The Five Dysfunctions of a Team” (editor … link is https://tcagley.wordpress.com/2016/09/24/five-dysfunctions-of-a-team-patrick-lencioni-re-read-week-1/) specifically calls out teams that “fear (constructive) conflict,” which causes them not to arrive at the best decisions.  This supports research from Francis Galton – “the mean of social group estimates could be more accurate than those of experts” (p. 154).  However, Gibbons warns us about five social considerations and cognitive biases that can negate the “wisdom of the crowd” effect:

  1. Power differentials
  2. Anchoring effect
  3. Heterogeneity
  4. Group functioning
  5. Overconfidence
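As a side note (not from the book), Galton’s observation is easy to demonstrate with a quick simulation. The numbers below are illustrative assumptions, loosely modeled on Galton’s famous ox-weighing contest: many individually noisy guesses, once averaged, land much closer to the truth than a typical individual guess does.

```python
import random

random.seed(42)

# Assumed values for illustration: Galton's ox weighed 1,198 lbs;
# the crowd size and guess spread here are invented for the sketch.
TRUE_WEIGHT = 1198
NUM_GUESSERS = 800
GUESS_SPREAD = 100  # standard deviation of individual error, in lbs

# Each fairgoer guesses with substantial individual error.
guesses = [random.gauss(TRUE_WEIGHT, GUESS_SPREAD) for _ in range(NUM_GUESSERS)]

# The crowd's collective estimate is simply the mean of all guesses.
crowd_mean = sum(guesses) / len(guesses)
crowd_error = abs(crowd_mean - TRUE_WEIGHT)

# Compare against how far off a typical individual guess is.
avg_individual_error = sum(abs(g - TRUE_WEIGHT) for g in guesses) / len(guesses)

print(f"crowd mean error:     {crowd_error:.1f} lbs")
print(f"avg individual error: {avg_individual_error:.1f} lbs")
```

Under these assumptions the crowd’s mean error is a small fraction of the average individual error — which is exactly why the five factors above matter: anchoring or power differentials make the individual errors correlated, and averaging no longer cancels them out.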

Summary of Chapter 5

Chapter 5 presents 14 key cognitive biases that are common in a business setting, along with ideas in the form of “killer questions” on how to counteract them.

“humans are likely to make systemic mistakes, we can course correct” (p. 156)

Chapter 5 also provided ideas on how to make the “wisdom of the crowd” decisions work in an organization.

Gibbons concludes chapter 5 with two key thoughts:

  1. More research is needed so we can better understand when to make decisions by “trusting our gut” versus “all of us are smarter than one of us.”
  2. It is time for cognitive biases to become a standard part of the Change Management knowledge-base.

Question:  which cognitive bias / logical fallacy presented in chapter 5 do you recognize experiencing?

 

Previous entries in the re-read of the book The Science of Successful Organizational Change (buy a copy!)

Week 1: Game Plan

Week 2: Introduction

Week 3: Failed Change

Week 4: Change Fragility to Change-Agility

Week 5:  Governance and the Psychology of Risk

Week 6: Decision Making in Complex and Ambiguous Environments