
Today we begin the re-read of Thinking, Fast and Slow by Daniel Kahneman. Not counting the endnotes, my copy has 448 pages and comprises an introduction, 39 chapters in five parts, and two appendices. If this were a blog, the book would translate into approximately 41 separate entries, which is my current approach to the re-read (plus one week for a recap). The chapters are, on average, relatively short; however, I am reluctant to suggest at the outset that I will combine chapters during this re-read. Therefore, I am planning for this re-read to take 42 weeks. Kahneman’s writing, while engaging, is FULL of ideas that are useful for anyone who thinks of him- or herself as a leader and change agent. As I noted last week, I will need your help calling out the parts of the book that resonate with you. If you do not have a favorite, dog-eared copy, please buy one. Use the links to books in this blog to help support the blog and its alter ego, The Software Process and Measurement Cast. Buy a copy on Amazon. Now it is time to get reading!

Introduction

These days, every time I start a new book, I am reminded that once upon a time I did not read the introduction or front matter in books. I suspect there are things I still don’t know, or only learned from the school of wandering into doors at night, because of that choice. When I originally read Thinking, Fast and Slow, I was still in the habit of not reading introductions.

The central theme of the introduction is how the ideas that became the book were developed by Kahneman and his longtime collaborator, Amos Tversky (Tversky died before they probably would have jointly won the Nobel Prize). Kahneman describes the book as a “book about the biases of intuition.” Over the years, we have written and discussed biases on the Software Process and Measurement blog and podcast many times; the obsession (my wife’s words) with biases exists because biases affect both how people behave and how people make decisions. For leaders, anything that helps us understand how people will behave and make decisions is a huge advantage when guiding change.

The introduction describes biases, to paraphrase, as systematic errors that happen predictably in specific circumstances. Thinking, Fast and Slow (TFS) helped me develop an interest in decision making and reignited my interest in economics as a useful tool for describing how real people behave. Kahneman and Tversky’s work was the basis for behavioral economics. Intuition, guided by bias, is the expression of heuristics (rules of thumb are a form of heuristic) that speed up decision making. The problem is that fast and right are not always the same. Intuition, and by extension bias, works best when experience exists to help match the decision-making situation to the right bias. Without that experience, decision makers often mentally substitute a question they do have experience with in order to speed up the decision process. In today’s business world, speed is often venerated over being right. As noted in the Introduction, Kahneman and Tversky established that intuition is often at odds with observable fact. The discrepancy between intuition and observable results is one of the reasons estimates are generally low-fidelity deliverables.

One of the striking thoughts in the Introduction is the discussion of how heuristics generate biases. Understanding the causal relationship between heuristics and biases allows leaders and professionals to assess dogmatic beliefs. Dogmatic beliefs are not correct in every context. Once upon a time, it was a dogmatic belief that following a waterfall development lifecycle was the ONLY way to generate quality code. Once upon a time, it was a dogmatic belief that agile was the ONLY way to generate . . . oh, we have not yet discovered that this truth is not absolute. While questioning dogma can create huge improvement opportunities, challenging the status quo can be risky.

Next week, Chapter One

Buy a copy of Thinking, Fast and Slow by Daniel Kahneman and read along.