
Daily life seems to be a combination of action and inaction: getting things done, or listening to people tell you about what they have done. The constant is that everyone is keeping score. Perhaps not consciously, but everyone has a set of mental accounts. Chapter 32 explores how those tallies shape preferences and motivate actions.

The math behind our mental accounting is predominantly handled by System 1 thinking. Something happens, and our fast-thinking component (with all of its biases) slots everything into T-accounts of debits and credits (my mental picture). In my undergraduate education, I took a lot of economics classes (in fact, I ended up having to take a graduate class or two to fit in all my electives), and we learned that people were rational and that utility theory ruled. The idea that mental accounts affect economic behavior and decision making conflicts with utility theory. In the book, Kahneman uses the example of two people with tickets to an event that they both really want to attend. One ticket was purchased and the other was a gift. If a blizzard makes getting to the event treacherous, utility theory would have both people write the tickets off as a sunk cost. Experimental data shows something different: the person who paid for the ticket feels more negatively than the one who was given the ticket and is more apt to brave the storm. Different debits to their mental accounts generate different behavior.
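The T-account picture above can be sketched as a toy program. To be clear, this is purely an illustration of the metaphor, not anything from the book: the `MentalAccount` class, the account names, and the loss-aversion weight of 2 (a round number in the neighborhood of published prospect-theory estimates) are all my own assumptions.

```python
# Illustrative toy only: mental accounts as T-accounts of debits and
# credits, with losses weighted more heavily than equivalent gains.

class MentalAccount:
    LOSS_AVERSION = 2.0  # assumed weight: losses loom ~twice as large as gains

    def __init__(self, name):
        self.name = name
        self.debits = []   # perceived losses and costs
        self.credits = []  # perceived gains

    def debit(self, amount):
        self.debits.append(amount)

    def credit(self, amount):
        self.credits.append(amount)

    def felt_balance(self):
        # The "felt" score weights debits more heavily than credits.
        return sum(self.credits) - self.LOSS_AVERSION * sum(self.debits)

# The blizzard example: both tickets are equally worthless (a sunk cost),
# but only the buyer carries a debit in their mental account.
buyer = MentalAccount("paid for ticket")
buyer.debit(100)  # out-of-pocket ticket price
gift = MentalAccount("received ticket as gift")

# The buyer's felt balance is worse, so the buyer is more apt
# to brave the storm to "recover" the debit.
assert buyer.felt_balance() < gift.felt_balance()
```

Utility theory would score both accounts identically once the money is spent; the toy captures why the two people nonetheless behave differently.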

An example of the impact of mental accounting is often seen when a basic agile premise goes haywire. The agile mindset embraces experimentation. Unless practitioners are careful, experiments can clash with the mental accounting and loss aversion biases of System 1 thinking. Trying new things to improve flow and value delivery can have two outcomes: it works or it doesn't. As we have seen, people tend to be loss averse; if you suggested or invested in the change, it is easy to see how System 1 would supply the inputs to keep trying even when it is obvious the idea has failed. The better answer is to call the experiment, capture what was learned, and try something else.

Kahneman also tackles the idea of regret in this chapter. He defines regret as "occurring when there are alternatives to reality." The common colloquialism "coulda, shoulda, woulda" is useful for describing a scenario in which regret rears its ugly head. Regret is just another form of sunk cost, with the added complication that you can anticipate it. The ability to anticipate the potential for regret negatively biases decision making and tends to reinforce the status quo. "When you deviate from what is normal you can easily imagine the norm, and if that default is associated with bad consequences, the discrepancy between the two can be the source of painful emotions." The research cited in this chapter shows that people feel more regret when their actions lead to a negative outcome than when their inaction leads to a less-than-perfect outcome. The term used to describe this bias is the asymmetry of the risk of regret. This bias reinforces the status quo.

Process improvement, which is to say change, is difficult in most circumstances. Anticipated regret, a thumb on the scale of leaders' mental accounts, has stopped many change programs. When you are trying to build the impetus for a change, you will need to tackle the fear of regret head-on or risk being shut down.

Remember, if you do not have a favorite, dog-eared copy of Thinking, Fast and Slow, please buy a copy. Using the links in this blog entry helps support the blog and its alter ego, the Software Process and Measurement Cast. Buy a copy on Amazon. It's time to get reading!


The previous installment of Re-read Saturday:

Week 31: Chapter 31: Risk Policies

Or start at the beginning with:

Week 1: Logistics and Introduction