Book Review: Thinking, Fast and Slow

Thinking, Fast and Slow

Author(s)

Daniel Kahneman


My Rating

★★★★★ (5/5)

 

One topic that compels me in almost equal measure to programming is psychology. If I'm going to give up my free time to do something other than code, read about code or talk about code, it's probably to learn about psychology. Cognitive psychology, and the effect emotions have on our decision making in particular, has always been the niche that excited and fascinated me the most, as some of my old blog posts will attest.

Accordingly, Thinking, Fast and Slow by Daniel Kahneman was a book I was always going to indulge in, even putting my coding cravings aside for it. And with good reason - this book is a classic. Not just for the psychology-intrigued like myself; in fact, more so for the people who aren't aware of our emotion-driven irrationality.

The book is a masterpiece because it comprehensibly delivers key human-behavioural insights from dense psychological literature into the mainstream. Of course, bringing terms into pop-psychology culture opens them up to misinterpretation, hype and commercialisation, but that's business as usual in the fad-churning programmer community.

Be sure to take a look at the Biases and Effects table at the end of this article for a reference covering many of the key findings presented in Thinking, Fast and Slow.

 

System 1 And System 2

Oversimplifying somewhat, current thinking in cognitive psychology is that the mind has two disparate agents guiding us through each moment. One of those agents, the much-maligned system 1, is a rapidly-responding pattern matcher that pressures us into instinctive decisions, and in so doing preserves precious mental resources.

Unfortunately, the quick-fire, instinctive actions of system 1 can be lazy and troublesome. Our success, happiness and survival also depend on the more considered and careful nature of system 2.

System 2 consumes more resources, according to Kahneman, but guides us into better decision making. If we used system 2 all the time, we wouldn't be able to respond instantly to dangers in our environment and we would need more recuperation.

The entire book tells the story of how system 1 and system 2 are the basis of many irrationalities baked into the human mind - ubiquitous irrationalities that occur effortlessly in our everyday behaviour. By reading this book and acknowledging them, you can try to mitigate the negative impacts they have on your life.

 

Cognitive Ease, Confirmation Bias And Theory-Induced Blindness

Of the many biases, heuristics and quirks of the evolution-shaped human mind, three stand out the most. I've detailed most of those from the book in a table at the bottom of this review; I'll discuss the three standouts here, but have a nosey at the table if you want a brief introduction to the others. You probably know many of them already, if not by name.

A primary teaching of Thinking, Fast and Slow is cognitive ease: if something feels right, people will often follow that instinct, believing it is right. Perhaps you're writing some code and feel an instinctive urge like: "I need to inherit class X here". Quite possibly you don't. Just because it felt right doesn't mean it is right.

To prove the questionable impact cognitive ease has on your life, I encourage you not to jump straight into an implementation next time you are writing code. Instead, write down the problem and list out the possible solutions, then detail their pros and cons. I've been doing this for years and I am continually shocked at how often my gut instinct was leading me into sub-optimal decisions.
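
To make the inheritance example concrete, here is a minimal sketch of the kind of decision I mean (the class names are hypothetical and entirely my own; the book contains no code). The instinctive answer subclasses; writing the options and their costs down often reveals that composition is cheaper:

```python
# Instinctive (system 1) answer: the report emailer IS-A SMTP client.
# It feels right, but it couples report logic to one delivery mechanism.
class SmtpClient:
    def send(self, recipient: str, body: str) -> None:
        print(f"SMTP -> {recipient}: {body}")


class InheritedReportEmailer(SmtpClient):
    def send_report(self, recipient: str, report: str) -> None:
        self.send(recipient, report)


# Considered (system 2) answer, after listing pros and cons: compose with
# a sender instead, so delivery can change without touching report logic.
class ComposedReportEmailer:
    def __init__(self, sender: SmtpClient) -> None:
        self.sender = sender  # explicit, replaceable dependency

    def send_report(self, recipient: str, report: str) -> None:
        self.sender.send(recipient, report)


ComposedReportEmailer(SmtpClient()).send_report("a@example.com", "Q3 report")
```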

Then there's confirmation bias to prove how irrational you really are. People subconsciously look for information that confirms their view of the world; in fact, Kahneman claims, they interpret information to fit that view. Keep this in mind, but don't keep score of how many times you catch yourself falling for confirmation bias, because you will hate yourself.

If those two fundamental traits of human behaviour do not make you realise how fallible you are, then theory-induced blindness should. Kahneman discusses how, after many years of being treated as undeniable gospel by the upper echelons of academia across disciplines, a simple observation proved expected utility theory to be a fundamentally flawed view of human behaviour (though not an entirely wrong one). What is most striking is that the most intelligent people could not see such an obvious flaw, and even then they actively disregarded it and continued their unshakable belief in expected utility theory.
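
The observation, as the book presents it, is that Bernoulli-style expected utility attaches value to final states of wealth, while real people respond to changes relative to a reference point. A toy calculation (my own illustration, not Kahneman's) makes the blind spot visible:

```python
import math


def bernoulli_utility(wealth: float) -> float:
    """Expected utility theory: value attaches to total wealth alone."""
    return math.log(wealth)


# Two people both end up with 2 million today, but one started with
# 1 million (a thrilling gain) and the other with 4 million (a painful loss).
outcome = 2_000_000
for starting_wealth in (1_000_000, 4_000_000):
    print(f"started with {starting_wealth:>9,}: "
          f"utility of outcome = {bernoulli_utility(outcome):.3f}")

# The theory assigns both people identical utility because it has no concept
# of a reference point - the flaw that sat in plain sight for so long.
```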

Theory-induced blindness highlights a striking weakness in human cognition: all of us, however intelligent, are incredibly vulnerable to being unthinkably wrong.

Mostly a Masterpiece

Although I described this book as a masterpiece, I still had a few odd annoyances with it.

For the most part, Kahneman's conversational, first-person writing persona flows smoothly, despite being a bit verbose at times. I also thought Kahneman did well to delineate opinion from fact, though in some places he did not; his discussion of regression to the mean, for example, feels a bit like preaching, without enough justification and insight to convince the reader.
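
To be fair, the underlying statistics are sound and easy to demonstrate for yourself. In this minimal simulation (my own sketch, not from the book), each score is a stable skill plus random luck, and the worst performers on one attempt improve on the next with no coaching involved:

```python
import random

random.seed(42)
N = 10_000

# Score = stable skill + fresh, independent luck on every attempt.
skill = [random.gauss(100, 10) for _ in range(N)]
attempt1 = [s + random.gauss(0, 10) for s in skill]
attempt2 = [s + random.gauss(0, 10) for s in skill]


def average(xs: list[float]) -> float:
    return sum(xs) / len(xs)


# Select the bottom 10% on attempt 1, then watch the same people again.
cutoff = sorted(attempt1)[N // 10]
worst = [i for i in range(N) if attempt1[i] <= cutoff]

print(f"worst 10% on attempt 1: {average([attempt1[i] for i in worst]):.1f}")
print(f"same people, attempt 2: {average([attempt2[i] for i in worst]):.1f}")
# The second number is higher purely because extreme bad luck rarely repeats.
```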

Much of the final part of the book is dominated by economics. A lot of the chapters seemed to blend into each other, and I found myself skipping the odd few pages because I wasn't enjoying what appeared to be plain repetition. But perhaps that was just me.

The first part of the book is almost untouchable, though. If you don't find the time and effort to read it, you'll probably never know how fallible you really are.

 

Biases and Effects

Term Definition Chapter
Affect heuristic If you have positive feelings about something you will perceive its benefits to be greater, but if you feel negative you will perceive the benefits to be less. 9
Anchoring effect The first number or piece of information we see sets an expectation. For example, negotiating a salary can be influenced by the first offer. Even subliminal, unrelated values can cause anchoring. 11
Associative activation / associative coherence Priming effects - when we see or experience something we activate related information. This can cause us to reinforce our beliefs by looking for information that aligns with our existing mental connections. 4
Availability cascade Self-sustaining events that snowball into major events and appear more important than they are. 13
Availability heuristic The ease with which instances of an event come to mind makes you overestimate its frequency. For example, you saw a car crash and became fearful of driving because you think a car crash is likely to happen to you. 12
Cognitive ease The faster an answer or thought comes to mind, the greater our confidence that it is correct. This is an example of an emotional decision driven by system 1; however, instinctive thoughts are often misleading and can lead to poor decision making. 5
Confirmation bias Interpreting events based on past experience - trying to make the world conform to our existing beliefs by acknowledging, seeking or interpreting information that supports our beliefs and rejecting or ignoring information that does not. 7
Conjunction fallacy People rate more specific events as more likely than general ones - a logical error, illustrated by the Linda problem. 15
Disposition effect A manifestation of loss aversion where investors would rather sell stocks that have made a profit than ones that have made a loss, even if it is a sub-optimal decision overall. 32
Duration neglect People judge experiences based on the ending or the moment of peak intensity rather than the overall experience. 35
Ego depletion Effort or self-control is tiring. It's difficult to perform multiple successive tasks (fatigue), even if they are of different kinds - e.g. mental, physical. Ego-depleted people are more likely to quit trying and let system 1 run the show. 3
Exposure effect Seeing something more frequently makes it seem more likeable - even things that are not consciously perceived. 5
Framing effect Different ways of presenting the same information evoke different memories or responses. 7
Halo effect Tendency to like or positively interpret everything about a person and what they do, based on the things we already like about them. 7
Hindsight bias Revising what we believe we thought in the past based on what actually happened - and not remembering our original belief. 19
Intensity matching Quantifying things of different types against each other, e.g. sadness and money. This can lead to poor predictions. 8
Lady Macbeth effect Feeling dirty on the inside makes people want to clean themselves on the outside. 4
Law of least effort People often choose the option that requires the least effort. This can be efficient, but it can also be lazy and prone to error. 3
Loss aversion We are driven more strongly to avoid losses than achieve gains, leading to sub-optimal, statistically-poor decisions 28
Mental shotgun It is hard to direct system 1 to a single, precise judgement because it computes more than we intend or need. 8
Moses illusion In a statement about Noah's ark, people do not notice that Moses is mentioned in place of Noah. This highlights how we can overlook incorrect but semantically similar details. 6
Narrative fallacy People try to explain events with stories that make them sound rational and predictable, even when they are not. 19
Outcome bias Decisions are judged on their outcomes, not the quality of decision making.  19
Planning fallacy Estimates sit closer to a best-case scenario than a realistic one, even in situations where statistical evidence could have been beneficial - probably because the factors that can lengthen an estimate are too numerous to quantify. 23
Possibility effect The fact that something is even possible leads to irrational weighting of its probability. 29
Positive test strategy Deliberately searching for evidence that supports an idea rather than rationally evaluating it. 7
Preference reversal People can illogically change their preference for something when they gain further details, highlighting the differences between decisions made by systems 1 and 2. 33
Probability neglect Emotional reaction overpowering rational likelihood of an occurrence. e.g. you saw a car crash and became scared to drive because you thought it was going to happen to you. 13
Representativeness heuristic Making a judgement based on our mental perception rather than the logical likelihood 14
Regression to the mean Deviations in performance can be explained by normal fluctuations rather than causal influences, e.g. an athlete's improved score may be explained by a return to usual performance levels - not their trainer berating them after a previous below-average attempt. 17
Subjective confidence Being confident in times of uncertainty due to strong coherent feelings which may be misleading. 20
Sunk costs fallacy The preference to keep investing in a loss in the hope it will come good, even when the chances are small. A manifestation of loss-aversion. 32
Theory-induced blindness Once you accept a theory and begin to apply it, you fail to see its weaknesses. This shows difficulty in un-believing. 25
What you see is all there is (WYSIATI) System 1 quickly jumps to conclusions based on available evidence and does not take time to consider what may not currently be in mind. To avoid problems of this nature, try to engage system 2. 7
