Book Review: Thinking, Fast and Slow, by Daniel Kahneman


This is the first book on psychology that I have read, and I must say it is some heavy reading - it took me almost 5 months to finish it!! It's an amazingly well-written book, quite simple in its language, but because the content is so vast and so deep, it takes a long time to mull over it and finish it! It's written by Daniel Kahneman, a Nobel Prize winner whose work spans both economics and psychology.

The overall foundation of the book is that the human mind is not always rational. There are biases which affect how it works, and hence it's important to be aware of its failings. The book is split into 5 sections covering many of these tendencies, each section in turn split into short chapters of about 10 pages. Each chapter covers a small part of how our brain functions, detailing it with multiple examples and experiments from daily life which ring true and bring the theory to life for an amateur reader.

The book is very, very interesting, especially all the examples from daily life and of how humans make (incorrect and inconsistent) decisions (I wish I had noted more examples from the book rather than just the theory ;-) ). I also wish I could remember and take the lessons from the book into my life. We all behave irrationally and take inconsistent decisions so often that we can really benefit from understanding how to identify these tendencies and avoid the standard mistakes.

I am sharing some important titbits and insights from the book below. This is not a comprehensive summary, but it gives a good sneak peek into what the book is all about!

(Adding a huge disclaimer - this article is based on my own understanding of what is written in the book, and sometimes on 5-month-old memory. So please excuse any mistakes. They are purely mine, not the book's!!)

Part 1: Two minds - Quick and confident System 1 vs Careful and lazy System 2

- The author starts by splitting the human mind into two: 'System 1', the instinctive, fast, and reactive side of the brain, prone to biases and quick judgements, and 'System 2', the slower, more deliberate side, which thinks deeply and takes time to decide. Both parts of the brain have different roles to play in life and different areas where they exert control. System 1 is the immediate reactor, while System 2 is responsible for self-control. System 2 is, however, generally lazy and often decides not to take control - it accepts whatever System 1 gives it unless there is some reason not to. And many times, when it doesn't take over, lapses happen

- Associative thinking (also called 'priming') affects our decisions more than we would expect. For example, if you've just read the word 'EAT', you would complete SO_P as SOUP, but if you've just read WASH, it would be SOAP!! (it's so true!!!). Our mind behaves differently based on its previous experience, and we need to understand this to fully learn its biases

- We experience greater cognitive ease in perceiving a word we have seen earlier, and it is this sense of ease that gives us the impression of familiarity. A reliable way to make people believe in falsehoods is to repeat them frequently, as familiarity is not easily distinguished from truth. If something is repeated often, displayed clearly, primed (shown just prior), or if we are in a good mood, it's easier for the mind to accept the information. It then feels more familiar, truer, better, and effortless. System 1 is responsible for this ease. However, if conditions are not that comfortable, System 2 comes into play; we become more careful and hence make more correct decisions/judgements

- System 1 is also tuned to form biases and judgements based on whatever information it has, without waiting for all the information - the confident System 1 :). This is called WYSIATI - 'what you see is all there is'. It creates a whole story, ignoring the facts that are missing, and then makes judgements based on that - that is the way it is designed to function!

- The mind works like a 'mental shotgun' - it answers a lot more questions than need to be answered. This sometimes affects its answers in ways it shouldn't. For example, a lot of voters decide who they will vote for based on the looks of the candidates. They relate, say, a stronger chin to a good leader and vote on that basis!

- System 1 jumps to conclusions based on prior/recent experiences, and is quite prone to the halo effect - first positive/negative impressions cloud whatever comes next. The best way to deal with this is to de-correlate errors. For example, in a meeting, get individual thoughts on the table before discussing the ideas with everyone. If you start discussing ideas as they come up, a positive comment from anyone may affect others' perception of that idea, and then the overall decision

- Lastly, System 1 also substitutes questions that are difficult to answer with simpler questions it can answer. (System 1 doesn't like not having an answer! Substitution and WYSIATI are both examples of that!) For example, to answer the question 'how popular will the president be in 6 months?', the mind substitutes 'how popular is the president right now?'. This substitution happens automatically, based on information System 1 already has. System 2 is not even activated, as System 1 has already answered the question (though incorrectly)! Clearly, System 1 is a lot more confident than it should be, and is the cause of errors in complicated situations

- 'The pupils are the window to our soul and dilate when our mind does heavy thinking' (random quote I liked!)

Part 2 - Statistics and the (errors of the) human mind

- System 1 does not accept randomness easily and tries to find patterns/causes every time (even when there aren't any!). This is based on the human survival instinct, but it doesn't help as much in today's life

- Plausibility and probability are often confused by the brain! For example, when asked which of the following two statements is more probable - 'Jane is a teacher' or 'Jane is a teacher and walks to work' - people have a tendency to attach a higher probability to the second. However, the second statement is more plausible, not more probable!

- People learn statistics and prediction from actual individual experiences rather than from the statistical data they are taught! (Which in some way suggests that teaching psychology is a waste!) For example, assume you know that only 20% of people will try to help a choking person in a restaurant. You are then shown a video of a restaurant with a choking person and asked to predict whether his neighbour will try to help him or not. And this is done 5 times. What would you predict? Would you say in only 1 of those 5 cases that the neighbour will help the choking victim? Most probably not!

- Also, whenever the mind is asked to estimate a statistic, it bases the answer on how easily instances come to mind rather than on actual frequency. This is called the availability heuristic. For example, just after an aeroplane crash, people estimate the % of planes crashing as a lot higher than when there hasn't been a recent incident. Similarly, when asked about the % of celebrity divorces as compared to regular divorce rates, people are likely to estimate a higher % for celebrities than the actual figure, because of all the media publicity they get and how easily the mind can recall those instances

- Regression to the mean is something System 1 can't understand. It looks for causality behind changes in performance, when all it is is regression to the mean from an unusually good or bad performance. And luck is a huge part of unusually good or bad performances. For example, golfers and pilots have good and bad days, and their performance eventually regresses to the mean

- Our mind very easily gets anchored to the numbers in front of it and starts thinking of them as benchmarks. For example, if a group of people is asked the following two questions one after the other - 'Was Gandhi more or less than 144 years old when he died? How old was Gandhi when he died?' - the results would be very different than if the questions were 'Was Gandhi more or less than 50 years old when he died? How old was Gandhi when he died?'

Part 3 - (Misplaced) confidence of the human mind

- This part of the book talks about how the human mind's confidence may not always rest on the right foundation. The human mind has a tendency to draw stories and causalities when there are none. It also builds up a story of the past based on hindsight and the outcome (called outcome bias). Irrespective of what you thought would happen before an event, after the event this changes to what actually happened - and the mind thinks that is what it always predicted. One example (or disadvantage) of this is how much value we put on CEOs whose decisions went well. Success is largely luck, and only a part of it is based on the CEO's actions, but we are much affected by hindsight and outcome bias. If the outcome is positive, it's all attributed to the CEO, irrespective of what we thought before

- (Expert) confidence in predicting is not grounded in reality when it comes to investing in stocks, diagnosing future performance, etc. Experts barely know anything better than what an algorithm may predict, so their performance/expertise is not really better than a non-expert's, as the mind cannot objectively predict in such domains

- Overconfidence about plans is often rewarded heavily in organisations, even though these plans don't always perform up to expectations. Hence, doing a pre-mortem is a good way to get a realistic view of the risks involved in any major decision. When a major project has almost been approved, the organisation should get all the relevant subject experts together and ask them to answer the question - 'Imagine, a year later, that the project has failed. Write a story of how that would have happened'. This is the best way to uncover risks not previously considered

- However, it is not that experts cannot be trusted at all - they can be trusted in a stable, predictable environment where they can practise enough and get feedback soon after each decision, so they can learn from its validity. Like in firefighting. In financial investing, however, they don't get feedback soon enough and hence should not really be considered experts (the author's opinion, not mine ;-) )

- Statistical rules are superior to intuitive clinical judgements. This was tested in interviews for the Israeli army, and the results confirmed it. Going by gut feel on whether a candidate was suitable did not initially yield good results. But once the interviewers were asked to objectively measure candidates on certain dimensions, the results became much better at predicting future performance. (And I can say that in my firm too, we are never asked our opinion on hiring a candidate. We are asked to objectively rate people on different criteria, and we use a formula to come to the final result. And the results are not always the same as how we feel about the candidate!!)

- But contrary to the findings above, people have an aversion to algorithms and formulae affecting decisions in their lives. They would rather take the chance and let a mistake be a human one than accept that it was a formulaic or algorithmic one

Part 4 - Rational humans as per economic theory don't really exist!

- People's utility from additional wealth is logarithmic and also depends on the initial reference point; it is not linear, as classical economic theory suggests. So a person increasing his wealth from $100 to $200 does not experience the same utility gain as someone increasing his wealth from $1000 to $1100, even though both gained $100
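To make the logarithmic idea concrete, here is a tiny sketch of my own (the log utility function is a standard textbook simplification, not something the book prescribes) comparing the two $100 increases:

```python
import math

def log_utility(wealth):
    # A simple logarithmic utility function: u(w) = ln(w)
    return math.log(wealth)

# The same $100 gain, from very different starting points
gain_from_100 = log_utility(200) - log_utility(100)    # $100 -> $200
gain_from_1000 = log_utility(1100) - log_utility(1000)  # $1000 -> $1100

print(f"Utility gain from $100 -> $200:   {gain_from_100:.3f}")
print(f"Utility gain from $1000 -> $1100: {gain_from_1000:.3f}")
# Doubling your wealth feels far bigger than a 10% bump, even for the same $100
```

The exact numbers don't matter; the point is that the same absolute gain shrinks in felt value as the starting wealth grows.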

- For people, the pain of losing a particular amount is greater than the pleasure of winning the same amount - this is loss aversion. People behave differently when they can gain x than when they can lose the same x; their preference to gamble or not to gamble changes. (Of course, in every situation, the starting point in terms of wealth is a very important factor in a person's behaviour, and the answers below may change depending on how wealthy you are)

You can test it out for yourself - which of the options in each of the two statements below do you prefer?
Get $900 for sure, or a 90% probability of winning $1000?
Lose $900 for sure, or a 90% probability of losing $1000?
Surprised?
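What makes the reversal striking is that in each pair the two options have exactly the same expected value - a quick check (my own sketch, not from the book):

```python
def expected_value(outcomes):
    # outcomes: list of (probability, amount) pairs
    return sum(p * amount for p, amount in outcomes)

sure_gain  = expected_value([(1.0, 900)])
risky_gain = expected_value([(0.9, 1000), (0.1, 0)])
sure_loss  = expected_value([(1.0, -900)])
risky_loss = expected_value([(0.9, -1000), (0.1, 0)])

print(sure_gain, risky_gain)   # both 900.0 - yet most people take the sure gain
print(sure_loss, risky_loss)   # both -900.0 - yet most people take the gamble
```

A purely rational agent would be indifferent in both cases; most people are risk-averse with gains and risk-seeking with losses.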

- We overweight low probabilities (i.e. when one increases from 0% to 5%), holding onto hope and becoming risk-seeking in such situations. However, when a probability goes from 95% to 100%, we become risk-averse and are willing to settle for less

- Once you own something, it's tougher to give it up than if you didn't own it. Ownership leads to loss aversion kicking in, making one's choices irrational. For example, if you are asked to choose between a book and a mug, you might choose either of the two. However, if you already own the book and are asked to trade it for the mug, the number of people who would do so reduces considerably. This is known as the endowment effect

- Adding a vivid representation to an outcome reduces the role of probability in the evaluation of an (uncertain) prospect. For example, when asked the two questions below, the responses of a sample of people were different - 'How much would you be willing to pay to avoid a 21% chance of having to paint someone's 3-bedroom house?' versus 'How much would you be willing to pay to avoid a 21% chance of having to paint the dormitory toilets?'

- People respond more to frequencies than to probabilities, and this is used to manipulate people! For example, with a jury, a lawyer is likely to say 'in 1 of 1000 cases, this is defective' rather than 'in 0.1% of cases, this is defective', as the first is likely to seem larger than the second!!

- Risk taking should be done with broad framing - you win some and lose some - rather than looking at every small gamble/investment in isolation. (Think like a trader!)

- Our brain keeps a score of wins and losses. Hence we are more willing to sell stocks which made a profit than ones at a loss, even if the net effect on our wealth is the same. The sunk cost fallacy also makes people invest more into losses to keep a positive score for a particular action. And lastly, regret over losses caused by action is higher than over losses caused by inaction. For example, what would you regret more - investing in a stock and losing $100, or not investing in a stock you wanted to, which would have yielded you a $100 profit? The answer makes it clear :)

- The framing of questions has a big effect on our decisions. For example, countries where drivers are asked to opt in to donating organs in case of an accident have a 5-10% sign-up rate, whereas countries which ask drivers to opt out of organ donation have an 80-100% sign-up rate! Depending on the question you ask, people's responses differ, and this can so easily be used to manipulate people's decisions!

- Since humans don’t always behave rationally, some governmental 'nudges' towards the right decision should not be unwelcome. For example, automatic opt-ins for pension will lead to overall betterment of society even though some sides may feel this is an infringement of people's right to be completely independent and make their own decisions

- Bad events are remembered more than good events by the human brain

Part 5 - Our two selves - the Experiencing Self and the Remembering Self!

- We have two selves, the experiencing self and the remembering self, and they work differently. The actual experience and our memory of that experience are not always the same. For the memory, the duration of the experience is not important; it is based on specific moments - mainly the peak and the moments just before the end.

What this means is that when we rate an experience, say a vacation or a painful event, we rate it on the basis of the peak emotion felt (pain, happiness, etc.) as well as the feeling just before the experience ended. Say a surgery that was painful for 2 hours, but whose last 30 minutes were less painful, is compared to a surgery that lasted only an hour but was equally painful the entire hour - the human mind would rate the first as the less painful surgery!! This is because the remembering self remembers the peak pain (which was the same) and the end (which was less painful in the first case) and forms its memories accordingly!
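The surgery comparison can be sketched as a simple calculation. This is my own illustration (the function names, the 0-10 pain scale, and the "average of peak and end" formula are simplifying assumptions, not the book's exact model):

```python
def remembered_pain(pain_over_time):
    # Peak-end rule: memory ~ average of the worst moment and the final moment
    return (max(pain_over_time) + pain_over_time[-1]) / 2

def experienced_pain(pain_over_time):
    # The experiencing self: total pain actually endured, so duration matters
    return sum(pain_over_time)

# Pain level per 30-minute block, on a 0-10 scale
surgery_a = [8, 8, 8, 3]   # 2 hours, milder final 30 minutes
surgery_b = [8, 8]         # 1 hour, equally painful throughout

print(remembered_pain(surgery_a), remembered_pain(surgery_b))    # 5.5 8.0
print(experienced_pain(surgery_a), experienced_pain(surgery_b))  # 27 16
# Surgery A involved more total pain, yet is remembered as the less painful one!
```

The experiencing self endured more in surgery A, but the remembering self, blind to duration, rates it better because of its gentler ending.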

This was quite eye-opening for me. And the learning (though I am not sure I agree with it) is that 'We should maximise the quality of our future memories, not necessarily of our future experiences!'

- Well-being, again, is not necessarily the same as how happy one is. Well-being is spending as much time as possible on things we enjoy, even if what we remember later is a modification of that experience!

- There were a couple of good insights/quotes in this section based on their research:
'Happiness/well-being comes from the experience of spending time with people you love and who love you'
'Being poor makes one miserable but being rich only enhances one's satisfaction with life, not really improving the experience of well-being'
'After a $75k income, there is no further increase in well-being with increasing income, because higher income reduces the time to enjoy the smaller pleasures in life!'
'Our remembering self has a bias to fear a short period of intense but tolerable suffering more than a longer period of moderate pain'

-----------

I would rate this book as a great one. If you found this article interesting, go get a copy for yourself. It has a lot more to offer than what I was able to cover in this 'short' article :)


