
π•€π•¦π•žπ•žπ•’π•£π•ͺ 𝕠𝕗 ... πŸ“– Thinking, Fast and Slow Β· Daniel Kahneman (2011)

‍ ‍ ‍ ‍

part 1: two systems

  • in your mind there are 2 systems.
    • system 1 thinks without you even noticing. it drives on the highway, it sees weird shapes as letters in words, it calculates 1+1
    • system 1 does most of the thinking in our life. it is not always accurate. but it does so much that most of it ends up being at least okay, which is good.
    • system 2 is your conscious mind. it does advanced math, makes plans, and does everything else you actively think about
    • system 2 has a limit. if you are occupied with one task (e.g. counting passes in a basketball game) you might not even notice other things (e.g. a gorilla walking across the basketball court)
  • the lazy controller
    • system 2 is lazy. people who recently had to stifle their emotions do worse on a subsequent test of physical stamina ("ego depletion"). this is not because they are physically tired, but because they are mentally depleted.
    • humans are lazy. if they see a question in a quiz which has an "obvious" (but wrong) answer, most people don't check if it's actually correct, even if checking would only take a few seconds
  • the associative machine
    • we automatically (system 1) associate things. if you hear "vomit" and "banana", you think of a disgusting banana.
    • because of this you can become "primed". if you have recently eaten you are primed on food, and s**p means soup to you, not soap.
    • similarly, if you are primed on old people, you (without noticing) walk slower.
  • cognitive ease
    • if an answer seems familiar, we assume it's probably true
    • repetition is the strongest driver of belief. even if you hear "it is not true that..." many times, your brain may only retain the (false) claim itself and later remember it as true
  • norms, surprises, and causes
  • jumping to conclusions
  • how judgements happen
  • answering an easier question
    • when confronted with difficult questions we usually answer an easier question without even noticing.
    • "is this president going to be a good one?" becomes "is this president likeable to me?"
    • "how happy am i in life?" becomes "how happy am i in this moment?"

‍ ‍ ‍ ‍

part 2: heuristics and biases

  • the law of small numbers
    • which counties have the highest rates of kidney cancer? the small, rural, conservative ones. which have the lowest rates? also the small, rural, conservative ones.
    • smaller samples make extreme (very low or very high) results much more likely.
    • think of it like this: it's easy to roll 100% sixes with two dice, but hard to roll 100% sixes (the same percentage) with ten dice.
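the dice comparison can be checked directly. a minimal python sketch (the sample-average part uses made-up settings):

```python
import random

random.seed(0)

# exact chance that every die shows a six: (1/6) ** n
print((1 / 6) ** 2)   # two dice: ~0.028
print((1 / 6) ** 10)  # ten dice: ~0.0000000165

# same effect with sample averages: small samples hit extremes far more often
def extreme_rate(sample_size, trials=10_000):
    """fraction of samples whose mean die roll is <= 2 or >= 5."""
    hits = 0
    for _ in range(trials):
        mean = sum(random.randint(1, 6) for _ in range(sample_size)) / sample_size
        if mean <= 2 or mean >= 5:
            hits += 1
    return hits / trials

print(extreme_rate(2))    # extreme averages are common with 2 rolls (~1/3)
print(extreme_rate(100))  # and essentially never happen with 100 rolls
```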
  • anchors
    • when asked "do you think gandhi lived longer than 120 years?" you will guess a higher age than when asked "do you think he lived longer than 50 years?"
    • this is also true when the anchor cannot possibly be seen as a hint, for example "did he live longer than 2500 years?"
    • this is because we start at the anchor and move down/up towards our guess, instead of guessing out of the blue, like we usually would
  • the science of availability
    • we judge the frequency/etc of things by checking how many of those come to mind and how easily they come to mind (= how "available" they are in our mind)
    • usually the ease is more important than the number
    • for example, political sex scandals come to mind easily, even if there aren't many. so we may exaggerate how many scandals there really are
    • however, if we are given another explanation for the ease (for example, if we are told that the music playing will make remembering harder), we fall back on the number of instances instead
  • availability, emotion, and risk
  • tom w's specialty
  • conjunction fallacy
    • if you know "linda" is a social person who loves kids and is in the women's voters association, you might rate the probability of her being a "feminist engineer" higher than "engineer" because it fits your image of her better. however, this is a logical fallacy: "feminist engineer" adds detail to "engineer", making it a subset of "engineer", so it can only be less likely than (or at most as likely as) "engineer"
    • the more detail you add to something, the smaller its probability. however, its plausibility might be higher. (a huge wave caused by an earthquake sounds more plausible than just "a huge wave", yet it mathematically has a lower probability)
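the subset argument can be demonstrated with a toy simulation; the population and both probabilities below are made up:

```python
import random

random.seed(1)

# hypothetical population: each person is independently an engineer
# with probability 0.1 and a feminist with probability 0.4
people = [
    {"engineer": random.random() < 0.1, "feminist": random.random() < 0.4}
    for _ in range(100_000)
]

engineers = sum(p["engineer"] for p in people)
feminist_engineers = sum(p["engineer"] and p["feminist"] for p in people)

print(engineers, feminist_engineers)
# the conjunction can never be more common than either part alone:
print(feminist_engineers <= engineers)  # True
```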
  • causes trump statistics
    • we tend to believe causes (causal stories) more than plain statistics. for example, if we are told a car in a hit-and-run has a 70% chance of being blue, we believe the explanation "70% of cars involved in accidents are blue" more than "70% of all cars are blue", even though it's the same base rate
  • regression to the mean
    • success usually is based on skill and luck. skill stays the same, luck varies. that means that successes vary: good on some days, bad on others, average on most
    • that means that after a great success usually comes a worse (more average) result, and after a very unlucky event usually comes a better (more average) one, simply because average outcomes are the most likely. (average results after a great success look bad; average results after an unlucky event look great.)
    • so if you had a great success and then a worse one, don't think your skill reduced. you simply had less luck. and once you're at the bottom, the only way is up.
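this skill-plus-luck model is easy to simulate; all numbers below are arbitrary:

```python
import random

random.seed(2)

N = 10_000
skills = [random.gauss(100, 10) for _ in range(N)]  # fixed skill
round1 = [s + random.gauss(0, 20) for s in skills]  # skill + luck
round2 = [s + random.gauss(0, 20) for s in skills]  # same skill, fresh luck

# pick the top 10% of round-1 performers and compare their two rounds
cutoff = sorted(round1)[-N // 10]
top = [i for i in range(N) if round1[i] >= cutoff]
avg1 = sum(round1[i] for i in top) / len(top)
avg2 = sum(round2[i] for i in top) / len(top)

print(avg1 > avg2)  # True: the round-1 stars regress toward the mean
```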
  • taming intuitive predictions
    • to make good predictions you need to remember regression to the mean: if you think that someone who starts strong (e.g. many points in round 1 of a game) will keep going strong, you're forgetting regression to the mean. stronger-than-average players are likely to get worse (= more average), weaker-than-average players are likely to get better (= more average)
    • however, this will make it impossible for you to make extreme predictions like "this startup will be the next google". you would never predict it to happen, but in rare instances it does.
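kahneman's recipe for taming a prediction is to shrink the intuitive guess toward the mean by the correlation between evidence and outcome. a sketch (the example numbers are made up):

```python
def tamed_prediction(observed, mean, correlation):
    """move from the mean toward the intuitive prediction, but only
    as far as the evidence-outcome correlation justifies."""
    return mean + correlation * (observed - mean)

# a player scored 30 points in round 1, the league average is 20, and
# round-1 scores correlate ~0.5 with round-2 scores
print(tamed_prediction(30, 20, 0.5))  # 25.0: above average, but less extreme
```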
  • flow, halo effect

‍ ‍ ‍ ‍

part 3: overconfidence

  • the illusion of understanding
    • the halo effect: we tend to think that good things are mostly good, and bad things are mostly bad. that's why we think beautiful people are also nice, and why we think it can't possibly be true that hitler liked dogs.
    • when we hear a story we tend to connect it in a way that fits with the result, but we usually ignore (bad) luck as a factor. the more (bad) luck is involved in a story, the less we can learn from it! (example: when we hear how google came to be, it all makes sense: they were smart and made the right decisions. we will 'learn' something about how to make a company. but in reality it was more luck than anything else.)
    • when we say we "knew" what was going to happen, this is not correct. we thought that it was going to happen, and we turned out to be right, but we didn't know
  • the illusion of validity
    • even experts in their field (like political experts, or financial advisers on wall street) are only a bit better than chance at predicting the not-immediate future. their careers are mostly based on chance, not skill or knowledge. (though this doesn't mean that they don't have skill or knowledge. it's just impossible to predict the future)
  • intuitions vs formulas
    • simple formulas consistently match or beat human intuition (even that of experts), because humans add too much (irrelevant) detail and weigh it inconsistently.
    • e.g. asking a married couple how many times per week they fight will give you a better idea on the chance that they divorce than a psychologist talking with them for hours
    • if you want to evaluate job candidates, for example, do not just give them a score on how you feel about them. define ~6 categories (such as intelligence, sociability, reliability, ...) and score each (e.g. 1-5), then sum up the points. this will give you a more accurate result than going by feeling.
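a minimal sketch of that scoring procedure; the category names and ratings are invented:

```python
CATEGORIES = ["intelligence", "sociability", "reliability",
              "diligence", "communication", "technical skill"]

def total_score(scores):
    """sum independent 1-5 ratings instead of forming one global impression."""
    assert set(scores) == set(CATEGORIES), "rate every category"
    assert all(1 <= s <= 5 for s in scores.values()), "scores must be 1-5"
    return sum(scores.values())

alice = {"intelligence": 4, "sociability": 3, "reliability": 5,
         "diligence": 4, "communication": 3, "technical skill": 5}
print(total_score(alice))  # 24
```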
  • expert intuition
  • the outside view
  • the engine of capitalism

‍ ‍ ‍ ‍

part 4: choices

  • bernoulli's errors
  • prospect theory
  • the endowment effect
  • bad events
  • the fourfold pattern
  • rare events
  • risk policies
  • keeping score
  • reversals
  • frames and reality

‍ ‍ ‍ ‍

part 5: two selves

  • two selves
  • life as a story
  • experienced well-being
  • thinking about life
  • conclusion
jan 23 2022 ∞
aug 28 2024 +