The subconscious process is fast, always active, and effortless but not logical; the conscious process is slow, lazy, and effortful but can be logical
Limitations arise when the subconscious takes control; they result from bounded rationality. Heuristics help us avoid costly mistakes under complex circumstances that require fast resolution
``A study of the incidence of kidney cancer in the 3,141 counties of the US reveals a remarkable pattern. The counties in which the incidence rate is lowest are mostly rural, sparsely populated, and located in traditionally Republican states in the Midwest, the South, and the West. What do you make of this?''
Insensitivity to sample size: the ``law of small numbers'' according to Daniel Kahneman: people mistakenly act as if ``the law of large numbers applies to small numbers as well''
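The kidney-cancer pattern above can be reproduced with nothing but sampling noise. Below is a minimal sketch (all numbers hypothetical: an identical disease rate everywhere, three county sizes): small counties dominate both the lowest-rate and the highest-rate extremes, purely because small samples swing more.

```python
import random

random.seed(0)
p = 0.01  # assumed true incidence rate, identical in every county

# Simulate 1000 "counties" of very different populations.
counties = []
for _ in range(1000):
    pop = random.choice([100, 1000, 10000])
    cases = sum(random.random() < p for _ in range(pop))
    counties.append((pop, cases / pop))

# Rank counties by their observed incidence rate.
counties.sort(key=lambda c: c[1])
lowest, highest = counties[:20], counties[-20:]

small = lambda group: sum(pop == 100 for pop, _ in group)
print("small counties among 20 lowest rates:", small(lowest))
print("small counties among 20 highest rates:", small(highest))
```

Both extremes are dominated by the smallest counties even though the underlying rate is identical, which is exactly the trap in the quoted puzzle.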
Coin tossing: if you have seen 0101010, is the next toss more likely to produce 1 or 0?
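A quick simulation makes the answer concrete: for a fair coin, what follows the pattern 0101010 is still 50/50. This is a sketch with an arbitrary seed and sequence length.

```python
import random

random.seed(1)

# Count what follows the pattern 0101010 in a long fair-coin sequence.
flips = [random.randint(0, 1) for _ in range(1_000_000)]
pattern = [0, 1, 0, 1, 0, 1, 0]
ones_after = total = 0
for i in range(len(flips) - 7):
    if flips[i:i + 7] == pattern:
        total += 1
        ones_after += flips[i + 7]

# The fraction of 1s after the pattern hovers around 0.5.
print(total, round(ones_after / total, 3))
```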
Tool X worked great in Projects A and B, should we use it in Project C?
There are no bugs in the three modules we have examined. Maybe the code is so good that there is no bug at all?
Pareidolia, Apophenia, and clustering illusion
How did it help survival? A false negative in recognizing prey costs much more than a false positive.
All the books written about Steve Jobs, Elon Musk, Bill Gates. All the pundits talk about the great/disastrous performance of a football team
Also related: availability bias
If a presidential candidate did great in one debate, she is more likely to disappoint in the next. People raise expectations by overanalyzing her success.
All startup founders believe they can beat the well-known, extremely low odds of success.
Fortune telling is an example of people trying to make sense of random events.
We like conspiracy theories, trying to put an evil intention behind random events
Hanlon's Razor: ``never attribute to malice that which is adequately explained by stupidity''
256 villagers in a valley each try to predict a sequence of fair coin tosses.
What is the probability that at least one villager makes N consecutive correct predictions?
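A one-line formula answers the villager puzzle: each villager gets all N tosses right with probability (1/2)^N, so the chance that at least one of 256 villagers does is 1 - (1 - 2^-N)^256. For N = 8, that is already about 63%; a "perfect predictor" is nearly guaranteed by chance alone.

```python
def p_at_least_one_perfect(villagers: int, n_tosses: int) -> float:
    """Probability that at least one of `villagers` independent guessers
    predicts all `n_tosses` fair coin tosses correctly."""
    p_one = 0.5 ** n_tosses                 # one villager gets every toss right
    return 1 - (1 - p_one) ** villagers     # complement of "everyone fails"

for n in (4, 8, 12):
    print(n, round(p_at_least_one_perfect(256, n), 4))
```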
Results in the pseudocertainty effect
We are not good at rational assessment when probability is involved--->back-of-the-envelope calculation
Zero-risk bias: People are willing to pay more to eliminate a category of risk than to reduce risk more overall.
People are willing to pay more than mathematically justified for really small odds, e.g., lottery tickets
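A back-of-the-envelope expected-value check shows the gap. The figures below are hypothetical (a $2 ticket, a $100M jackpot, 1-in-300M odds) and ignore smaller prizes and taxes.

```python
# Hypothetical lottery: $2 ticket, $100M jackpot, 1-in-300M odds of winning.
ticket_price = 2.0
jackpot = 100_000_000
p_win = 1 / 300_000_000

# Expected return per ticket: about $0.33, far below the $2 price.
expected_value = p_win * jackpot
print(round(expected_value, 2), "returned per", ticket_price, "dollar ticket")
```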
If we observe H, is it A or B?
Bayesian classification (minimum error rate): It is A if P(A|H)>P(B|H)
As we are incapable of Bayesian reasoning, we approximate P(A|H) with P(H|A): if the probability of a member of A showing H is higher than that of B, we conclude it is A when we observe H. We don't bother with P(A|H)=P(H|A)*P(A)/P(H), ignoring the base rate P(A).
There are two reasons: (1) we are incapable of probability and (2) availability of data.
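A worked example shows how badly the P(H|A) shortcut can fail when A is rare. The numbers below are hypothetical: A has a 1% base rate but shows H 90% of the time, while B shows H only 5% of the time. The shortcut says "it's A", yet Bayes' rule gives P(A|H) of only about 15%.

```python
def posterior(p_h_given_a: float, p_h_given_b: float, p_a: float) -> float:
    """P(A|H) by Bayes' rule, with B as the complement class of A."""
    p_b = 1 - p_a
    p_h = p_h_given_a * p_a + p_h_given_b * p_b   # total probability of H
    return p_h_given_a * p_a / p_h

# Rare A (1% base rate): despite P(H|A)=0.9 >> P(H|B)=0.05,
# most observations of H still come from B.
print(round(posterior(0.9, 0.05, 0.01), 3))
```

With an even 50/50 base rate the shortcut works fine; it is the ignored P(A) that makes the difference.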
Examples: Rhyme-as-reason effect; Judge a book by its cover
The root of this limit is in our short-term memory and conscious system. The benefits of the heuristics include speed and low effort.
Collect, remember, and interpret information according to one's beliefs. (Why certain people favor certain TV channels.) A theory explaining confirmation bias: minimizing cognitive dissonance. Conflicts affect the coherence of our identity and our ability to work with others.
Conflict with one's belief can have three dimensions: (1) when the conflict happens: collection, recall, or interpretation of data; (2) the time scale of the belief: transient vs. long-lasting; (3) what is conflicting with the belief: facts, one's belief from the past, other people's belief.
We create a belief when we really want something to happen, then ignore data suggesting it may not happen
Quote from MMM: ``All programmers are optimists. Perhaps this modern sorcery especially attracts those who believe in happy endings and fairy godmothers. Perhaps the hundreds of nitty frustrations drive away all but those who habitually focus on the end goal. Perhaps it is merely that computers are young, programmers are younger, and the young are always optimists. But however the selection process works, the result is indisputable: `This time it will surely run,' or `I just found the last bug.' ''
Why is this so true for software developers? Overconfidence usually happens when people perceive the task as easy (recall the unique complexities of software systems we discussed early on) and perceive themselves as capable (Lake Wobegon effect).
Planning fallacy. Recall Brooks's Law: adding manpower to a late project only makes it later.
See also pro-innovation bias: promoting one's own innovation without seeing its limitations/weaknesses
Incompetence to realize one's incompetence: Dunning-Kruger effect
We create a belief even when we merely hypothesize something may happen, then ignore data suggesting it may not happen
A common source of experimenter's bias (not all experimenter's bias is caused by confirmation bias).
We create tiny, transient beliefs constantly as we receive information. Such micro-beliefs can affect how we make a decision.
When given a hypothesis, one is likely to search first for evidence that supports it--->tendency to give affirmative answers to questions--->survey questions regarding subtle issues can be biased
Information that appears early creates a micro-belief that affects the processing of subsequent information--->think about how you come to feel positive about a person, product, or paper--->anchoring
Causes: availability bias (a limit of long-term memory); serial nature of reasoning (a limit of short-term memory)
When recollecting memory, we (1) only recall things we like to remember and (2) modify them as we desire. Recall that for long-term memory, we synthesize details!
Choice-supportive bias; post-purchase rationalization instead of buyer's remorse
False consensus effect
We make quick decisions/judgements based on information readily available, without seeking out more information. The more readily available, the more influential the information is on our decision.
When there is a dearth of information, any information, even a little bit, becomes critical in our decision making, often in a misleading way.
There are many mental processes outside our conscious control.
The world shapes our mind not only at each moment but also cumulatively.
What's your favorite color? (Guess the most common answer).
We often mistake familiarity for quality, safety, or appeal. An important source of stereotypes.
The heuristic may have helped our ancestors survive: familiarity without harm (e.g., frequent encounters with a certain animal or tribe, frequent consumption of certain foods) means safety.
Availability + conflict avoidance: misguided survey questions: ``Is the President doing a fantastic job?'' vs. ``Is the President doing a good job?'' vs. ``How is the President doing her job?''
Availability + conflict avoidance: extreme opinions. We make confirming information more available by ignoring contradicting information. (Republicans like to watch Fox News while Democrats watch CNN.)
Availability + reluctance to accept randomness: conspiracy theories, books analyzing the success of Steve Jobs, political pundits explaining presidential debate performance
Availability + incapability with probability: forgetting to ask for the base rate