Sunday, July 5, 2009

Big Decisions, Big Mistakes

Here's an interesting lecture by Daniel Kahneman, a psychologist who won the Nobel Prize in Economics. The talk is called: The Psychology of Large Mistakes and Important Decisions.

Skip the first 8 minutes, which are just two people "introducing" Kahneman. Here are the points Kahneman made that I found interesting:
  • Big and important decisions are actually the most irrational ones we make. We are good at small and unimportant decisions because we have a lot more experience with them. Big decisions are rare, so we have less expertise to draw on when making them. He mentions buying real estate (wow! that sure is relevant given the experience of the last year or two!).

  • He discusses Nixon's 1969 "decision" to bomb Cambodia. He points out that Time magazine wrote it up as a "gamble" that only history would show to be a mistake or not. He points out that this is completely wrong. A decision is based on assessing the possible outcomes and assigning probabilities to them. So a bad outcome does not mean the decision was bad. It simply means the odds fell on the wrong side in that case.

  • He discusses a decision by the Israeli Air Force to shoot down a commercial Libyan airliner. The decision maker failed to make the decision while the plane was headed toward Israel's nuclear site, Dimona. Only after it turned and headed toward Egypt, just as it was about to cross into Egyptian airspace, did the decision maker "decide" that the airplane was an enemy. Kahneman points out that in decision making you are not supposed to "decide" the state of the world. Your decision is about interpreting events and assigning probabilities. If you skip this and "decide" that the plane is an enemy, then you have to shoot it down. This was wrong because it was in fact a commercial airliner. Decisions always need to be made in the context of possible interpretations and the available tradeoffs, not by defining the "state of the world", which short-circuits decision making.

  • He points out that there are two kinds of "cognitive style". There are "foxes", who know many things but none of them really well. There are "hedgehogs", who are deep experts on one big idea but not very knowledgeable about other fields. The foxes are much more successful at making accurate forecasts. (But neither is actually all that good at forecasting; it turns out that "experts" are not much better than ordinary people.) It also turns out that being an "ideologue", i.e. a hedgehog, makes you especially poor at forecasting! (This is funny because the US media is filled with conservative ideologues -- hedgehogs -- who pontificate. But the research shows these are absolutely the worst at predicting the future!)

  • He talks about how forecasting "experts" get caught up in "hindsight bias", the belief, once something has happened, that it was more obvious than it actually was. He cites work by Philip Tetlock showing how counterfactual thinking "anchors" experts' beliefs. He uses the example of the fall of the Soviet empire. Ideologues on the right are convinced that Reagan's military buildup clearly led to the Soviet collapse. He points out that this is an example of using facts to nurse a counterfactual that reinforces their ideology.

  • In our decision making we fall prey to the fundamental attribution error: we see others' actions as driven by their "true nature", but we see our own behaviour as a reaction to events, and therefore as more varied and less obviously driven by our "true nature". So you see your enemies as plotting, with every action calculated, while your own actions are innocent because they were reactions forced on you by outside events. This biases our interpretation of behaviour.

  • He points out that big decision makers have "delusional optimism" when they make decisions. He talks about people who start new businesses. They think they "can't fail" and are ignorant of the fact that roughly two-thirds of all new businesses in the US fail in the first five years. He also talks about "final offer arbitration", where each side makes a proposal and the arbiter has to pick one of the two offers. When each side is asked for the probability that the arbiter will pick its own proposal, the two estimates generally total about 140%. Since the arbiter must pick exactly one proposal, the two probabilities should sum to 100%, so both sides are over-estimating their chances. He talks about military leaders being over-optimistic about the consequences of going to war. (He claims that no military leader can be realistic about going to war and still go to war. He even mentions Japan. But it is well known that Yamamoto, the admiral who planned the Pearl Harbor attack, knew very well that Japan was likely to lose the war because he had visited the US and realized how large, industrial, and powerful it was. So here is a case where Kahneman doesn't know his facts.)

  • People exaggerate their skills. He mentions the well-known bias that people over-estimate their abilities: 85% to 90% of people think they are above the median, when by definition only 50% can be. People also exaggerate their degree of control over situations. Decision makers don't feel that they are gamblers; they think they are "controlling" events.

  • It is funny that he talks about large organizations that now have a "risk officer" and a "risk department" to evaluate the risks the organization is facing. He points out that most organizations have the formal structure, but in reality the CEO ends up being the only real risk assessor. This is particularly funny because this is exactly why Wall Street collapsed: their risk management teams "existed" but weren't taken seriously by the CEO.

  • He gives an example of pseudo-certainty that decision makers fall victim to. Consider three cases:
    1. There is a flu that will infect 10% of the population, and we have a vaccine that will immunize you against it.

    2. There is a flu that will infect 20% of the population, but we have a vaccine that will only successfully immunize 50% of the people that get it.

    3. There are two flu strains out there, type A and type B. Each strain is expected to infect 10% of the population. There is unfortunately only a vaccine for strain A.

    He asks the "decision maker" what value they place on the vaccine in each case. People instinctively pay the most for case 1, less for case 2, and the least for case 3. But this is irrational. A proper decision maker will see that the vaccine has the same final effect in each case, i.e. it protects 10% of the population, so its value should be the same. (A quick calculation after this list makes the equivalence concrete.)

  • He talks about how decisions fail to meet the paradigm of identifying outcomes and probabilities because people ignore final outcomes in favour of thinking about gains and losses relative to where they are now. He talks about how people "frame" their valuations. He sets up two similar situations, one in which a person is to "sell" a mug and one in which a person is to "buy" the same mug. Typically people demand about twice as much to sell something as they would pay to buy it. As he points out, "giving up" something is painful. This is an example of "loss aversion".
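To make the vaccine example concrete, here is a quick back-of-the-envelope check (my own sketch in Python, not something from the lecture) of the expected fraction of the population that the vaccine protects in each of the three cases:

    # Expected fraction of the population protected by the vaccine in each case.
    # (My own illustration of Kahneman's point, not code from the lecture.)

    case1 = 0.10 * 1.0   # 10% would be infected; the vaccine protects all of them
    case2 = 0.20 * 0.5   # 20% would be infected; the vaccine works for only half of them
    case3 = 0.10 * 1.0   # strain A's 10% is covered; strain B's 10% is infected with or without the vaccine

    print(case1, case2, case3)   # 0.1 0.1 0.1 -- identical expected protection

In every case the vaccine protects an expected 10% of the population, so a rational decision maker should value it identically in all three.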
