Saturday, December 10, 2011

Rational Thinking

Here is an example of using Bayesian reasoning to wriggle between the extremes of ignoring a possible problem and giving oneself over to despair. It comes from Tim Harford's blog, where he has allowed "Sophy" to post:
You’re a woman in your early fifties. You’re invited to a breast cancer screening unit, and you go along hoping for the all-clear. After all, 99 per cent of women your age do not have breast cancer. But … the scan is positive. The screening process catches 85 per cent of cancers. There is a chance of a false alarm, though: for 10 per cent of healthy women, the screening process wrongly points to cancer. What are the chances that you have breast cancer?

Over 50,000 British women face this awful question each year. I first encountered it – in a less alarming context – as an undergraduate economist. And I was in the audience recently when David Spiegelhalter used it as an example in his Simonyi Lecture, “Working Out the Odds (With the Help of the Reverend Bayes)”. The numbers approximately reflect the odds faced by women who go for breast cancer screening. And the answer – courtesy of the Reverend Bayes in question, who died 250 years ago – is surprising.

Bayes was concerned with how we should understand the notion of “probability”, and how we should update our beliefs in light of new information.

A Bayesian perspective on the apparently grim screening result tells us that things are not as bad as they seem. The two key pieces of information point in different directions. On the one hand, the positive scan substantially worsens the odds that you have cancer. But on the other, the odds are worsening from an extremely favourable starting point: 99 to 1 against. Even after the positive scan, you still probably don’t have cancer.

Imagine 1,000 women in your situation: 990 do not have cancer, which means we can expect 99 false positives, far more than the 10 women who do have cancer. This is why any apparent sign of cancer should be followed up with further tests in the hope of avoiding unnecessary treatments. The chance that you have cancer is 9 per cent – up dramatically from 1 per cent, but with plenty of room for optimism.
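To check the column's arithmetic, here is a short Python sketch of my own (not Harford's) applying Bayes' theorem to the quoted figures. One caveat: the 9 per cent comes from the simple tally of 10 cancers against 99 false positives; applying the 85 per cent detection rate exactly gives closer to 8 per cent, which leaves the conclusion unchanged.

```python
# Bayes' theorem applied to the screening figures quoted above.
p_cancer = 0.01             # 1% of women this age have breast cancer
p_pos_given_cancer = 0.85   # screening catches 85% of cancers
p_pos_given_healthy = 0.10  # 10% of healthy women get a false positive

# Total probability of a positive scan (with or without cancer):
p_pos = (p_pos_given_cancer * p_cancer
         + p_pos_given_healthy * (1 - p_cancer))

# Posterior probability of cancer given a positive scan:
p_cancer_given_pos = p_pos_given_cancer * p_cancer / p_pos

print(f"P(cancer | positive) = {p_cancer_given_pos:.3f}")  # 0.079, about 8%
```

In the 1,000-women picture, that is 8.5 detected cancers among 8.5 + 99 = 107.5 positive scans.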

None of this proves screening is pointless. It can save lives, but it raises dilemmas. The UK’s breast cancer screening programme is currently under review. A systematic analysis published by the Cochrane Collaboration found that for every woman who had her life extended by early detection and treatment, there would be 10 courses of unnecessary treatment in healthy women, and more than 200 women would experience distress as the result of a false positive.

Bayesian reasoning has implications far beyond cancer screening, and we are not natural Bayesians. Daniel Kahneman, a psychologist who won the Nobel memorial prize in economics, discusses the issue in a new book, Thinking, Fast and Slow. I recently had the opportunity to quiz him in front of an audience at the Royal Institution in London. Kahneman argues that we often ignore baseline information unless it can somehow be made emotionally salient. New information – “possible cancer” – tends to monopolise our attention.

Another example: if somebody reads the Financial Times, should you conclude that they are more likely to be a quantitative analyst in an investment bank, or a public sector worker? Before you leap to conclusions, remember that there are six million public sector workers in the country. Base rates matter.
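To see how the base rate dominates here, consider a toy calculation. The six million public sector workers comes from the column; the number of quants and the FT-reading rates below are purely illustrative assumptions of mine.

```python
# Toy base-rate calculation; every figure except the 6,000,000
# public sector workers is a made-up illustrative assumption.
quants, p_ft_quant = 10_000, 0.90      # assume 10,000 quants, 90% read the FT
public, p_ft_public = 6_000_000, 0.05  # assume 5% of public sector workers do

ft_quants = quants * p_ft_quant    # expected FT-reading quants:    9,000
ft_public = public * p_ft_public   # expected FT-reading public:  300,000

# Even with a strong likelihood ratio in favour of quants (90% vs 5%),
# the sheer number of public sector workers swamps it:
print(ft_public / ft_quants)  # ~33 FT-reading public sector workers per quant
```

On these made-up numbers, an FT reader is still over thirty times more likely to be a public sector worker than a quant; the base rate, not the stereotype, does the heavy lifting.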

Sometimes there is no objective base rate and we must use our own judgment instead. I think homeopathy is absurd on theoretical grounds; others find it intrinsically plausible. Bayesian analysis tells us how to combine those prior beliefs – or prejudices – with whatever new evidence may come along.
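The odds form of Bayes' theorem makes that combination mechanical: posterior odds equal prior odds times the likelihood ratio of the new evidence. Here is a small Python sketch with numbers I have invented purely for illustration, showing a sceptic and a believer updating on the same trial result.

```python
# Same evidence, different priors: the odds form of Bayes' theorem.
# Hypothetical numbers: a sceptic gives homeopathy prior odds of
# 1:10,000; a believer gives even odds. Suppose a trial result is
# 5 times more likely if homeopathy works than if it does not.
likelihood_ratio = 5.0

for name, prior_odds in [("sceptic", 1 / 10_000), ("believer", 1.0)]:
    posterior_odds = prior_odds * likelihood_ratio
    probability = posterior_odds / (1 + posterior_odds)
    print(f"{name}: posterior odds {posterior_odds:.4f}, "
          f"probability {probability:.2%}")
# sceptic:  posterior odds 0.0005 -> about 0.05% (still very unlikely)
# believer: posterior odds 5.0    -> about 83%
```

The machinery is identical for both; only the starting prejudice differs, which is exactly the point above.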

Whenever you receive a piece of news that challenges your expectations, it’s tempting either to conclude that everything has changed – or that nothing has. Bayes taught us that there’s a rational path between those two extremes.
The value of mathematics, logic, and rational thinking is that they give you tools to work through the muddle of real life. Even simple models can provide a guide in a murky area. But the trick is always to remember that these tools are idealizations: the actual world isn't mathematical, logical, rational, or fully capturable in a model.
