Good intentions don’t justify lying about risk
DO NOT LIE ABOUT RISK: PRESENT PROBABILITIES TRUTHFULLY OR NOT AT ALL
On July 7th, 2005, four bags containing bombs were left on London public transport. They exploded, killing 52 people. Bombs in two bags at the Boston Marathon this year killed three.
We can imagine the policy-maker’s thinking when they came up with this campaign. “If we had reports of all the suspicious bags, we’d be able to stop some of these bombs from going off. But when people see a left bag, they probably think it’s nothing, and so they don’t report it. So, let’s just lie. That way, they’ll be so scared that they’ll report every bag. We’ll stop some bombs. And that justifies the lying. Here, how about this?”:
If it doesn’t feel right, it probably isn’t.
This reminds us of conversations we have had about mortgages. To prevent another meltdown, some policy-makers suggest exaggerating the risk of foreclosure to scare people into choosing less expensive houses.
At Decision Science News, we are all for getting people to report suspicious packages, to choose safer mortgages, and to exercise and eat well to safeguard their health, but we are dead set against misreporting probabilities to scare people into action.
Don’t say something will probably happen when it won’t.
Let’s look at London, in an example from Michael Blastland and David Spiegelhalter’s book The Norm Chronicles: Stories and Numbers About Danger.
Since the London attacks, some 250,000 bags have been left on London Transport.
None have turned out to be bombs.
If we define a “present era” as the eight years before and after 7/7, that’s roughly 500,000 left bags, of which four were bombs. The probability of a left bag on London Transport being a bomb is thus something like 4 in 500,000, or 1 in 125,000. If we consider all the bags left in all the large metropolises of Europe and North America over a decade, we’re talking about microscopic chances.
Not trying to be fresh here, but despite what the very well-intentioned New Jersey officials are saying above, which implies a greater than 50% probability of foul play, it’s more the case that:
If it doesn’t feel right, there’s a 0.000008 chance it isn’t.
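For the curious, here is the arithmetic behind that number, as a minimal sketch in Python. The bag and bomb counts are the rough figures quoted above, not official statistics:

```python
# Rough estimate of P(bomb | left bag) on London Transport, using the
# figures quoted above: four bomb bags on 7/7, and about 250,000 left
# bags per eight years, doubled to cover the eight years on either
# side of 7/7. These are the article's rough counts, not official data.
bomb_bags = 4
left_bags = 2 * 250_000  # ~500,000 bags over the sixteen-year "present era"

p_bomb = bomb_bags / left_bags
print(f"P(bomb | left bag) = {p_bomb}")          # 8e-06, i.e. 0.000008
print(f"About 1 in {left_bags // bomb_bags:,}")  # About 1 in 125,000
```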
Again, we’re all for reporting every suspicious bag. Perhaps the 7/7 and Boston attacks could have been prevented. And we’re not saying that “Suspicious bags: Almost certainly safe” is how the poster should read. We’re just saying that there are many ways to bring about desirable behaviors that don’t involve fibbing. For example: make it easier to report suspicious bags, provide an email address, appeal to reason, remind people of how awful it is when bombs go off, emphasize how much could have been prevented if everyone reported everything, and so on.
And we’re not against scaring people. Emotions drive decisions, and getting people to think about possible consequences can improve long-run decision making. But you can present truthful, graphic depictions of outcomes without lying about the probabilities of these outcomes. It’s the product of the probability and the outcome that counts.
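To make that last point concrete, here is a minimal sketch of the expected-value idea. The harm figure is a made-up placeholder for illustration, not an estimate from this post:

```python
# The "product of the probability and the outcome" is expected value.
# The harm figure below is an illustrative placeholder in arbitrary
# severity units, not a figure from the article.
def expected_harm(probability: float, harm: float) -> float:
    """Expected severity of a bad outcome: probability times harm."""
    return probability * harm

p_bomb = 0.000008  # ~1 in 125,000, the estimate above
harm = 1_000_000   # hypothetical severity of a bombing, arbitrary units

# A truthfully tiny probability times a catastrophic outcome can still
# justify action; there is no need to inflate the probability itself.
print(expected_harm(p_bomb, harm))  # 8.0
```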
To this, you might say “Oh, Decision Science News, don’t you realize that people are irrational, biased, myopic, self-serving, probability-neglecting innumerates who won’t do the right thing unless you make up stories to scare them?”. To this we say “No. First, assembling an ever-expanding list of so-called biases is not science. Science is proposing and testing models of the larger system that predict when these effects appear, disappear, and invert. Second, the Santa Claus approach of lying to bring about good behavior is not only dishonest but self-defeating. People will quickly learn not to trust you and will ignore all your posters and warnings.”
Improved risk literacy will help people lead happier, healthier and safer lives. But for people to become risk literate, they need accurate risk estimates, not phony probabilities that cry wolf.