Calculating calamity: Japan's nuclear accident and the "antifragile" alternative
Posted by Big Gav in black swan, japan, nuclear power
Kurt Cobb at Resource Insights has a look at the probability of nuclear accidents in light of the "black swan" view of statistical analysis - Calculating calamity: Japan's nuclear accident and the "antifragile" alternative.
Famed student of risk and probability and author of The Black Swan, Nassim Nicholas Taleb tells us that in 2003 Japan's nuclear safety agency set as a goal that fatalities resulting from radiation exposure to civilians living near any nuclear installation in Japan should be no more than one every million years. Eight years after that goal was adopted, it looks as if it will be exceeded, perhaps by quite a bit, especially now that radiation is showing up in food and water near the stricken Fukushima Dai-ichi plant. (Keep in mind that "fatalities" refers not just to immediate deaths but also to excess cancer deaths due to radiation exposure, which can take years and even decades to show up.)
Taleb writes that it is irresponsible to ask people to rely on the calculation of small probabilities for man-made systems, since these probabilities are almost impossible to calculate with any accuracy. (To read his reasoning, see entry 142 in the notebook section of his website, entitled "Time to understand a few facts about small probabilities [criminal stupidity of statistical science].") Natural systems that have operated for eons may more easily lend themselves to the calculation of such probabilities. But man-made systems have a relatively short history to draw from, especially the nuclear infrastructure, which is no more than 60 years old. Calculations for man-made systems that predict incidents only once every million years should be dismissed on their face as useless.
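To get a feel for why a short operating history cannot support a one-in-a-million-years claim, here is a back-of-the-envelope sketch (my illustration, not Cobb's or Taleb's) using the statistical "rule of three": if zero events are observed over n independent trials, the 95% upper confidence bound on the per-trial probability is roughly 3/n. The observation spans below are assumed round numbers, not figures from the article.

```python
# Rule of three: after n event-free trials, the 95% upper confidence
# bound on the per-trial event probability is approximately 3 / n.
# The year counts here are illustrative assumptions only.

def rule_of_three_upper_bound(n_trials: int) -> float:
    """95% upper bound on event probability after n trials with no events."""
    return 3.0 / n_trials

for years in (60, 10_000, 1_000_000):
    bound = rule_of_three_upper_bound(years)
    print(f"{years:>9} event-free years -> p <= {bound:.2e} per year (95%)")

# Even a million event-free years of observation would only bound the
# annual probability at about 3-in-a-million. Data on the scale of
# decades cannot validate a one-in-a-million-years safety claim.
```

Sixty event-free years, on this rough bound, are consistent with an annual accident probability as high as 1 in 20, which is why million-year figures derived from such short histories deserve skepticism.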
Furthermore, he notes, models used to calculate such risk tend to underestimate small probabilities. What's worse, the consequences are almost always wildly underestimated as well. Beyond this, if people are told that a harmful event has a small chance of happening, say, 1 in 1,000, they tend to dismiss it, even if that event might have severe consequences. This is because they don't understand that risk is the product of probability and severity.
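As a minimal sketch of that product, consider the comparison below; the probabilities and loss figures are made up purely to make the arithmetic vivid.

```python
# Risk as expected loss: probability of the event times its severity.
# All numbers here are hypothetical, chosen only for illustration.

def expected_loss(probability: float, severity: float) -> float:
    """Return risk measured as expected loss."""
    return probability * severity

common_mishap = expected_loss(probability=0.10, severity=100)      # frequent, cheap
rare_catastrophe = expected_loss(probability=0.001, severity=1e7)  # rare, ruinous

print(f"Common mishap:    expected loss = {common_mishap:,.0f}")
print(f"Rare catastrophe: expected loss = {rare_catastrophe:,.0f}")
# The 1-in-1,000 event dominates by a factor of 1,000: dismissing it
# because the probability "feels small" ignores the severity term.
```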
If the worst that walking across your room could do is cause a bruise from falling, you wouldn't think much about it. Even if the chance of getting a bruise were significant, you'd probably be careful and figure it's worth the risk. But if walking across your room subjected you to the possibility of losing your arm, you might contemplate your next move a bit more.
But the point Taleb makes is that the people of Japan did not know they were subjecting themselves to a risk this severe. If they had, they might have prepared for it, or they might even have rejected nuclear power altogether in favor of other energy sources. Both the probability and the severity of this event lay outside the models the regulatory agencies used, and that is one of the major reasons risk and severity are so often underestimated. But even if such an event had been included in the models, its consequences would most likely have been considerably underestimated.