How Are Gliders Like Nuclear Deterrence?

In the years since the Cold War, the threat of imminent global thermonuclear war has receded in the popular imagination. Computer hackers are buying up abandoned missile silos. It’s been almost a decade since a major Hollywood film revolved around a U.S.-Russian nuclear exchange. But that doesn’t mean deterrence has succeeded in finally staving off nuclear war. Stanford University Professor Emeritus Martin Hellman, comparing his love of gliders with his interest in nuclear deterrence, wants to remind you that when a system is 99.9 percent safe but the remaining 0.1 percent contains an absolutely catastrophic outcome, it’s not a great system. Sound familiar?


Example #3 - birth control

Eric M. Jones

I submit that saying something is "99.9 percent safe" requires definition.


I think this is flawed in the sense that the trajectory of the glider is within the pilot's control and the 0.1% leading to his death is outside his control.

Contrast that with the catastrophic failure of nuclear deterrence. The failure still lies largely within SOMEBODY'S control, presumably someone on the other side of the world who doesn't necessarily want to see the end of society as they know it. Yes, there are factors that can't be controlled, but many factors that result in nuclear Armageddon are controllable by someone. This is a human behavior problem, not an engineering problem.

An analogy would be the earlier post about joining a gang. Let's say I take their test, it is found to be valid and reliable, and it predicts I have a 90% chance of joining a gang. Should I then rely on that myself and start buying appropriately-colored clothing in preparation for my impending gang life, even if I have no desires to be in a gang? That would be absurd.



The flip side of the problem, though, is that allowing one country to have a monopoly in nuclear weapons (something that cannot be prevented without draconian measures that would never be allowed in the current world) creates a worse outcome overall.

Yes, worse than global thermonuclear war.

If you manage to create a "nuclear free world," you end up with a massive incentive for one or another country to create a nuclear monopoly - and after that, a nuclear hegemony. Which ends up with one person (or group of people) holding the power to erase other folks who don't agree with them. And (of course) that power ends up being used. A lot.


That puts just about everything on the planet as not a great system . . . so what?

Martin Hellman

Thanks very much for picking up on my article. There's also a YouTube video that Google made when I gave a related talk there, but be warned -- it was a typical hour long seminar, so that's the length of the video. If you Google search on "youtube hellman nuclear risk" (without the quotes), it should be the top hit. And thanks for the link to Joe Nocera's January article which is very relevant.

Martin Hellman

R. Bruesewitz

I believe there are serious problems with this model. Actors in an international crisis have an incentive to bluff until the very brink of nuclear war. These types of near-misses are often described in game theory as "chicken." As the game continues, the risk becomes ever greater, until the last second, when both players have an incentive to steer away and avoid a crash.

On a related note to #5, it is quite likely that nuclear weapons prevented a non-nuclear third world war. Without them, there would probably have been a massive conventional war between NATO and the USSR — a war that would surely have caused millions of deaths and cost trillions of dollars.

I don't think that more nuclear weapons should be built, nor do I believe that they are acceptably safe. But given the alternatives, what other options are there?

"Alas! can we ring the bells backward? Can we unlearn the arts that pretend to civilize, and then burn the world? There is a march of science; but who shall beat the drums for its retreat?"

Charles Lamb (1775-1834)


Kenny Love

Yes, indeed, this does sound familiar, and is best equated with the age-old axiom: "A chain is only as strong as its weakest link."



I agree with #5, Traciatim - reality kinda limits us to less than 100%, so too bad.


ahh pot odds and Taleb...yes


The problem with the 1 percent, or even 0.1 percent, figure is that the outcome is not actually catastrophic, and people tend to round .000000000000000000000001 up to 1 percent.