Reviewing Eric Schlosser's excellent book, Command and Control, Arthur Herman writes in the Wall Street Journal that Mr. Schlosser's jeremiad is based on an irrational fear belied by the evidence at hand.
Herman asks: If having 10,000 nuclear bombs on missiles, in airplanes and in submarines is so dangerous, then why have we not had more accidents?
Mr. Herman finds the very fact that there was only one case in which a nuclear bomb blew up in a silo (in Arkansas), and the nuclear part of the bomb did not detonate because safety systems worked, very reassuring. Having a large number of bombs, any one of which could level a state and contaminate the entire East Coast in a way that would make Fukushima look like a garden party, is no problem for Mr. Herman.
This review, from a conservative professor (a fellow at the American Enterprise Institute) writing in a conservative newspaper (the WSJ), says all that needs to be said about the capacity of the human mind to use the mechanism of denial to achieve a sense of comfort and to validate dearly held beliefs.
For any normal human being reading this book, with its long list of accidents and narrowly avoided catastrophes, this catalog is a cautionary tale. It would be a cautionary tale for anyone not welded to an ideology, an ideology which says that private enterprise and big corporations feeding on the government teat are not just works of man but works of God.
For anyone who works in medicine, or in any of a variety of technical or engineering realms, the experience of things going wrong is very familiar. Things go wrong on the ward, doing procedures, doing surgery, all the time. It's part of the experience of working with mechanical things. Things happen.
In the case of surgery, you inadvertently nick an artery, you put your finger on the puncture, hold it, maybe throw in a suture to stop the bleeding, and you move on. In the case of a nuclear bomb, it's not so easy; the consequences are enormous. The chance to correct, to retrieve, to adjust is simply on a different level when it comes to a bomb that can level the state of Arkansas. In the case of the Arkansas missile, they nicked a fuel tank, put the entire state at risk, and there was precious little they could do to avert catastrophe. They quite literally flew on a wing and a prayer.
For the Wall Street Journal, for Professor Herman, this book is a screed, a polemic, a call for fuzzy-minded liberal efforts to reduce the number of nuclear bombs around the planet, and that would be bad for business.
From the point of view of an engineer or a surgeon, this is a simple matter of numbers: you have a certain number of bombs, things go wrong with those bombs at a certain rate, an incidence, and if you multiply the number of bombs by that incidence, you get an expected number of occurrences. We haven't reached that occurrence quite yet, but it's coming.
Anyone can see this. Only the willfully blind cannot.
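The arithmetic behind that point can be sketched in a few lines. The only figure below taken from the review is the 10,000-bomb arsenal; the per-bomb accident rate and the time window are hypothetical placeholders, chosen purely to illustrate how the multiplication works, not estimates from Schlosser's book.

```python
import math

# All rates here are hypothetical placeholders for illustration only.
n_bombs = 10_000                 # arsenal size cited in the review
p_serious_per_bomb_year = 1e-6   # hypothetical chance of a serious accident, per bomb, per year
years = 50                       # hypothetical observation window

# Expected number of serious accidents over the window:
expected = n_bombs * p_serious_per_bomb_year * years

# Probability of at least one such accident, treating them as
# independent rare events (Poisson approximation):
p_at_least_one = 1 - math.exp(-expected)

print(f"expected accidents: {expected:.2f}")      # 0.50 with these placeholder numbers
print(f"P(at least one):   {p_at_least_one:.3f}")  # about 0.393
```

Even with a one-in-a-million annual failure rate per bomb, the sheer number of bombs and years drives the cumulative risk to something no surgeon or engineer would call negligible.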













