
 


 

The Limits of Safety

by Scott D. Sagan

 

When Scott D. Sagan began writing The Limits of Safety, he was confident in superpower control of nuclear weapons. He is less confident now.

 

Sagan describes spirals of the unexpected. When it comes to nuclear war, all it takes is one error or series of errors. During the Cuban missile crisis, a guard in Minnesota saw what he took to be a saboteur and shot at it. Sabotage alarms blared at other bases, but at an air force base in Wisconsin the wrong alarm went off--the alarm signaling all-out nuclear war. The scrambled planes were stopped by an officer who drove his car onto the runway. (They could have been called back once in the air.) The so-called saboteur: a bear. Studying the history of U.S. nuclear forces, Sagan was shocked by the number of near misses and cover-ups.

 

Sagan argues that in the United States, and probably Russia, overconfidence reigns. The absence of any inadvertent nuclear war so far has produced unwarranted complacency. The Department of Defense claimed in 1980 that "there is no chance" that ambiguous computer information could lead to war.

 

(Pretending probabilities are zero or 100 percent seems to be common among politicians and military leaders.)

 

Sagan cites numerous examples of gross safety errors. In 1963 a bomber pilot accidentally turned on one of the two switches needed to arm a nuclear weapon. Fearing punishment for his mistake, he deliberately flipped the second switch, arming the weapon, in the hope of plausibly convincing his superiors that a saboteur had armed it. American aircraft have inadvertently dropped several unarmed nuclear weapons. In 1962 the crew of a B-52 made a navigational error and flew to within 300 miles of Soviet airspace before Americans discovered the mistake and ordered the plane to change course.

 

During the Cuban missile crisis, a test ICBM was launched without high-level permission. JFK put it thus: "There is always some son-of-a-bitch who doesn't get the word."

 

Nor are these events limited to 30 or 40 years ago.

In 1995 (after this book was printed) the Russians mistook a Norwegian weather rocket for a possible incoming missile.

 

Sagan examines two competing safety theories, the normal accidents theory and the high reliability theory. Proponents of high reliability theory posit that systems can be made safe through redundancy and a heavy emphasis on safety. They decentralize authority so that those closest to a problem can make flexible decisions. They recruit disciplined, well-trained members. They learn through simulation and trial-and-error. (Nations use high reliability theory to manage nuclear weapons.)

 

An example of a redundant system: if you are prone to locking your keys in your car, you keep a spare key in your wallet and another hidden under the car. Redundant systems are not as safe as they appear. Locks jam. Keys get lost or are not put back. Magnetic key holders fall off. When a system contains thousands of parts, multitudes of potential incidents exist. Odds that individuals assume are minuscule are actually much higher.
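
A bit of arithmetic, mine rather than Sagan's, makes the point concrete: even when each part almost never fails, the chance that at least one of thousands of parts fails is large. A rough Python sketch, assuming independent failures:

    # Rough illustration (my numbers, not Sagan's): small per-part failure odds
    # compound across a system with many parts.
    def prob_at_least_one_failure(p_single: float, n_parts: int) -> float:
        """Chance that at least one of n independent parts fails,
        given each part fails with probability p_single."""
        return 1 - (1 - p_single) ** n_parts

    # A one-in-ten-thousand chance per part sounds negligible...
    print(prob_at_least_one_failure(0.0001, 1))       # ~0.0001
    # ...but across 10,000 parts, some failure occurs about 63 percent of the time.
    print(prob_at_least_one_failure(0.0001, 10_000))  # ~0.63

And this assumes the failures are independent; the unexpected interactions Sagan describes make matters worse.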

 

The normal accidents theory challenges the high reliability view, claiming that accidents eventually happen. The decentralization that allows flexibility also allows individuals to do vile things without anyone stopping them. Decentralization is good in business; one crook swindling $15,000 may be an acceptable cost of decentralization for a business. But in a nuclear system, one nuclear war is not an acceptable cost.

 

Training and discipline do not make individuals 100 percent predictable. Individuals with serious psychological or character flaws cannot all be weeded out. Numerous elite members of the armed forces have committed bizarre murders and suicides.

 

Simulation and trial-and-error have biases, especially in high-cost, authoritarian environments. Errors get covered up. Scapegoats take the blame when poorly designed systems are also at fault. History gets reconstructed to serve the interests of the organization or its leaders. Situations that are unpredicted, or too dangerous to rehearse, are hard to train for.

 

Trial-and-error matters, but with nuclear weapons one catastrophic error is a sample of one too many.

Learning is difficult in tight, disciplined organizations. Politics, secrecy and threatened egos impede learning. Sagan recounts a cover-up incident: A Delta airliner strayed within 100 feet of a Continental 747. The Delta pilot said, "Nobody knows about it except us, you idiots." The Continental pilot answered, "I have passengers pounding on the door, and crying, and they saw the whole thing out the windows."

 

When industrial accidents occur, bosses hunt for operator errors until they find them or invent them. Individuals and isolated institutions have goals at odds with the goals of those they allegedly serve.

 

Redundancy often fails. Redundancy creates complexity and encourages individuals to take unjustified risks in the belief that plenty of redundancy remains to save them. The Chernobyl disaster occurred during a safety system test. The operators at Chernobyl violated four rules together--and all four violations were necessary for the disaster to happen. If someone had asked a Soviet nuclear expert in 1985 whether the sequence of events leading to Chernobyl could happen, the expert might have laughed. The Fermi nuclear reactor in Michigan nearly melted down in 1966 because of a failed safety device.
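
A toy calculation, with numbers of my own choosing rather than the book's, shows why redundant safety devices disappoint: independence multiplies small probabilities, but a single shared cause, such as a test or an operator error that disables both devices at once, can erase most of the gain.

    # Toy illustration (my numbers, not Sagan's): two redundant safety devices
    # look far safer than one only if their failures are independent.
    p_device = 0.01          # each device fails 1 percent of the time
    p_common_cause = 0.005   # chance a shared event (a test, an operator error)
                             # disables both devices at once

    # Naive estimate with independent failures: both must fail on their own.
    p_independent = p_device * p_device                                # ~0.0001

    # With a common cause, the system fails if the shared event occurs,
    # or if both devices happen to fail on their own anyway.
    p_with_common_cause = p_common_cause + (1 - p_common_cause) * p_independent

    print(p_independent)        # ~0.0001
    print(p_with_common_cause)  # ~0.0051, roughly fifty times the naive estimate

Chernobyl's test was exactly such a shared cause: one event swept several safeguards aside at once.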

 

Sagan reports mostly on narrow escapes in America. Worse may have occurred elsewhere, especially in nations with dictator-controlled media. Nuclear weapons continue to spread to poor nations, and where money is tight, safety suffers. The nuclear weapons program found in Iraq after the Gulf War featured designs that might have detonated if they fell off a desk.

 

The high reliability view, Sagan writes, sees the glass as 99 percent full. The normal accidents theorists see it as one percent empty--and when it comes to nuclear war, one percent is thousands of times too high.

 

The high reliability theory is fine in the right situations--low-probability accidents that would cause only a few deaths.

 

Sagan argues that humans learn little from constant success, yet something could be more dangerous: "the resourcefulness with which committed individuals and organizations can turn the experience of failure into the memory of success." The history of weapons accidents was written largely by those wishing to downplay mistakes. Sagan writes that the burden of proof rests with the managers of nuclear weapons, and their safety record is atrocious.

 

Many believe a world with several nuclear powers is stable. Sagan calls this view wrong. Nuclear danger is proportional to the number of weapons and the number of nations that possess them. Wars are not always started by national leaders acting in what they think is self-interest. Wars often start by accident, or are set off by individuals lower in the hierarchy. Leaders' "self-interests" often have little to do with survival. Some love destruction. Some want a place in history, no matter how infamous. Some want revenge for real or perceived slights. Some think murdering others leads to heaven. A huge list of evil motives conflicts with self-interest. Others argue for total nuclear disarmament, but Sagan claims nuclear weapons are too small for disarmament to be accurately verified, and that if a war broke out, advanced nations could quickly rebuild nuclear weapons.
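
A back-of-the-envelope sketch, assuming (my assumption, not a figure from the book) that each nuclear state runs the same small, independent annual risk of an accidental launch, shows why the number of nuclear states matters: cumulative risk grows with both the number of states and the number of years.

    # Back-of-the-envelope sketch (my assumed numbers, not Sagan's): cumulative
    # probability of at least one accidental-launch incident, if each nuclear
    # state independently runs the same small annual risk.
    def cumulative_risk(annual_p: float, n_states: int, years: int) -> float:
        """Chance of at least one incident over the given horizon."""
        return 1 - (1 - annual_p) ** (n_states * years)

    # With an assumed 0.1 percent annual risk per state:
    print(cumulative_risk(0.001, 2, 50))   # two states over 50 years  -> ~0.10
    print(cumulative_risk(0.001, 9, 50))   # nine states over 50 years -> ~0.36

For small risks the result is roughly proportional to the product of states and years, which is the sense in which more nuclear nations means more danger.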

 

The Limits of Safety recommends more studies of past problems and more independent reviews of nuclear safety. It supports greater sharing of safety findings among superpowers.

 

"[U]nanticipated interactions tend to occur in high technology production systems in which many feed-back loops exist, and in which dangerous components are in close proximity to one another... tightly coupled systems tend to have plans for very rapid reactions." Sagan therefore recommends that aircraft carrying nuclear weapons never fly above national warning systems. Nuclear warheads and missile testing facilities must be separated. Warning radar and ICBMs must not share the same sites. Radio controlled devices that would destroy missiles in flight and limited ballistic missile defenses to destroy accidental launches are also excellent ideas offered by Sagan.

 

Military forces face the "always/never" problem. Weapons must always be ready when ordered. They must never go off by accident or without authorization. Fortunately, this book has few mistakes. Highly recommended.

 

Book review article by J.T. Fournier, last updated July 24, 2009.

 

My Home Page with Links to My Other Book Reviews