Command and Control by Eric Schlosser is an entertaining and terrifying history of the control and safety (or lack thereof) of America's nuclear weapons throughout the Cold War.
My dog-eared sections, to give you a flavour...
Page 196
The usefulness of the Super [the first hydrogen bomb] wasn't the issue; the willingness to build it was. And that sort of logic would guide the nuclear arms race for the next forty years.
Page 223
On January 23, 1956, President Eisenhower recorded in his diary the results of a top secret study on what would really happen after a Soviet attack:
The United States experienced practically total economic collapse, which could not be restored to any kind of operative conditions under six months to a year. . . . Members of the Federal government were wiped out and a new government had to be improvised by the states. . . . It was calculated that something on the order of 65% of the population would require some sort of medical care, and in most instances, no opportunity whatsoever to get it.
Page 375
The BMEWS [Ballistic Missile Early Warning System] site at Thule had mistakenly identified the moon, slowly rising over Norway, as dozens of long-range missiles launched from Siberia.
Page 455
Half an hour later, a Missile Potential Hazard Team ordered them to reenter the silo. They found it full of thick, gray smoke. One of the retrorockets atop the Minuteman had fired. The reentry vehicle, containing a W-56 thermonuclear weapon, had lifted a few inches into the air, flipped over, fallen nose first from the missile, bounced off the wall, hit the second-stage engine, and landed at the bottom of the silo.
Page 530
As the minutes passed without the arrival of Soviet warheads, it became clear that the United States wasn't under attack. The cause of the false alarm was soon discovered. A technician had put the wrong tape into one of NORAD's computers. The tape was part of a training exercise — a war game that simulated a Soviet attack on the United States. The computer had transmitted realistic details of the war game to SAC headquarters, the Pentagon, and Site R.
Page 533
NORAD had dedicated lines that connected the computers inside Cheyenne Mountain to their counterparts at SAC headquarters, the Pentagon and Site R. Day and night, NORAD sent test messages to ensure that those lines were working. The test message was a warning of a missile attack — with zeros always inserted in the space showing the number of missiles that had been launched. The faulty computer chip had randomly put the number 2 in that space, suggesting that 2 missiles, 220 missiles, or 2,200 missiles had been launched. The defective chip was replaced, at a cost of forty-six cents. And a new test message was written for NORAD's dedicated lines. It did not mention any missiles.
Page 537
And as a final act of defiance, SAC demonstrated the importance of code management to the usefulness of any coded [safety] switch. The combination necessary to launch the missiles was the same at every Minuteman site: 00000000.
Page 640
An investigation later found that the missile launches spotted by the Soviet satellite were actually rays of sunlight reflected off clouds.
Page 642
When Minuteman missiles first appear above Kansas, launched from rural silos there and rising in the sky, the film conveys the mundane terror of nuclear war, the knowledge that annihilation could come at any time, in the midst of an otherwise ordinary day. People look up, see the missiles departing, realize what's about to happen, and yet are powerless to stop it. About 100 million Americans watched The Day After, roughly half of the adult population of the United States.
Page 656
After studying a wide range of "trivial events in nontrivial systems," Perrow concluded that human error wasn't responsible for these accidents. The real problem lay deeply embedded within the technological systems, and it was impossible to solve: "Our ability to organize does not match the inherent hazards of some of our organized activities." What appeared to be the rare exception, an anomaly, a one-in-a-million accident, was actually to be expected. It was normal.
Page 657
When a problem arose on an assembly line, you could stop the line until a solution was found. But in a tightly coupled system, many things occurred simultaneously — and they could prove difficult to stop. If those things also interacted with each other, it might be hard to know exactly what was happening when a problem arose, let alone know what to do about it. The complexity of such a system was bound to bring surprises. "No one dreamed that when X failed, Y would also be out of order," Perrow gave as an example, "and the two failures would interact so as to both start a fire and silence the fire alarm."
Page 661
The nuclear weapon systems that Bob Peurifoy, Bill Stevens, and Stan Spray struggled to make safer were also tightly coupled, interactive, and complex. They were prone to "common-mode failures" — one problem could swiftly lead to many others. The steady application of high temperatures to the surface of a Mark 28 bomb could disable its safety mechanisms, arm it, and then set it off. "Fixes, including safety devices, sometimes create new accidents," Charles Perrow warned, "and quite often merely allow those in charge to run the system faster, or in worse weather, or with bigger explosives."
Page 670
The only weapons in today's stockpile that trouble Peurifoy are the W-76 and W-88 warheads carried by submarine-launched Trident II missiles. The Drell panel expressed concern about these warheads more than twenty years ago.
Page 685
High-risk technologies are easily transferred across borders; but the organizational skills and safety culture necessary to manage them are more difficult to share. Nuclear weapons have gained allure as a symbol of power and a source of national pride. They also pose a grave threat to any country that possesses them.