A nuclear disaster could unfold like that Oscars fiasco

By FAYE FLAM

Talk with experts on North Korea’s nuclear arsenal, and it soon becomes clear that the biggest threat the US faces isn’t an intentional act of evil, but a confluence of stupidity and error. After all, the most frightening close calls during the Cold War started with trivial mistakes — a socket dropped from a wrench, for example, or a training tape loaded into the wrong computer.

With nine missile tests this year alone, North Korea is quickly extending the range of its missiles. The distance record belongs to the Hwasong-12, launched on May 14. It travelled about 500 miles, but on a steeply lofted trajectory that demonstrated enough power to reach more than 2,400 miles on a standard flight path.

Some experts, such as Jeffrey Lewis of the James Martin Centre for Non-proliferation Studies in Monterey, California, say that missile had an innovative design and showed evidence of real engineering competence, while others, such as German aerospace engineer Markus Schiller, aren’t so sure. Experts also disagree about how close North Korea is to being able to strike San Francisco or Washington, or whether the US should negotiate a deal to prevent this from happening. But they do tend to agree that the country’s leader, Kim Jong-un, is unlikely to launch an unprovoked attack, since the retaliation would obliterate his country.

The situation is, nevertheless, dangerous, given the possibility of error and misjudgment. “I cannot imagine any circumstance that would lead Kim Jong-un to launch an unprovoked nuclear attack on anyone,” Stanford physicist Siegfried Hecker said in a recent interview published in the Bulletin of the Atomic Scientists. But, he went on, “We can’t rule out a miscalculation or a desperate response to a crisis.”

Physicist David Wright, co-director of the Global Security Programme at the Union of Concerned Scientists, has made a similar point: “The biggest threat here seems to be that you’d get to the point where you’d have a crisis — where people do things and other people misunderstand their intentions.”

Former Rand Corp nuclear strategist Daniel Ellsberg has spent years thinking about how to avoid this sort of situation. While he’s best known for leaking the Vietnam War documents known as the Pentagon Papers, he says his current focus is preventing nuclear war. His book “The Doomsday Machine: Confessions of a Nuclear War Planner” will be released in December.

I met him at an informal lunchtime meeting at the home of physicist Ted Postol, a professor in MIT’s programme for Science, Technology and International Security. Ellsberg had a wealth of information about various kinds of crises. He’d previously studied the cascade of mistakes that led to the sinking of the Titanic. But now, history had given him a more recent example: the 2017 Academy Awards fiasco, in which a series of errors led presenter Faye Dunaway to mistakenly announce that the musical “La La Land” had won best picture.

In that debacle, as in others, a system was set up with overconfidence, and when something previously deemed unlikely did go wrong, those responsible were slow to react.

Consider the chain of events at the Oscars: PricewaterhouseCoopers (PwC), the accounting firm in charge of handing out the envelopes, made two copies of all the cards holding the names of the winners, so that employees on either side of the stage each carried a complete set. In a Huffington Post interview shortly before the event, PwC employee Brian Cullinan explained how “unlikely” a mistake would be; he and his colleague Martha Ruiz stressed that no such error had ever occurred in the history of the Oscars. But when it came time to announce the award for best picture, despite the strict protocol intended to ensure accuracy, Cullinan accidentally handed presenter Warren Beatty the duplicate envelope for the best actress award.

Similarly, the odds were low back in 1980 that an airman working at a missile silo near Damascus, Arkansas, would drop a socket from a wrench at just the wrong moment, letting it fall 80ft and pierce the fuel tank of a Titan II nuclear missile. The subsequent explosion killed another airman and hurled the missile’s nuclear warhead out of its underground silo; it landed on a nearby roadside. Fortunately, it didn’t detonate.

That close call formed the centrepiece of Eric Schlosser’s book “Command and Control”, recently made into a documentary. Schlosser summarised other close calls and present dangers in a New Yorker story published earlier this year. We’re still here because people with limited time and information made good decisions, but, as Ellsberg observed, the Academy Awards fiasco shows how easily things could go otherwise.

The Oscars mix-up also highlighted the element of time, he said. The problem wasn’t just that the wrong movie was announced for best picture, but that it took the people who had screwed up so long to do anything about it. The second producer of the wrong movie, “La La Land”, was already starting his celebratory speech before anyone onstage was alerted that something was wrong.

People aren’t necessarily good at doing the right thing in short windows of time. But the US and Russian nuclear arsenals are set up so that world leaders may have as little as six minutes to decide whether a threat is real and launch a retaliatory strike. And false alarms are a real threat. In 1979, for example, someone loaded a training tape simulating a Soviet attack into the wrong computer at the North American Aerospace Defense Command’s headquarters in Colorado. Missile crews in the Midwest were alerted to a massive Soviet attack, and had minutes to decide whether to assume it was real and attempt to minimise the damage with retaliatory strikes.

Similarly, in 1983, the Soviets got a signal from their satellites that the US had launched five nuclear missiles their way. It turned out to be the result of sunlight reflecting off clouds. And in 1995, the Russian early-warning system was mistakenly triggered again — this time by a scientific experiment launched from an island off Norway.

Add the tension of a critical situation, and people are likely to assume the worst. During the Cuban Missile Crisis, for example, the commander of a Soviet submarine thought a nuclear war had begun and ordered his crew to fire a nuclear-tipped torpedo. He was talked out of it just in time by another officer.

Maybe Brian Cullinan was right that the Oscars mistake was unlikely. The insidious thing is that while each individual error is astronomically unlikely, there are so many ways for a crisis to unfold. There are no smart ways to start a nuclear war, but there is an infinite variety of stupid ones. — Bloomberg

  • This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.