Early on in the Manhattan Project, the scientists taking part knew that they were pursuing a weapon that could give humankind the unprecedented ability to destroy itself. What they didn't know, however, was how this destruction might occur.
In 1942, Hungarian-American physicist Edward Teller, now known as "the father of the hydrogen bomb," entertained a devastating nightmare scenario: that an atomic bomb could ignite the atmosphere and the oceans. He reasoned that a nuclear fission bomb might create temperatures so extreme that hydrogen atoms in the air and water would fuse into helium, just as in our sun, generating a runaway reaction that would eventually engulf the globe, extinguishing all life and turning the Earth into a miniature star.
When Teller informed some of his colleagues of this possibility, he was greeted with both skepticism and fear. Hans Bethe immediately dismissed the idea, but according to author Pearl Buck, Nobel Prize-winning physicist Arthur Compton was so concerned that he told Robert Oppenheimer that if there were even the slightest chance of this "ultimate catastrophe" playing out, all work on the bomb should stop.
So a study was commissioned to explore the matter in detail, and six months before the Trinity test, the very first detonation of a nuclear device, Edward Teller and Emil Konopinski announced their findings in a report with the ominous title "Ignition of the Atmosphere With Nuclear Bombs."
"It is shown that, whatever the temperature to which a section of the atmosphere may be heated, no self-propagating chain of nuclear reactions is likely to be started. The energy losses to radiation always overcompensate the gains due to the reactions."
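The reasoning behind that quoted conclusion can be sketched as a simple energy balance (this is an illustrative schematic with made-up symbols, not the report's actual derivation): a self-propagating burn would require the rate of energy gained from fusion reactions in a heated region of air to exceed the rate at which that region loses energy as radiation, and the calculation found the inequality runs the other way at every temperature.

```latex
% Schematic criterion for a self-propagating thermonuclear burn
% (illustrative notation, not the symbols used in the report).
% Ignition would require, at some temperature T,
%   \epsilon_{\text{fusion}}(T) > \epsilon_{\text{rad}}(T),
% but the report concluded that for air at any attainable T,
\epsilon_{\text{fusion}}(T) \;<\; \epsilon_{\text{rad}}(T)
% so a heated region radiates away its energy and cools
% before a chain of fusion reactions can spread.
```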
As more than 2,000 nuclear detonations have thankfully since demonstrated, Teller and Konopinski's conclusion appears to be correct.