Our Use Of Nuclear Weapons 76 Years Ago Was A Moral And Strategic Imperative
Henry I. Miller
Americans are no strangers to times that “try men’s souls,” to borrow a phrase from Thomas Paine. By mid-1945, we had been at war for three-and-a-half years, enduring the draft, mounting casualties, and rationing, with no end in sight. Many Americans were weary, not unlike how we feel now, after a year and a half of privations and anguish related to the COVID-19 pandemic.
That sense of anxiety got me thinking about how WWII was suddenly – and to many, unexpectedly – resolved. Today marks one of the United States’ most important anniversaries, memorable not only for what happened on this date in 1945 but for what did not happen.
What did happen was that the Enola Gay, an American B-29 Superfortress bomber, dropped Little Boy, a uranium-based atomic bomb, on the Japanese city of Hiroshima. That historic act hastened the end of World War II, which concluded just over a week later, after the Aug. 9 detonation of Fat Man, a plutonium-based bomb, over Nagasaki. These were the only two nuclear weapons ever used in warfare.
I have two peripheral connections to those events. The first is that when Little Boy was dropped on Hiroshima, my father, a sergeant in the U.S. Army infantry who had fought in the Italian campaigns of WWII, was on a troopship, expecting to be deployed to the Pacific theater of operations. Neither he nor his fellow soldiers relished the prospect of participating in the impending invasion of the Japanese main islands. When the Japanese surrendered (on Aug. 14), the ship headed, instead, for Virginia, where the division was disbanded. (I was born two years later.)
My second connection was that during the 1960s, three of my M.I.T. physics professors had participated, two decades earlier, in the Manhattan Project, the military research program that developed the atomic bombs during the war. In class, one of these professors recalled that, after the first test explosion (code-named Trinity), he was assigned to drive Maj. Gen. Leslie Groves, the director of the project, to view the result. They arrived to find a crater 1,000 feet in diameter and six feet deep, with the desert sand inside turned into glass by the intense heat. Groves’ response? “Is that all?”
Approximately 66,000 people are thought to have died in Hiroshima from the acute effects of the Little Boy bomb, and about 39,000 in Nagasaki from the Fat Man device. In addition, there was a significant subsequent death toll from the effects of radiation and wounds.
Shortly thereafter, the second-guessing began: “Was it really necessary?” Monday-morning quarterbacks questioned the morality and military necessity of using nuclear weapons on Japanese cities. Even nuclear physicist Leo Szilard, who, in 1939, had written the letter for Albert Einstein‘s signature that resulted in the formation of the Manhattan Project, characterized the use of the bombs as “one of the greatest blunders of history.” Since then, there have been similar periodic eruptions of revisionism, uninformed speculation, and political correctness.
The historical context and military realities of 1945 are often forgotten when judging whether it was “necessary” for the United States to use nuclear weapons. The Japanese had been the aggressors, launching the war with a sneak attack on Pearl Harbor in 1941, and systematically and flagrantly violating various international agreements and norms through biological and chemical warfare, the torture and murder of prisoners of war, and the brutalization of civilians, including forcing them into prostitution and slave labor.
Leaving aside whether our enemy “deserved” to be attacked with the most fearsome weapons ever employed, skeptics are also quick to overlook the “humanitarian” and strategic aspects of the decision to use them.
Operation Downfall Meets A Fork In The Road
As a result of the bombing of Hiroshima and Nagasaki, what did not need to happen was “Operation Downfall” – the massive Allied (largely American) invasion of the Japanese home islands that was being actively planned. As Allied forces closed in on the main islands, the strategies of Japan’s senior military leaders ranged from “fighting to the last man” to inflicting heavy enough losses on invading American ground forces that the United States would be forced to agree to a conditional peace. Operation Downfall was designed, in large part, because U.S. strategists knew (from having broken the Japanese military and diplomatic codes) that there was virtually no inclination on the part of the Japanese to surrender unconditionally.
Moreover, because the Allied military planners assumed that “operations in this area will be opposed not only by the available organized military forces of the Empire [of Japan], but also by a fanatically hostile population,” astronomical casualties were thought to be inevitable. The losses between February and June 1945, just from the Allied invasions of the Japanese-held islands of Iwo Jima and Okinawa, were staggering: 18,000 dead and 78,000 wounded. That harrowing experience was taken into account in planning the final invasion.
A retired Marine four-star general, with whom I recently discussed this retrospective, told me this: “[F]ollowing Okinawa and Iwo Jima, all six Marine divisions were being refitted for the attack on the home islands. None of the divisions had post-assault missions, because the casualty estimates were so high that they would initially be combat inoperable until they were again remanned and refitted. Basically, the Marines were to land six divisions abreast on Honshu, then the Army would pass through for the big fight on the plains inland.” (Note: a division has approximately 23,000 Marines.)
He went on: “What made this different was unlike the Pacific campaign to date, other than Guadalcanal back in 1942, this would be the first time the Japanese could reinforce their units. After Guadalcanal, in the fights across the ocean, the U.S. Navy isolated the objectives so the Japanese could not reinforce. The home islands would be a different sort of fight, hence the anticipated heavy casualties.”
A study performed by physicist (and future Nobel laureate) William Shockley for the War Department in 1945 estimated that the invasion of Japan would have cost 1.7 to 4 million American casualties, including 400,000 to 800,000 fatalities, and 5 to 10 million Japanese deaths. These fatality estimates were, of course, in addition to the members of the military who had already perished during almost four long years of war; American deaths already numbered about 292,000. The implications of those numbers are staggering: The invasion of Japan could have resulted in the death of more than twice as many Americans as had been killed in the European and Pacific theaters of WWII up to that time!
The legacy of Little Boy and Fat Man was to reshape the course of history: a swift end to the war rendered Operation Downfall unnecessary.
Mounting Casualties, Non-Military And Military Alike
Over the past half-century, much has been made of the moral boundary breached by the use of nuclear weapons, but many military historians regard as far more significant the earlier wartime decisions to adopt widespread urban bombing of civilians. That threshold was first crossed by Hitler, who attacked British cities in 1940 and 1941; the practice was later adopted by the Allied forces, resulting in the devastation of major cities such as Dresden, Hamburg, and Tokyo.
Previously, the bombing had focused primarily on military objectives, such as airfields, munitions factories, and oil fields, or on critical transportation links, such as train stations and tracks, bridges, and highways. Never before had non-military targets been attacked on such a scale in order to degrade the morale of the populace. In one instance, over 100,000 people were killed in a single night of firebombing of Tokyo, March 9-10, 1945 (many bodies were incinerated and never recovered); more than 22,000 died in Dresden, February 13-15, 1945; and about 20,000 in Hamburg in July 1943.
In an email to me, my former colleague, the historian Victor Davis Hanson, called attention to two factors that made the case for the use of America’s nuclear weapons. First, “thousands of Asians and allied prisoners were dying daily throughout the still-occupied Japanese Empire, and would do so as long as Japan was able to pursue the war.” Second, “Maj. Gen. Curtis LeMay [who was in charge of all strategic air operations against the Japanese home islands] planned to move forces from the Marianas to newly conquered and much closer Okinawa, and the B-29 bombers, likely augmented by European bomber transfers after V-E Day, would have created a gargantuan fire-bombing air force that, with short-distance missions, would have done far more damage than the two nuclear bombs.”
In fact, the most destructive bombing raid of the war, and in the history of warfare, was the nighttime fire-bombing of Tokyo on March 9-10, 1945. In a three-hour period, the main bombing force dropped 1,665 tons of incendiary bombs, which caused a firestorm that not only killed some 100,000 civilians, but also destroyed a quarter of a million buildings and incinerated 16 square miles of the city. Tokyo was not the only target: By the end of the war, incendiaries dropped by LeMay’s bombers had totally or partially consumed 63 Japanese cities, killing half a million people and leaving 8 million homeless.
The WWII casualty statistics are numbing, and bring to mind a saying often attributed to Joseph Stalin: “A single death is a tragedy; a million deaths is a statistic.” We should not forget that many of the dead were non-combatant innocents. Equally staggering are the numbers lost in active combat.
During World War I, Europe lost most of an entire generation of young men; combatant fatalities alone were approximately 13 million. Memories of that calamity were still fresh three decades later. In 1945, as they deliberated, Allied military planners and political leaders were correct, both strategically and morally, in not wanting to repeat that history. (And Truman, who had succeeded Franklin D. Roosevelt upon the latter’s death only four months earlier, would not have wanted his legacy to include causing the unnecessary death of hundreds of thousands of American servicemen.) It was their duty to weigh carefully the costs and benefits for the American people, present and future. Had they been less wise or courageous, my generation of post-war baby boomers would have been much smaller.
These kinds of decisions are truly the stuff of history, writ large, but governments perform such balancing acts all the time. We are seeing that today in the creation of policies to manage the COVID-19 pandemic. How, for example, do we balance the very real, and high, costs of lockdowns – businesses obliterated, livelihoods lost, children deprived of education and social stimuli – against the rising deaths, the persistent and debilitating post-“recovery” effects, and the other spinoff health impacts of COVID-19 infections if reopening occurs too rapidly?
As Truman found in deciding whether to use Little Boy and Fat Man, sometimes you need to choose the least bad of the alternatives. To decide which of them will turn out to be best in the long term, you need to value data over ideology, politics, or vox populi.
I pity the decision-makers.
Note: An earlier version of this appeared in Human Events.
Henry I. Miller, a physician and molecular biologist, was a research associate at the National Institutes of Health and the founding director of the FDA’s Office of Biotechnology. You can find him online or on Twitter at @henryimiller, and read more of his writing at henrymillermd.org.