The perfect genie


Dr Adam Broinowski

How was the unprecedented destructive force of atomic weapons transformed into a ‘force for good’? The strange story of nuclear power portrayed as a perfect ‘genie’ offers a useful example of how modern societies, and the United States in particular, have applied the idea of utopia. It illustrates how dreams of an eternal supply of energy and the improvement of living conditions have been used to concentrate intellectual and technical knowledge and economic resources in the pursuit of nuclear power.

The first successful human-made self-sustaining nuclear chain reaction was achieved by a team of scientists led by the Italian-born physicist Enrico Fermi in the world’s first nuclear reactor, the Chicago Pile-1, at the University of Chicago on 2 December 1942. On 16 July 1945, a weaponised atomic explosion known as the Trinity test was conducted on the arid plains of the White Sands Missile Range in southern New Mexico, demonstrating the United States’ sole possession of this destructive force. United States leaders then decided to use such atomic bombs on the cities of Hiroshima and Nagasaki on 6 and 9 August 1945.

In his radio report to the American people on the Potsdam Conference on 9 August 1945, President Harry S Truman declared that the ‘secret’ of this seemingly infinite and awesome force had been entrusted to the United States, apparently giving it the exclusive right to manage this force on behalf of humankind.

Truman said:

“It is an awful responsibility which has come to us. We thank God that it has come to us instead of to our enemies; and we pray that He may guide us to use it in His ways and for His purposes.”

At this moment, the United States was at its peak in terms of relative wealth and military power. However, in the same speech Truman recognised that what was done to Japan was only “a small fraction of what would happen to the world in a third World War” and was emphatic that no such ravages should be suffered in future by the people of the United States. Anticipating that the public would be anxious about a future war with atomic weapons, and the likelihood of them being used on American cities, he disclosed that a committee led by Secretary of State James Byrnes had already laid plans to control the use of the atomic weapon to ensure “the protection of US interests and those of world peace”. The United States Government thus faced the problem of how to mobilise public support to pay for an extensive nuclear weapons arsenal and its planned foreign military bases as part of the new National Security State.

At the same time, United States officials were reluctant to thoroughly address the darker side of America’s newly demonstrated capacity: was it legal, conscionable and necessary to have used atomic weapons as part of area bombing campaigns on primarily civilian-occupied cities in the closing stages of the war? Aside from the notable work of a few journalists and commentators in the immediate aftermath and early post-war years, this sort of reckoning, which became ‘taboo’ in Japan during the United States-led occupation, was seriously neglected in the United States and in other countries. Similarly, the Japanese Government during the occupation was reluctant to examine more deeply the atrocities and crimes against humanity committed by Imperial Japanese military forces during the Asia-Pacific War (1931–1945).

In a complex Cold War environment, the United States Government decided to refocus Americans on a future supported by an apparently more peaceful or brighter side of atomic technology. The Atomic Energy Act of 1946 (McMahon Act) gave the United States Atomic Energy Commission (AEC) monopoly control over the country’s developments in the field of atomic energy.

One of the AEC’s missions was to develop a public relations campaign to replace the negative image of atomic weapons as destructive (the ‘bad atom’) with a positive image (the ‘good atom’): that is, to transform negative images of atomic destruction into positive images of a potentially infinite source of atomic energy and, in doing so, to assume in the eyes of the American people and others the position of rightful inheritor of world leadership and navigator of humankind towards some kind of prelapsarian grace. This project revealed American magical thinking at work. Invoking American cultural and moral traditions of pragmatism, individualism and industriousness, the United States framed nuclear technology as a privileged responsibility and an unprecedented opportunity for American scientists to lead the way in creating a utopian society, first in America and then around the world. For the war generation, who had suffered deprivation and sacrifice, what could be worth more than the final arrival in an earthly paradise of peace and abundance under careful and diligent scientific guidance?

United States Government-sponsored institutions and the cultural industry went into overdrive. The Atomic Energy Commission’s Atoms for Peace Program, launched by President Dwight D Eisenhower at the United Nations General Assembly on 8 December 1953, continued to frame nuclear technology in the binary of divine darkness and light:

“Occasional pages of history do record the faces of the ‘great destroyers’, but the whole book of history reveals mankind’s never-ending quest for peace and mankind’s God-given capacity to build … so my country’s purpose is to help us move out of the dark chamber of horrors into the light, to find a way by which the minds of men, the hopes of men, the souls of men everywhere, can move forward towards peace and happiness and well-being … salvation cannot be attained by one dramatic act … many steps will have to be taken.”

In 1954, Eisenhower signed a revision to the McMahon Act that allowed the commercial development of nuclear technology, aiming to make it more competitive with oil and coal-fired electricity production. America’s most influential scientific bodies and major United States broadcast and print outlets, including Time, Newsweek, Collier’s, Life and The Saturday Evening Post, adopted a ‘town hall’ promotion campaign. In the 9 August 1955 edition of Look magazine, author David O Woodbury described his generation as living “between Hell and Utopia”, in which the “human mind shuttled between doom and dreams of bounty” and the “very force that can destroy the human race could create miracles of hitherto unimagined possibility”.

To fuel the campaign, promoters drew from an American origin story in which a seventeenth-century European Puritan community was founded by John Winthrop to resemble a New Jerusalem, “a city upon a hill, the eyes of all people are upon us”. Yearning to be free of the vicissitudes of famine, plague and wars of the Old World, this European settler population believed they were following their manifest destiny as a chosen people to discover an earthly paradise. Projecting this quasi-mystical ideal onto their newly discovered vast and ‘empty’ wilderness, they set about taming and cultivating this pre-occupied land to create a New World model they thought would be exceptional.

Traces of this potent mythopoeia could be seen in mainstream American cultural discourse on atomic power in the deployment of the ‘good atom’ campaign during the early Cold War, as historian Paul S Boyer has noted. For the Disney generation – who grew up in the early years of the Atomic Age – family-friendly films, advertising, popular stories and political rhetoric promised the imminence of an idyllic world of techno-scientific progress. With the newly acquired capacity to harness the apparently quiet, clean, cheap and peaceful ‘genie’ of atomic energy – much like the animal, human and coal-fired energy harnessed in the past – there seemed no limit to what the atom could do. In iconic representations such as A is for Atom (directed by Carl Urbano, 1953) and Our Friend the Atom (directed by Hamilton Luske, 1957), rational scientists and engineers calmly demonstrated the multiple roles the genie could perform – “warrior, engineer, farmer, healer” – so as to ensure the further development and progress of the nation through the strength, authority and unlimited power of nuclear technology.

The primary goal of this campaign was to continue to increase United States military power and influence in the world. As the Soviet Union, which had successfully tested an atomic bomb on 29 August 1949 and also promised peaceful uses of nuclear power, was considered a potential rival, the American public relations campaign projected a vision of incandescent nuclear-powered cities radiating out from America across the surface of the globe.

Following the launch of the world’s first nuclear-powered submarine, USS Nautilus, by the United States in 1954, Atomic Energy Commission chairman Admiral Lewis L Strauss promised that compact and portable commercial nuclear power stations would produce electricity so inexpensive as to be “too cheap to meter”. Meanwhile, Eisenhower’s program included a world atomic bank to supply client countries, particularly ‘developing’ countries, with fuel for atomic reactors.

For many in North America, this was a time of technological fantasies big and small. These included atomic-powered cars, aeroplanes and ships to overcome long distances, controlled use of nuclear bombs for excavating mountains, glaciers and canals, medical research using nuclear isotopic tracers, metallurgical engineering applications to detect invisible flaws in products, radio-genetics to increase crop yield, nuclear energy supply to desalinate water and irrigate remote locations, and self-sustaining nuclear-powered human colonies in space by the year 2000.

Amid such excitement, it is ironic but not so surprising that a majority of the public in Japan also seemed persuaded by such utopian projections involving nuclear technology. At one point, the city of Hiroshima was marked for Japan’s first nuclear reactor. In a feat of social engineering by teams of United States ‘Japan hands’ (officials working for the United States Embassy and various intelligence agencies) and Japanese political and media industry leaders (some of whom founded the Liberal Democratic Party), the nation that had been exposed to atomic weapons in wartime became one of the earliest and leading converts to nuclear energy.

Through large-scale government-sponsored exhibitions and newspaper promotional campaigns, the idea that Japan had been hampered by comparatively inferior technological capacity and energy insecurity, which had led to its bitter defeat in World War II, was also used to justify the nation’s turn to atomic energy. In the same binary, the Japanese public learned to separate the military and civilian uses of the atom.

As several researchers, including Ran Zwigenberg and Ryan Holmberg, have noted, as Japan’s nuclear industry took shape in 1955, government-sponsored advertisements, manga, films and educational pamphlets – including those by the ‘grandfather of manga’, Tezuka Osamu – played down the destructive aspect of atomic weapons while promoting the marvels of atomic science and nuclear engineering.

Although voices speaking out against nuclear testing were gaining momentum in Japan through the 1950s and 1960s, a ‘nuclear village’ of industry leaders, government officials, scientists and academics had already embedded a pro-nuclear narrative. According to this narrative, the economic costs to the taxpayer and the environmental impacts were worthwhile to retain a degree of energy security in a resource-poor nation and to secure and sustain the consumerist lifestyle enjoyed by ‘first-world’ nations. Ultimately, 54 nuclear reactors were built on narrow, seismically and volcanically active islands regularly exposed to tsunami.

Projected priorities of staying cool in summer and keeping the trains running on time seemed to assuage public concern about possible nuclear disasters. Such disasters were already occurring, however, with many kept secret and their damage minimised. Three Mile Island (Pennsylvania, 1979), Chernobyl (Kiev Oblast, 1986) and Fukushima Daiichi (Fukushima Prefecture, 2011) were considered by the World Nuclear Association in May 2018 to be the only major nuclear accidents across “17,000 cumulative reactor-years of operation in 33 countries”. But in 1952 the nuclear meltdown at Chalk River, Ontario, released 100,000 curies – far more than the 15 curies released at Three Mile Island. There were three major accidents in 1957 – at Rocky Flats (Colorado), Windscale (Cumbria) and Chelyabinsk (Chelyabinsk Oblast) – which contaminated large areas occupied by civilian populations with long-lived radionuclides. A nuclear meltdown at the Santa Susana Field Laboratory in California in 1959, which vented radioactive gases, was kept secret for 20 years. A spill of uranium mill tailings at Church Rock, New Mexico, in 1979 contaminated local rivers and was considered the worst incident of radioactive contamination in United States history. An explosion in Tomsk, Russia, in 1993 irradiated villages in Siberia. In Fukui Prefecture, Japan, the Monju accident in 1995 caused a sodium fire, and the Tokai-mura criticality accident in Ibaraki Prefecture in 1999 killed two workers and exposed a densely populated area to neutron radiation.

These are only some of the significant radiological events in which plutonium and other radionuclides have been released into the earth system. Nuclear weapons testing, the routine venting of contaminated water and gases from hundreds of operating reactors, ocean, river and ground releases of radioactive wastes, and leakage from the (temporary) storage of various grades of nuclear waste have all added to the nuclear burden.

Aiming and working toward a perfectly comfortable lifestyle for all, and for future generations, is an admirable and achievable pursuit. But it is unwise to become overly dependent on a highly polluting energy technology that is propped up by well-funded publicity and has proven to be neither failsafe nor cheap. This is worth bearing in mind regarding the Australian Government’s serious consideration of the proposal to “establish used nuclear fuel and intermediate level waste storage and disposal facilities in South Australia” alongside its obligations to store and manage locally generated nuclear waste as set out in the National Radioactive Waste Management Act 2012.

Rather than investing in campaigns to promote nuclear power as the ‘perfect genie’, it would be of greater benefit to all, now and in the future, to properly invest in the safest, least costly (as measured across all factors), least polluting and most reliable energy systems that we as a species can muster. Unlike the public relations alchemy that magically turned the ‘bad atom’ into the ‘good’, this transformation would be worth having.


Dr Adam Broinowski is a lecturer and visiting research fellow in the Department of Pacific and Asian History at the School of Culture, History and Language, ANU College of Asia and the Pacific. He was Chief Investigator for an Australian Research Council Discovery Early Career Researcher Award project which examined the social and cultural responses to the Fukushima Daiichi nuclear disaster in Japan in the context of radiological events since 1945. He is the author of the monograph Cultural Responses to Occupation in Japan: The Performing Body during and after the Cold War (2016) based on his work with a leading Japanese theatre company and subsequent doctoral research. He is currently working on transnational contemporary and historical discourses on nuclear power and climate disruption.

Header Image: Milliseconds after detonation. Photo: Dr Harold Edgerton, Rapatronic Camera picture, nuclear explosion. Federal Government of the United States [public domain].