Mad Science: The Nuclear Power Experiment




  © 2012 Joseph Mangano

  Published by OR Books, New York and London

  Visit our website at www.orbooks.com

  First printing 2012

All rights reserved. No part of this book may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher, except brief passages for review purposes.

  Cataloging-in-Publication data is available from the Library of Congress.

  A catalog record for this book is available from the British Library.

  ISBN 978-1-935928-85-0 paperback

  ISBN 978-1-935928-86-7 e-book

  Typeset by Lapiz Digital, Chennai, India.

  Printed by BookMobile in the United States and CPI Books Ltd in the United Kingdom.

  The U.S. printed edition of this book comes on Forest Stewardship Council-certified, 30% recycled paper. The printer, BookMobile, is 100% wind-powered.

  Contents

  Prologue by Alec Baldwin

  Nuclear’s Rise and Fall

  Tiny Atoms, Huge Risks

  Soothing Big Bang Fears

  Big Meltdown in Hollywood’s Backyard

  Evidence Trumps the Big Lie

  Secrecy Cracks Reveal Carnage

  Danger Now, Danger Tomorrow, Danger Forever

  Trouble in Atomic Paradise

  Defibrillating a Corpse

  Red-Hot Legacy

  Appendix

  A Note from the Author and Acknowledgments

  References

  Prologue

I grew up in a household with five siblings, and my father was a schoolteacher back in the 1960s and ’70s.

I remember my mother, who seemed to fret over money at every moment, would stress out about our utility bill. During the winter, she turned the heat down to fifty-five at night and constantly admonished us to turn off the lights. After sunset, the lights in our house were on strictly on an “as needed” basis. We had to use power carefully, and that was in an age with far fewer of the electronic gizmos and power demands we live with today. Most homes had one television. No VCR, no Xbox, no computers, no cell phones. My mother whined over any electric bill higher than thirty dollars. From the beginning, energy meant cost.

So companies schemed about how to lower that cost. Nuclear power was presented not as an answer, but as the answer. On Long Island in the early 1970s, the local utility, the Long Island Lighting Company (LILCO), joined the tide of utilities applying for and constructing nuclear reactors to address the country’s dwindling domestic oil production and a looming US dependence on foreign oil. Construction of the Shoreham Nuclear Power Plant began in 1973, and LILCO estimated that it would cost $75 million. By the time it was completed, in 1984, the cost to build the facility, which would be passed on to LILCO ratepayers (who were already paying among the highest rates in the continental US), was $2 billion.

In addition to the mismanagement and usurious rates borne by Long Islanders, Shoreham was deemed inoperable due to an ineffective evacuation plan, one that would have forced Long Island’s large residential population to bottleneck through New York City area roads, tunnels, and bridges in case of a catastrophic event involving the reactor. In 1989, Shoreham was closed in an agreement with New York State officials that passed nearly all of the then-$6 billion cost of closing, decommissioning, and decontaminating the facility on to LILCO customers, who thereby officially became the highest commercial utility ratepayers in US history.

The numbing debacle that is the story of Shoreham, and the nearly criminal enterprise that launched it (exiting LILCO executives paid themselves multimillion-dollar bonuses in the wake of the fiasco), presents only one side of the sad and demoralizing history of the nuclear power industry that I have come to know over the past twenty years. Over time, I became involved with other community organizations in applying the “Shoreham Principle” to the closing of the research reactor at Brookhaven Lab. My association with various public interest groups has introduced me to rhabdomyosarcoma clusters near where I live, to the arrogant and oft-penalized management of the Millstone nuclear facility in Waterford, CT (recipient of the largest government fine in history: $2.1 million), and to the Oyster Creek facility in Toms River, New Jersey, where soft tissue cancers and autism rates are significantly higher than the national average.

Along the way, I have met many concerned and dedicated activists as well as scientists: Dr. Ernest Sternglass, Dr. Helen Caldicott, Jay Gould, Scott Cullen, and Richard Webster, to mention just a few. My neighbor Christie Brinkley did yeoman’s work on the Oyster Creek public information programs that we worked on. However, the real thread in all of this – the constant, tireless voice that has kept me linked to reports from the front lines of the battle to expose the staggering risks to public health posed by nuclear power – has been Joe Mangano.

I’ll let Joe give you the facts and his analysis of them. I simply want to state that Joe has been one of the most dedicated, intelligent, and evenhanded public activists I have ever known. He has also proven to be one of the most effective, on a battlefield where gains are measured in inches, if not centimeters. If not for Joe’s efforts, as well as those of quite a few other concerned citizens, Oyster Creek might have remained open indefinitely, spewing its special cocktail of nuclear byproducts all over the Ocean County, New Jersey, community and adding to the stockpile of waste that we still don’t know what to do with. As of now, the plant is scheduled to close in 2019, ten years earlier than its operator, Exelon, had originally sought in its extension applications. Part of that is Joe.

  Read this book and learn in a few hours what it took me years to cobble together from knowing this great public servant.

  —Alec Baldwin

  New York City

April 2012

  Nuclear’s Rise and Fall

On a warm California morning – July 12, 1959 – just outside Los Angeles, workers at the experimental nuclear reactors at the Santa Susana Field Laboratory reported for duty. Those who worked on the Sodium Reactor Experiment, one of ten reactors at the lab, were eager to begin the fourteenth in a series of test runs of the futuristic machine. The experiment cooled the uranium fuel in the reactor core with liquid sodium rather than water, the standard method. Hopes were high that this design would revolutionize atomic power reactors in the US, and perhaps around the world.

There had been problems with the first thirteen runs over the preceding two years; the most recent had ended in an explosion. The inexperienced workers tried to find its cause but were unsuccessful. In retrospect, leaks had probably compromised the sodium’s ability to cool the reactor – a crucial aspect of running it safely. But instead of taking the prudent path of stopping or postponing the test series, managers kept Test 14 right on schedule.

Almost immediately after work began, something went badly wrong. The sodium was not cooling the reactor core, which heated up to levels never seen before at Santa Susana – or perhaps anywhere in the US. Inside the core were long fuel rods containing the uranium atoms that were split to power the reactor, along with dozens of dangerous radioactive chemicals formed as waste products of that splitting. The rods began to melt, and a large volume of radioactive gas built up in the reactor room. Instead of shutting down the reactor and investigating the cause, technicians inserted additional rods designed to control the process, but these only made the situation worse.

The next two weeks brought a series of restarts and further meltdowns, like repeatedly banging one’s head against a brick wall. Why didn’t managers shut down the Sodium Reactor Experiment? The answer was one common to the US atomic power program: an insistence that “the show must go on” in spite of any dangers. Every day, radioactive gases from holding tanks in the reactor building were released into the air – often at night, a highly toxic job given to workers on the “graveyard shift.” Finally, two weeks to the day after the run started, the reactor was shut down. The amount of radiation released into the environment was never accurately measured; in fact, the entire meltdown was kept secret from the public for the next two decades. Sodium-cooled reactors failed: of the 439 reactors operating worldwide today, only two are sodium cooled (none in the US). But the damage had been done, and today the Santa Susana site sits amidst a large mess of toxic waste.

The meltdown at Santa Susana is a microcosm of the entire US nuclear power program. It began with great enthusiasm for what the technology could do – so much enthusiasm that when problems began piling up, those in charge kept the program moving, convinced that the atom would be the answer to America’s future energy problems. But it wasn’t then, still isn’t, and never will be. The American nuclear power program is a failure. Why did nukes fare so poorly? Why were they allowed to cause such damage, and why does the debate still continue? How a technology with such promise was allowed to pose such huge threats, and to ring up a staggeringly high tab in the process, can only be understood by examining its origins.

On a December day in 1953, President Dwight D. Eisenhower stood before the United Nations to deliver a critical speech on nuclear power. Eisenhower had been president less than a year but had already become a crucial figure in the Cold War between capitalist and communist nations. Perhaps his most important achievement was helping broker an end to the Korean War, which permanently split that nation into two countries, the communist north and the capitalist south. Just one month after Eisenhower took office, Soviet dictator Joseph Stalin died, bringing the Cold War to another phase, one that offered more hope for reconciliation.

Eisenhower was a complex figure. A military man by training and experience and a graduate of West Point, he had served as a junior officer in the First World War and as supreme commander of all Allied troops in Europe during the Second, which elevated him to heroic status. Some worried about a soldier occupying the White House during this time. The military mindset in the early years of the Cold War rested on a strict commitment to maintaining American military superiority. That superiority extended to nuclear weapons; the US had produced and used the first bombs, and many in the military community believed they should be used freely, despite the horrors at Hiroshima and Nagasaki.

Just four years after Hiroshima and Nagasaki, the Soviet Union developed and successfully detonated its own bomb. In late 1952, the US exploded a thermonuclear (hydrogen) device with explosive power 1,000 times greater than an atomic weapon’s. But just nine months later, the Soviets exploded their own super-bomb. The US had lost its monopoly but maintained superiority in the numbers game: in the race to test and manufacture as many bombs as possible, as quickly as possible, an all-out effort gave America the lead. By the end of 1953, the US had tested forty-four bombs to the Soviet total of just three. The US had amassed about 1,000 nuclear weapons, while the Soviets had just a handful. And the United Kingdom, allied with America, had also begun testing and stockpiling nuclear weapons. A number of military leaders believed that nuclear war was inevitable, and even that it should proceed while the US held such superior numbers.

The modest sense of security that a monopoly on atomic weapons had given people was ebbing, even though the US maintained its lead in the weapons count. The image of a hostile communist regime exploding an atomic bomb and then building a stockpile of these weapons was disturbing to many Americans. Scientists began to look beyond current numbers and envision a fearful situation in which each nation could destroy the other through attacks that could not be defended against. A group at the University of Chicago instituted the “Doomsday Clock” to measure proximity to global disaster. Its initial 1947 setting of seven minutes to midnight had been moved to two minutes by 1953 as the arms race heated up.

Had Eisenhower subscribed to the military model, he would have taken the aggressive stance that it was possible, even desirable, to win an inevitable nuclear war. Part of his policy fit this model. He continued Harry Truman’s program of developing a large arsenal of nuclear weapons with the utmost speed. A series of eleven aboveground atomic bomb explosions had taken place in Nevada during the spring of 1953, and similar programs would follow in 1955, 1957, and 1958. Another series of shots was being planned for the Marshall Islands in the central Pacific, including what would be a hydrogen bomb with an equivalent yield of 1,000 Hiroshimas. During Eisenhower’s eight years in office, the number of US nuclear weapons grew from 1,000 to 20,000.

But the new President was a complex man, not easily categorized, who deviated from a completely hardline policy. He had been shaken by the carnage of the recently ended World War, and had opposed Truman’s use of nuclear weapons on the mostly civilian targets of Hiroshima and Nagasaki. As he later recalled:

  In 1945 Secretary of War Stimson, visiting my headquarters in Germany, informed me that our government was preparing to drop an atomic bomb on Japan. I was one of those who felt that there were a number of cogent reasons to question the wisdom of such an act… first on the basis of my belief that Japan was already defeated… and secondly because I thought that our country should avoid shocking world opinion by the use of a weapon whose employment was, I thought, no longer mandatory as a measure to save American lives.

  He also understood the growing public fear over the nuclear arms race. So Eisenhower’s mission to the U.N. that December day was to soothe fears. He could not get around the fact that the atom represented a devastating power the world had never seen, nor could he deny that a race for nuclear superiority between two hostile nations was under way and was gathering momentum. However, he could inform the public that there were uses of this new technology that would help, rather than harm, humans – thus giving rise to the phrase “peaceful atom.”

  Eisenhower first spoke at length about the dangers of atomic bombs, and the responsibilities of leaders of nations with the bomb to reduce or eliminate these dangers. He then turned to another approach that went beyond just control of nuclear weapons:

  The United States would seek more than the mere reduction or elimination of atomic materials for military purposes. It is not enough to take this weapon out of the hands of the soldiers. It must be put in the hands of those who know how to strip its military casing and adapt it to the arts of peace. The United States knows that if the fearful trend of atomic military build-up can be reversed, this greatest of destructive forces can be developed into a great boon, for the benefit of all mankind.

The United States knows that peaceful power from atomic energy is no dream of the future. That capability, already proved, is here – now – today. Who can doubt, if the entire body of the world’s scientists and engineers had adequate amounts of fissionable material with which to test and develop their ideas, that this capability would rapidly be transformed into universal, efficient, and economic usage?

The President also proposed an international Atomic Energy Agency, most likely to be operated by the United Nations, which would “… devise methods whereby this fissionable material would be allocated to serve the peaceful pursuits of mankind. Experts would be mobilized to apply atomic energy to the needs of agriculture, medicine, and other peaceful activities. A special purpose would be to provide abundant electrical energy in the power-starved areas of the world.”

The speech was widely hailed. Eisenhower had taken a stand as a peacemaker while not backing down from the realities of being a Cold Warrior leading a nuclear arms race. If the genie was out of the bottle and the atom was now part of the world, its destructive uses would be minimized and its constructive uses maximized.

Eisenhower was trying to envision the future world’s energy needs. The nation’s population was growing rapidly. One reason was the number of babies: after years of low birth rates during the Great Depression and World War II, the Baby Boom had been in full swing for nearly a decade. Men and women were marrying and having large families. A decade earlier, there were just over 2.5 million births a year; by 1953, the number had nearly reached 4 million, and there was no end in sight to the boom. The growing number of people – especially society’s younger members, who required more institutions like offices and schools – meant a greater need for electricity.

Another reason for growing energy needs was the exodus to the suburbs. The economy, finally prospering for the first time since the late 1920s, put more money in people’s pockets, allowing many to move out of smaller, shared dwellings in densely populated cities into larger, single-family quarters in fast-growing suburban areas. Bigger homes needed more electricity to heat and cool. Households used more televisions and more electrical appliances, including refrigerators, stoves, dishwashers, and washing machines. Living in the suburbs also meant more people had to purchase and use private cars instead of the public transportation so predominant in cities, and the cars of that period consumed enormous quantities of energy compared to those of today.

The improved economy also meant greater energy needs in the workplace. More people working meant more offices that needed more power. Manufacturing still predominated over service industries, and the workplace was becoming more mechanized; even agriculture continued to mechanize. More energy-hungry machines were needed to produce goods, on top of the basics of lighting, heating, and cooling offices.

Another factor driving up energy demand was that Americans were living longer. Improved health care and living conditions had been lowering death rates, especially in infancy. More babies surviving to have children of their own boosted the population and raised energy needs. Lower death rates among adults meant more elderly people, which also translated into greater energy needs, especially in settings like hospitals; the post-World War II era featured a hospital building boom. The nation’s population had reached 150 million by 1950 and was soaring, with predictions that it would double to 300 million within several decades. Once the Baby Boomers began having babies themselves, the numbers were sure to climb even more.