One of the most spectacular – and controversial – accomplishments of US technology has been the harnessing of
nuclear energy. The concepts that led to the splitting of the atom were developed by the scientists of many countries, but the conversion of these ideas into the reality of nuclear fission was accomplished in the United States in the early 1940s, by many Americans, aided tremendously by the influx of European intellectuals fleeing the growing conflagration sparked by
Adolf Hitler and
Benito Mussolini in Europe.
During these crucial years, a number of the most prominent European scientists, especially physicists, immigrated to the United States, where they would do much of their most important work; these included
Hans Bethe,
Albert Einstein,
Enrico Fermi,
Leó Szilárd,
Edward Teller,
Felix Bloch,
Emilio Segrè, and
Eugene Wigner, among many others. American academics worked hard to find positions at laboratories and universities for their European colleagues.
After German physicists split a
uranium nucleus in 1938, a number of scientists concluded that a nuclear chain reaction was feasible. A letter to President
Franklin Roosevelt, written by Leó Szilárd and signed by Albert Einstein, warned that this breakthrough would permit the construction of "extremely powerful bombs." The warning prompted an executive order to investigate the use of uranium as a weapon, an effort later superseded during
World War II by the
Manhattan Project, the full Allied effort to be the first to build an
atomic bomb. The project bore fruit when
the first such bomb was exploded in
New Mexico on July 16, 1945.
Along with the production of the atomic bomb,
World War II also saw the beginning of an era known as "
Big Science," marked by increased government patronage of scientific research. The advantage of a scientifically and technologically sophisticated country became all too apparent during wartime, and in the ideological Cold War that followed, the importance of scientific strength, even in peacetime applications, became too great for the government to leave to philanthropy and private industry alone. This increased expenditure on scientific research and education propelled the United States to the forefront of the international scientific community, an amazing feat for a country that only a few decades earlier had to send its most promising students to Europe for extensive scientific training.
The first US commercial
nuclear power plant started operation in
Illinois in 1956. At the time, the future for nuclear energy in the United States looked bright. But opponents criticized the
safety of power plants and questioned whether safe disposal of
nuclear waste could be assured. A 1979 accident at
Three Mile Island in Pennsylvania turned many Americans against nuclear power. The cost of building a nuclear power plant escalated, and other, more economical sources of power began to look more appealing. During the 1970s and 1980s, plans for several nuclear plants were cancelled, and the future of nuclear power in the United States remains uncertain.
Meanwhile, American scientists have been experimenting with
renewable energy sources, including
solar power. Although solar power generation is still not economical in much of the United States, recent developments might make it more affordable.