
Environmental and Energy Policy

Abstract and Keywords

This chapter explores the interrelated nature of environmental and energy policies in the United States, particularly since 1900. As new technologies made new sources of energy viable, formal and informal political arrangements and laws were used to prioritize the United States’ ability to acquire necessary supplies. During the twentieth century, the essential need for energy defined the concept of geopolitics and even served as a rationale for war. By the 1970s and 1980s, a separate policy mandate moved environmental concerns into formal local and federal politics. Although these applications of policy developed separately, a general chronology of their development from 1900 to the present demonstrates the growing interplay between environmental policy and energy management. In the twenty-first century, a new paradigm of economic development has moved energy and environmental policies together to wrestle with complex issues, including the sustainability of energy supplies and climate change.

Keywords: energy, resource conservation, public lands, national parks, automobility, roads, geopolitics, ecology, climate change, sustainability

Throughout the developed world, the phenomenal economic growth of the twentieth century was made possible by new, expansive sources of energy. Primarily, this growth can be attributed to the use of fossil fuel energy, especially petroleum and coal. Though technology and innovation are essential to accessing these energy supplies, formal and informal political arrangements and laws were used to prioritize these nations’ ability to acquire the needed energy. Throughout the twentieth century, much of the political maneuvering happened out of the public eye; today, however, developed nations do not try to conceal the politics of oil and natural gas. The essential need for energy has defined the concept of geopolitics and even served as a rationale for war.

The policies overseeing the acquisition and development of energy resources emerged out of the formation of regulations and expectations governing Americans’ use of the natural environment at large. Policies to regulate the environment have grown from scarcity as well as from changes in scientific knowledge. Although these applications of policy developed separately, a general chronology of their development from 1900 to the present demonstrates the growing interplay between environmental policy and energy management.

Forests as an Energy Resource

The prehistory of policies to manage energy and the environment can actually be traced to early efforts to manage forests. In fact, prior to the formation of the United States, the British Surveyor General in 1691 enacted the regulations that became known as the Broad Arrow Laws, which were then extended to all of New England, New Jersey, and New York in 1711. Under this regulation, “ALL trees of the diameter of 24 inches and upward at 12 inches from the ground, growing upon any soils or tracts of land” were reserved for the crown. Although the primary use of such timber was as masts for sailing ships, forest conservation would also entice policymakers to action at the end of the 1800s, when the boom in building, particularly in the American West, increased lumber development through 1900. For instance, Oregon doubled its population between 1890 and 1910. In the Pacific Northwest, timber production in the 1880s increased to 2.6 billion board feet per year. By 1910, Oregon and Washington ranked in the top three lumber-producing states in the United States.

This interest in forest conservation reached the federal government during the 1870s. The formation of the Division of Forestry within the Department of Agriculture during this decade proved to be a fairly insignificant political step: a corrupt federal bureaucracy overwhelmed any bona fide efforts at conservation, and the division was instead used to more readily sell off timbered tracts of land. When Bernard Fernow became the chief of forestry in 1886, he brought with him a commitment to the scientific management of American forests. Although the division continued its primary effort at disseminating public information, it soon added scientific experimentation to its responsibilities. Armed with new authority to set aside public land in the West, President Benjamin Harrison established the first timber land reserve on March 30, 1891, and placed it under the control of the General Land Office rather than the Division of Forestry. By 1897, 40 million acres in the Northern Rocky Mountains had been set aside, and that year Congress formally provided for the management of the national forest reserves by the Department of the Interior’s General Land Office.

Many of Fernow’s scientific ideas were incorporated into the national forest reserves by the first American trained in forestry, Gifford Pinchot. Pinchot replaced Fernow as chief in 1898 and set out to force private timber firms to undertake cooperative forestry management on the reserves and reforestation programs on private lands. Pinchot’s philosophies influenced the policy initiatives of the nation when his close friend, Theodore Roosevelt, became president in 1901. In general, Roosevelt introduced a model of governance that centralized power at the federal level and became known as Progressivism. By 1905, conservation of forests had become one of the nation’s greatest symbols of Progressive reform: that year, Roosevelt transferred 63 million acres of national forest reserves from the General Land Office to the Department of Agriculture, placing them in Pinchot’s domain, and renamed the agency the U.S. Forest Service.

Pinchot organized the Forest Service quickly and in 1907 renamed the forest reserves the “National Forests.” During the following year, six district or regional offices were organized in the West to administer fieldwork. The priority of these National Forests, however, was ensuring that the nation would maintain an abundant timber supply in perpetuity. The National Forests were vaults holding a natural resource of vital importance to the nation; they bore little resemblance to national parks.

Defining National Parks

Related to conservation, the well-intentioned, upper-class interest in natural areas that developed in the late nineteenth century also carried an idealistic connection to the romanticism and transcendentalism rooted in American literary and artistic traditions. Unlike the National Forests, this idea of preservation took shape in America’s first national parks. Despite these ideological origins, the earliest national parks had little if any unifying philosophy or ethic.

Increasingly, this lack of a mandate would trouble the development of the parks and, at times, such as the 1913 effort to construct the Hetch Hetchy dam, threaten their existence. The first bona fide step toward a unifying idea came with passage of the National Park Service Act in 1916. This act created the National Park Service (NPS) as a unit of the Department of the Interior, staffed no longer by military personnel but by specially trained rangers (though this change would not be truly noticeable until later in the century). Stephen T. Mather, a businessman, was appointed as the first NPS director. In addition to creating a unifying mission based in preservation, Mather also sought to develop parks as genuine tourist attractions. By mid-century some observers had even begun to criticize overcrowding in the parks. Preservationist organizations such as John Muir’s Sierra Club and the Land Conservancy would argue for as little use as possible; others argued that the national parks were a trust open for the use of any citizen. This meant Americans had every right to use the sites as they saw fit, including the construction of dams or the harvest of energy resources.

As environmental policy moved forward through the mid-1900s, an event similar to Hetch Hetchy further defined the sanctity of the national parks and monuments. In the late 1940s, the Bureau of Reclamation, an agency developed by applying Pinchot’s idea of conservation to waterways of the American West, set out to construct the Echo Park Dam along the Utah-Colorado border, located within a little-used national monument named “Dinosaur”—even though most of its fossils and bones had been stolen. As Congress neared a vote on the issue in 1950, 78 national and 236 state conservation organizations expressed their belief that the national parks and monuments were sacred areas. With mail to Congress late in 1954 running at eighty-to-one against the dam, the bill’s vote was suspended and the project eventually abandoned. The young environmental movement had recorded its first clear-cut victory for the national park idea as a priority over development.

Creating the Public Lands: Energy for All?

One of the other early intersections of energy harvest and policy occurred in the ongoing nineteenth-century discussion over public lands and what constituted proper use. Similar to the Homestead Act, the General Mining Law of 1872 provided incentives for settlement and development, which allowed private exploitation of “all valuable mineral deposits in lands belonging to the United States” at marginal cost. Typical of legislation of this era, the General Mining Law emphasized individual rights with minimal government management. The law prioritized free and open access to the land and relatively unlimited opportunity to explore and develop resources. Developers were offered access to permanent, exclusive resource rights and exempted from paying royalties to the federal government. Clearly, the law was based on the simple motive to spur development.

The first energy resource to be granted unique status on public lands was coal. The Coal Act of 1864 gave the U.S. president authority to sell coal beds or fields for no less than $20/acre. Lands not sold in this fashion could be available to the general public at the minimum price under the general land statutes. In 1873, Congress authorized citizens to take vacant coal lands of not more than 160 acres for a minimum of $10/acre. This law governed the disposal of coal lands until 1920. Precious metals were governed much more freely. In all probability, the rationale behind this distinction lay in the relative ease of locating coal compared to that of most precious metals.

The importance of the coal supply to national development made it a major political issue in the early twentieth century. In rapid succession, the administration of Theodore Roosevelt proposed controlling the federal grazing lands; increasing the national forests; and withdrawing the coal, oil, and potash lands from sale. In this same period, the administration also reserved potential waterpower sites and set out to reclaim the arid lands of the West. Industry had complained that the acreage limitations in the 1873 statute impeded the development of certain types of coal, and as a result many coal and railroad companies evaded these limits by using dummy entries or entries processed under the agricultural land statutes. In 1906, facing rumors of indiscriminate exploitation of coal supplies on federal lands, Roosevelt responded: to keep certain companies from acquiring a monopoly and to conserve mineral supplies, withdrawals of coal-bearing lands from public availability began in November 1906.

Laissez-Faire Leads to Standard Oil

Outside of public lands, methods for organizing energy development grew from the efforts of one of the most effective businessmen in history, John D. Rockefeller. In the early 1870s, as oil exploration expanded from the Oil Creek area of Pennsylvania to other states and nations, Rockefeller laid the groundwork for his effort to control the entire industry at each step in its process. By 1879, his company, Standard Oil, controlled 90 percent of the U.S. refining capacity, most of the rail lines between urban centers in the Northeast, and many of the leasing companies at oil exploration sites. Through Rockefeller’s efforts and the organization he made possible, petroleum became the primary energy source for the nation and the world.

But John D. Rockefeller and Standard Oil first demonstrated the domination available to those who controlled the flow of crude oil. As, one by one, Rockefeller put his competitors out of business, his corporation grew into what observers in the late 1800s called a trust (today, a monopoly). Able to configure regional, national, and even global politics to its needs, Standard extended its reach throughout the world and became a symbol of the Gilded Age—the era when businesses were allowed to grow too large and benefit only a few wealthy people. President Theodore Roosevelt, who took office in 1901, led the Progressive interest in involving the federal government in monitoring the business sector. “Muckraking” journalist Ida Tarbell’s History of the Standard Oil Company produced a national furor over Rockefeller’s unfair trading practices. Roosevelt used her information to enforce antitrust laws that would result in Standard’s dissolution in 1911. The subsidiaries created by this breakup eventually became Mobil, Exxon, Chevron, Amoco, and Conoco, among others.

These and other transnational corporations designed political arrangements oriented entirely around facilitating oil development wherever the commodity was found. This flexibility allowed petroleum development to expand significantly during the twentieth century, increasing the amount of crude available on the market.

Filling the Tank and Hitting the Road

With the discovery of massive quantities of crude in East Texas around 1901, petroleum transitioned to an integral role in American life, largely with the help of federal and state policies and laws. Experiments with the internal combustion engine, combined with petroleum’s convenience of handling, soon made it the American choice for powering the new personal transportation devices known as automobiles. From road building to drive-thru fast food, automobility became a determining characteristic of twentieth-century life. This development was clearly believed to be in the nation’s best interest. After the Federal Road Act of 1916, the use of federal, state, and local taxes to construct roads became an important part of the political scene. This process of road building began what some historians have called the “largest construction feat of human history,” and the American road system unfolded throughout the early twentieth century.

Beginning in the 1920s, legislation created a Bureau of Public Roads to plan a highway network to connect all cities of fifty thousand or more inhabitants. Some states adopted gasoline taxes to help finance the new roads. These developments were supplemented in the 1950s when President Dwight D. Eisenhower included a national system of roads in his preparedness plans for nuclear attack. This development cleared the way for the Interstate Highway Act and a national system of roads unrivaled anywhere in the world.

Road building initiated related social trends that added to Americans’ dependence on petroleum. Most important, between 1945 and 1954, nine million people moved to the suburbs, the majority of which were connected to urban centers only by the automobile. Between 1950 and 1976, central city population grew by ten million while suburban population grew by 85 million. Housing developments and the shopping/strip mall culture that accompanied decentralization of the population made the automobile a virtual necessity. Shopping malls, suburbs, and fast-food restaurants became the American norm through the end of the twentieth century, making American reliance on petroleum complete. Americans were now so wedded to their automobiles that the price of petroleum came to affect everyday life in the United States more than in any other nation.

Because petroleum is present in each American’s living environment, the dangers of this volatile chemical were revealed earlier than those related to other chemicals. Manufacturers in the 1920s added lead to gasoline to improve the power and performance of early gas engines. Scientists soon warned that dangerous lead poisoning could result from the lead in automobile exhaust. However, it would be decades until the impact on public health could be measured and quantified, and longer still until legislation forced manufacturers to remove the lead from their product.

The Great Depression and the New Deal Extend the Conservation Ethic

“The economic crash of 1929 and Great Depression that followed,” writes historian Richard Andrews, “created a radically new context for federal environmental management.” Economic collapse (a 50 percent drop in national income over four years and an estimated 25 percent of the workforce unemployed) was compounded by environmental disaster. Andrews adds: “Under these circumstances, Congress and the public were prepared to support almost any decisive presidential leadership that might provide solutions. … [And] environmental policy initiatives were explicit elements of this agenda.”1 Clearly, the 1930s marked a revolutionary expansion of the role of the federal government in environmental policy. Building on the era of Progressive reform, the economic and military needs of the 1930s and 1940s lay at the root of even more extensive models of government intervention and regulation. Often, crisis demanded action of some sort, and these new policies fell in line with a clear progression begun decades earlier.

In the 1930s, New Dealers added to conservation and preservation a holistic approach that applied these concepts (and others) in a systematic fashion across the nation. Growing out of the Progressive era of the first decade of the 1900s, the policies of the New Deal were intended to offer new approaches to managing natural resources; however, the pace of change and the use of the federal government to administer significant portions of everyday life left many Americans concerned. The policies of the New Deal, therefore, did not arrive without debate and criticism. In fact, the emphasis on centralized federal authority and regulation that began during the 1930s continues to be debated in the twenty-first century.

When President Franklin D. Roosevelt (1933–1945) took office, he intended to cut waste and find new opportunities for economic development. To carry out such plans, he sought the advice of modern-thinking experts from numerous fields. Drawing on colleges and universities, Roosevelt brought intellectuals immediately into the emergency of the Great Depression. With a long-term interest in the science of forestry and resource management, Roosevelt was particularly struck by the waste of American natural resources at a time of great need. In his inaugural address, the president stated: “Nature still offers her bounty and human efforts have multiplied it. Plenty is at our doorstep, but a generous use of it languishes in the very sight of the supply.”2 His initiatives sought to utilize these resources intelligently while creating jobs for out-of-work Americans. These policies incorporated the emerging science of ecology into federal programs to manage watersheds, maintain forests, teach agriculture, and hold fast the flying soils of the southern plains. The main impetus for federal action derived from a national surge in joblessness. The economic collapse of 1929 left millions of Americans incapable of making a living. Nowhere was this more evident than on the American southern plains.

Terrible drought combined with economic difficulty to make many farmers in the rural midwestern United States incapable of farming. Residents of Oklahoma fled westward to California, creating resettlement problems as well. In the southern plains, the loose topsoil was lifted by heavy winds, creating dust storms of epic proportions. Press coverage of the Dust Bowl of the 1930s presented a natural disaster caused by drought and bad luck. Through government-made documentary films, such as The Plow That Broke the Plains, the New Deal infused a bit of ecological background to explain desertification and the agricultural practices that could be used to combat it. In the midst of this natural disaster, the American public learned a great deal about its role within the natural environment. Thinking that proper land use could be taught, the federal government installed extension agents to educate the public.

This was also apparent in New Deal river projects, particularly the Tennessee Valley Authority (TVA). Finally, Franklin D. Roosevelt’s pet project, the Civilian Conservation Corps (CCC), merged the impulses of Progressives, such as President Theodore Roosevelt’s trust in the importance of work in the outdoors for the development of young Americans, with scientific understandings of agriculture and watershed management. The CCC projects often grew from lessons of ecology—for instance, the need to construct “shelter belts” of trees to help block the wind on the plains and to keep topsoil in place—but their most important priority was creating employment for young men.

Overall, the emergence of ecology had brought a new utility for science in the everyday life of Americans. Scientific knowledge, however, was still largely controlled by experts, often working for the federal government. Policymakers set out in this era to create an active approach to using laws and regulations to control and stimulate wise use of natural resources. Many of the priorities that took shape in the 1930s continue to form the foundation of contemporary environmental policy.

Basing Policy in Ecology

By the early 1960s, environmental concern was becoming one of the most widespread grassroots causes among the American public. Unlike the conservation of the 1800s, 1960s environmentalism grew from the demands of middle-class activists. Pollution composed the most frequent environmental complaint, but objections to it derived more from physical discomfort than from any scientific correlation with human health.

A government biologist turned nature writer presented the American public with a lesson in science in 1962. Rachel Carson’s Silent Spring erupted onto the public scene to become a bestseller after first being serialized in The New Yorker. The story of pollution (particularly that from the popular pesticide DDT) and its effect on ecological webs of life linked water runoff to fish health and then to the depletion of the bald eagle population. Readers were left to infer the effects of such chemicals on humans.

Flexing their increased environmental awareness, the American public rallied behind Carson as she made the rounds of television talk shows. The Kennedy administration appointed a commission to study Carson’s findings, and its report a year later largely vindicated her; DDT was ultimately banned from use in the United States in 1972. Carson became identified with “mother nature” and a maternal impulse to manage the natural environment through federal regulation. But this was not the only application of new scientific understanding to environmental policy.

A fascinating dimension was added to the preservation mandate after 1950 when activist Howard Zahniser and others used ecological integrity as an argument to press for the environmental movement’s greatest goal: a national system of wilderness lands. Based on the idealistic notion of pristine wilderness espoused by President Teddy Roosevelt and others, such a system had been called for beginning with Aldo Leopold in the 1910s. With increased recreation in parks and public lands, argued Zahniser, it had become even more crucial that some of the land be set aside completely.

His bill, introduced to Congress in the 1950s, precluded land development and offered recreational opportunities only for a few rather than for the great mass of travelers. Such an ideal goal required great salesmanship, and Zahniser was perfect for the job. As the political climate shifted in the early 1960s, lawmakers became more interested in wilderness. Finally, in 1964, President Lyndon Johnson signed the Wilderness Act into law. The United States had taken one of the most idealistic plunges in the history of environmentalism: nearly ten million acres were immediately set aside as “an area where the earth and its community of life are untrammeled by man, where man himself is a visitor who does not remain.”3 Additional lands would be preserved in similar fashion by the end of the decade.

While the concept of wilderness forced the general American public to begin to understand ecosystems and the webs of reliance operating within natural systems, the application of scientific understanding to environmentalism occurred most often in other realms. Defeating the dam at Echo Park and the passage of the Wilderness Act set the stage for a 1960s shift in environmental thought that combined with the Not in My Backyard (NIMBY) culture of the 1970s to create a federal mandate for policy action. Carson’s ideas were welcomed by a general public more interested than ever in understanding its place in the natural environment, and Americans demanded that federal law be used to regulate resource use and to ensure a safe living environment.

Regulating Coal in the Environmental Era

After 1960, scientific understanding informed many policies of the modern environmental era, including those governing mining. These restrictions and requirements grew out of additional legislation, particularly the seminal environmental policies of the 1960s and 1970s, including the Multiple Use Sustained Yield Act, the Wilderness Act, the National Forest Management Act, the National Environmental Policy Act, and the Federal Land Policy and Management Act (FLPMA). To varying degrees each of these policies addressed environmental protection, multiple use, and the management of federal land generally. By imposing new requirements on agency actions, and by withdrawing some federal lands from development, these acts have significantly influenced mineral development on public lands.

As the twentieth century progressed, oil became the predominant American energy source in the fossil fuel mix. Importing much of its supply of crude after 1950, the United States grew increasingly unnerved by its petroleum dependence—particularly after the 1973 Organization of Petroleum Exporting Countries (OPEC) embargo. One outcome of this realization was that policymakers emphasized coal as a key domestic energy source that might reduce the United States’ dependence on imported fuels. Because the nation possesses the greatest coal reserves in the world, America could be the “Saudi Arabia of coal.” To encourage use of coal instead of oil to generate electricity after the 1970s, Congress eliminated coal price controls, and the Environmental Protection Agency (EPA) suspended regulations on emissions controls for coal.

However, in the midst of the 1970s energy shocks, members of the United Mine Workers (UMW) participated in an extended strike for better pay, benefits, and working conditions. In light of the serious consequences to the national economy of what amounted to an energy embargo from within the country, President Jimmy Carter intervened and threatened to compel miners to go back to work. However, he was unable to enforce his directive. Eventually, worn down by lack of income after months on strike, the miners agreed to a contract that partially met their demands.

Labor issues played a large part in public debate and policy on mining. After a period of disorganization and serious scandal, new leadership elected in 1972 restored the UMW’s effectiveness in representing miners. At that time, underground miners had the most dangerous jobs in the country. They suffered and died at high rates from accidents, injuries, and respiratory disease. Union efforts, and the impact on Congress of individual mining disasters (such as the deaths in 1968 of seventy-eight miners in an explosion at Farmington, West Virginia), led to the passage of the Federal Mine Safety and Health Act of 1977. This act set improved ventilation and safety standards and required frequent mine inspections. Overall, this legislation helped to make mining significantly safer; however, serious accidents still occur.

Another trend of the 1970s brought together new scientific understanding with the emphasis on Western mining in the form of acid rain. More than 40 percent of the nation’s coal reserves lie in the West, much of it beneath federally owned land. Inevitably, the increasing need for coal drove companies to develop this Western coal, which, for reasons of geology, is lower in heating capacity than the anthracite and bituminous coal of the Appalachian region. In addition, it contains much less sulfur. By the 1960s, scientists had deduced that the sulfur released into the atmosphere in coal smoke acidified and returned to Earth in rain. Acid rain has a highly corrosive effect on the environment, and it became a very early target of policymakers in their efforts to create environmental regulations. For instance, the requirements of the Clean Air Act of 1970, which called for the reduction of harmful emissions, made low-sulfur coal more desirable. These environmental regulations spurred new interest in coal from the Western United States, where it could be found fairly near the ground’s surface. Despite the bitter objections of Western ranchers and national environmentalist organizations, strip mining—the removal of surface layers to expose the coal seam—became the predominant method for extracting coal in the West.

The United Mine Workers also were involved in the battle to regulate surface mining. Union leadership aligned with the coal interests to weaken proposed regulation, favoring resource extraction over environmental concerns and arguing as usual for the importance of economic development. But the majority of rank-and-file union members were deep miners, who objected to surface mining because the highly mechanized practice eliminated so many deep mining jobs. They also objected because stripping ruined the hunting and fishing that were so important to the culture of Appalachia. In this interesting case of “labor environmentalism,” conservationists and industrial workers formed an alliance in the early 1960s that linked the United Mine Workers of America, at least temporarily, with groups such as the Allegheny County Sportsmen’s League (of Pennsylvania) to push for tougher controls on surface mining.

The Surface Mining Control and Reclamation Act (SMCRA), the legislative result of so much mining reform effort, was signed into law by President Jimmy Carter in 1977. It was designed to regulate new and active mines as well as to address the problem of cleanup after mines were exhausted. Individual states were charged with the task of setting and overseeing mine regulation, subject to compliance with federal standards. However, SMCRA failed to accomplish its objectives. Insufficient bonding requirements and diminishing oversight frustrated the attempt to require mining companies to repair environmental damage, such as returning stripped land to its original contours and mitigating acid runoff from old mines into waterways. Nevertheless, in the decades after SMCRA was enacted, the federal government alone spent 6.5 billion dollars on mine reclamation projects.

In recent decades, coal companies have only intensified their extraction practices to keep pace with Americans’ growing demand for energy. In the United States, one hundred tons of coal are extracted every two seconds, and the residual effect can be seen very clearly on the energy landscape of Appalachia and other states as well. Around 70 percent of that coal comes from surface mines, which tear away vegetation and soil to access layers of the earth’s crust. Entire mountains are leveled through mountaintop removal mining (MTR). The environmental impact of this most extreme example of surface mining includes severe soil erosion and depletion, resulting in poor success at revegetation and reforestation, loss of habitat for wildlife, and watercourse sedimentation and outright burial of streams. Current conditions in mining regions confirm the fears and convictions of the activists of the 1960s and 1970s. Poverty and depopulation in Appalachia are the result of MTR, despite the jobs rhetoric of coal interests. Revenue from mineral extraction went to outside interests that controlled a large percentage of the land; it was not available to build the kind of infrastructure that would have led to a diversified and prosperous economy.

Petroleum Becomes a Global Commodity

New uses for crude drove the expansion of global efforts to develop the commodity. Throughout the twentieth century, large multinational corporations or individual wealthy businessmen attempted to develop supplies and to bring them to market. Massive international companies managed the import and export of oil regardless of the nation of origin. In many cases, the importers—often companies in Western, industrialized nations—exercised the most control over supply and demand and, therefore, over prices. In the 1960s, though, oil-producing nations would draw from Rockefeller’s model to devise a new structure.

Beginning in 1959, the Eisenhower administration established quotas on the import of crude oil to protect the sale of oil produced in the United States. Quotas infuriated oil-producing countries. On September 14, 1960, a new organization was formed to battle the companies that profited by extracting oil around the world. The Organization of the Petroleum Exporting Countries (OPEC) had a single clear intention: to defend the price of oil. OPEC, committed to solidarity among members, would from this point forward insist that companies consult them before altering the price of crude. Although OPEC’s five founding members were the source of over 80 percent of the world’s crude oil exports, its members initially exercised little united political power.

During the coming years, OPEC would gain political clout, in part through American fuel dependence. Between 1948 and 1972, consumption in the United States nearly tripled, from 5.8 million barrels per day to 16.4 million. Consumption in other parts of the world increased even more: Western Europe’s use of petroleum increased by 16 times and Japan’s by 137 times. This growth was tied to the automobile. Outside the United States, automobile ownership rose from 18.9 million in 1949 to 161 million in 1972; growth in the United States was particularly significant, from 45 million to 119 million. New technologies enabled some refiners to increase the yields of gasoline, diesel, jet fuel, and heating oil from a barrel of petroleum, but demand remained unlike anything the world had ever seen.

Such reliance on fuel, of which increasing amounts were imported, forced the U.S. federal government to continually review relevant policies. President Richard Nixon announced in 1974 a goal of “energy independence” by 1980. In 1975, President Gerald Ford called for reductions in oil imports of between one and two million barrels a day. Carter pledged that the country would never use more oil than it did in 1977. Subsequent presidents addressed the problem and also promised to reduce dependence on foreign oil. Nevertheless, in the forty years between the Nixon administration and the Obama administration, the percentage of imported oil used in the United States rose from 36.1 percent to 66.2 percent.4

Clearly the longest policy tradition related to energy has been founded in a national interest in development in order to provide an adequate and inexpensive supply of energy for American consumers. However, a separate ethical paradigm helped to shape a second political tradition throughout the twentieth century.

The Politics of Natural Gas: Regulation of Price and Supply

In the 1970s, at the same time that the first oil and gasoline shortages were occurring, another key energy source was in short supply in some parts of the United States: natural gas, which furnished about one-third of the energy used in the United States in the early 1970s. Shortages of gas, especially in the Northeast and Upper Midwest during the cold winters of 1977 and 1978, resulted in significant hardships for residential and industrial customers, as people could not heat their homes, and an estimated two million people were affected by factory closings and layoffs.

Natural gas is closely tied to petroleum. About one-third of the gas used in the United States has been a byproduct of oil extraction. In the early days of oil drilling, the gas produced was “flared” or burnt off as a nuisance. But gas became an important and valued source of clean energy. Burning gas produces little or no soot, smog, or acid rain, and less than half the carbon dioxide of burning coal. Although using gas does have environmental consequences, they are not nearly as severe as the problems of disposal associated with nuclear power plants. Gas requires less refining than petroleum, and drilling disturbs less surface area than coal mining. But gas is more difficult to transport. Because of the expensive and permanent infrastructure required to transport gas—pipelines—this energy source was regulated as a public utility and natural monopoly.

Historically, the primary focus of federal gas law was price and supply control. The 1938 Natural Gas Act gave the Federal Power Commission (FPC) the right to regulate the prices that gas companies could charge. Eventually, after the Supreme Court’s 1954 decision in Phillips Petroleum Co. v. Wisconsin, the FPC attempted to regulate gas prices on a complicated regional cost-of-production basis. The details of this regulation kept natural gas prices low enough to encourage demand, but they discouraged new production. Further, the regulation tended to keep gas sales within the areas where it was produced and discouraged interstate sales. The Natural Gas Policy Act of 1978 took steps toward the deregulation of natural gas prices and, in an attempt to balance supply and demand, began allowing the forces of a national market to set gas prices at the wellhead. By the mid-1980s, rising gas prices had stimulated exploration and production and dampened demand enough to stabilize the market. After the turn of the twenty-first century, new technologies unlocked enormous new gas reserves and made natural gas again an important subject of national attention.

Energy Dependence in an Era of Limits

As energy supplies became a more significant topic after the 1970s oil crisis, each side of the environmental argument staked out its claim on the issue. Environmentalists used the 1973 oil shortage to argue that Americans needed to learn “living within limits.” This lesson would be demonstrated again in 1990 when the nation went to war against Iraq largely to maintain control of oil supplies. Additional concerns came with an increasing awareness of the effects of air pollution and particularly of auto emissions’ relationship to global warming. The Clean Air Act of 1990 began a process requiring automobile makers to prioritize increased mileage and also to investigate alternative fuels.

Of course, the argument for a conservation ethic to govern American consumers’ use of energy was a radical departure from the postwar American urge to resist limits and to flaunt the nation’s decadent standard of living. Although this ethical shift did not take hold among all Americans in the 1970s, a large segment of the population began to consider a new paradigm of energy accounting. They became interested in energy-saving technologies, such as insulation materials and low-wattage light bulbs. Emerging from the 1970s, some Americans were ready and willing to consider less-convenient ideas of power generation, such as alternative fuels.

President Jimmy Carter’s administration would be remembered for events such as the Iranian Hostage Crisis; however, when he controlled the agenda he steered American discourse to issues of energy. In a 1977 speech, Carter urged the nation:

Tonight I want to have an unpleasant talk with you about a problem unprecedented in our history. With the exception of preventing war, this is the greatest challenge our country will face during our lifetimes. The energy crisis has not yet overwhelmed us, but it will if we do not act quickly.

It is a problem we will not solve in the next few years, and it is likely to get progressively worse through the rest of this century.

We must not be selfish or timid if we hope to have a decent world for our children and grandchildren.

We simply must balance our demand for energy with our rapidly shrinking resources. By acting now, we can control our future instead of letting the future control us. …

Our decision about energy will test the character of the American people and the ability of the President and the Congress to govern. This difficult effort will be the “moral equivalent of war”—except that we will be uniting our efforts to build and not destroy.5

Carter would introduce wide-reaching policy initiatives mainly aimed at energy conservation. In addition, even as drilling for oil in northern Alaska proceeded and the Trans-Alaska Pipeline to Valdez was completed in 1977, Carter vowed to preserve the remaining Alaskan wilderness. From 1978 to 1980, Carter established the Arctic National Wildlife Refuge (ANWR) and set aside 28 percent of Alaska as wilderness. Carter announced this watershed legislation as follows: “We have the imagination and the will as a people to both develop our last great natural frontier and also preserve its priceless beauty for our children.” With the passage of the Alaska National Interest Lands Conservation Act (ANILCA) in 1980, the Bureau of Land Management (BLM) was ordered to oversee 11 million acres of Alaska as wilderness, including the ANWR. The symbolic role of Alaska for environmentalists and friends of wilderness was clear to Carter. Although he offered a clear vision of our limited future based on extracted energy resources, by the 1980s many Americans were returning to business as usual.

By the close of the twentieth century, most observers had admitted that changes needed to be made in energy management in the United States. The primary debate grew from what role the federal government would play in any new structure. Environmentalists and those prioritizing conservation argued for Carter-esque federal stimulus to push automakers and electricity producers to develop alternative generation technologies. The energy industry, with the backing of the George W. Bush administration, argued for federally aided growth in existing supplies. The National Energy Policy put forward by the Bush administration in 2001 mandated construction of new power plants using existing fossil fuels (primarily coal and natural gas), an emphasis on nuclear technologies, and an effort to harvest existing U.S. petroleum supplies. Public debate grew from the latter point, which specifically called for drilling for oil in the ANWR. It is a debate that continues today.

Fuel Conservation and Ensuring American Automobility

Given the significant problems associated with emissions generated by vehicles, common sense dictates that every effort be made to increase automobiles’ efficiency. The Corporate Average Fuel Economy (CAFE) program, which started in 1978, was intended to help reduce American dependence on foreign oil by producing more fuel-efficient vehicles. The program requires each automaker to meet government-set mileage targets (CAFE standards) for all its car and light truck fleets sold in the United States each year. The complex program requires automakers to calculate the fuel economy of all vehicles actually sold: it is not a calculation of what automakers offer for sale, but of what consumers buy.

When the Arab oil embargo prompted crude oil prices to triple in the mid-1970s, Americans clamored for relief at the pump and policymakers turned their attention to automobile fuel efficiency. In 1975, Congress passed the Energy Policy and Conservation Act, requiring automakers for the first time to build and sell vehicles that met fuel economy standards. It would quickly become one of the most far-reaching and complex regulations ever placed on the industry, affecting everything from product mix, design, and safety to decisions about where to locate the plants that build them.

The 1978 CAFE standards required 18 miles per gallon (mpg) for cars. The standard increased each year until 1985, when it reached the current 27.5 mpg. Light truck standards were set at 17.2 mpg for the 1979 model year and are currently 20.7 mpg. The rule does not require each individual vehicle to achieve the standard; rather, the sales-weighted average fuel economy of the fleet of cars and trucks the manufacturer sells must achieve it. This has made it possible for manufacturers to continue to produce inefficient vehicles such as Hummers if they offset them with enough fuel-efficient products to allow the overall fleet to meet the CAFE standards. A sketch of this fleet-averaging arithmetic appears below.
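To make the mechanics concrete, here is a minimal sketch in Python of the sales-weighted averaging at the heart of CAFE compliance. Fleet fuel economy is computed as a harmonic mean, so that gallons consumed, rather than mpg figures, are what average out. The vehicle models and sales numbers below are hypothetical, and the real program adds credits, separate car and truck fleets, and test-procedure adjustments that this sketch omits.

```python
def cafe_fleet_mpg(fleet):
    """Sales-weighted harmonic-mean fuel economy for a model-year fleet.

    Averaging harmonically weights gallons burned rather than mpg
    figures, so a few efficient models cannot fully mask many
    inefficient ones.
    """
    total_sales = sum(sales for sales, _ in fleet)
    total_gallons_per_mile = sum(sales / mpg for sales, mpg in fleet)
    return total_sales / total_gallons_per_mile


# Hypothetical fleet of one manufacturer: (units actually sold, mpg).
fleet = [
    (200_000, 35.0),  # compact sedan
    (150_000, 27.0),  # midsize sedan
    (50_000, 15.0),   # large, inefficient model
]

print(f"Fleet CAFE value: {cafe_fleet_mpg(fleet):.1f} mpg")  # ~27.4 mpg
# The low-mpg model can stay in the lineup because high-mpg sales
# offset it, which is how a Hummer-style vehicle remained viable.
```

Because the average is weighted by what consumers actually buy, a shift in sales toward the inefficient model would pull the fleet value below the standard even with no change in the vehicles offered for sale.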

One other initiative of the 1970s was a federally mandated speed limit: federal safety and fuel conservation measures included a national speed limit of 55 miles per hour. Consumers have since led states to loosen such restrictions; however, concern over fuel conservation continues.

The Era of Environmental Protection and Backlash

Energy development was only one aspect of the significant changes in environmental policy after the 1960s. The main thrust of this legislation revolved around a change in American expectations: the safety and health of the general public were as important as economic development. The public outcry would be so severe that even a conservative such as Richard Nixon might be deemed “the environmental President” as he signed the National Environmental Policy Act at the start of 1970 and created the Environmental Protection Agency (EPA) later that year. The public entrusted the EPA as its environmental regulator to enforce ensuing legislation monitoring air and water purity, limiting noise and other kinds of pollution, and monitoring species in order to discern which required federal protection. The public soon realized just how great the stakes were.

During the 1970s, oil spills, river fires, nuclear accidents, and petroleum shortages made it appear as if nature were in open rebellion. In short, nearly every industrial process was seen to have environmental costs associated with it. From chemicals to atomic power, technological “fixes” came to have long-term impacts in the form of wastes and residue. Synthetic chemicals, for instance, were long thought to be advantageous because they resist biological deterioration. In the 1970s, this inability to deteriorate made chemical and toxic waste the bane of many communities near industrial or dump sites.

Among the assorted catastrophes, Love Canal stood out as a new model for federal action. The connection between health and environmental hazards became obvious throughout the nation, and it spurred actions such as the establishment of “Superfund” sites, in which federal dollars would be used to clean up the nation’s worst toxic sites. Scientists were able to connect radiation, pollution, and toxic waste to a variety of human ailments. The “smoking gun,” of course, contributed to a new stage of litigation in environmentalism. Legal battles backed by scientific data gave individuals armed with only NIMBY convictions the ability to take on the largest corporations in the nation.

Rapidly, this decade instructed Americans, already possessing a growing environmental sensibility, that humans—just as Carson had instructed—needed to live within limits. A watershed shift in human consciousness could be witnessed in the popular culture as green philosophies infiltrated companies wishing to create products that appealed to the public’s environmental priorities. Recycling, daylight saving time, carpooling, and environmental impact statements became part of everyday life after the 1970s.

Earth Day 1970 suggested to millions of Americans that environmental concerns could be expressed locally. Through organized activities, many Americans found that they could actively improve the environment with their own hands. Many communities responded by organizing ongoing efforts to alter wasteful patterns. Recycling, mandated by the first legislation in Oregon in 1972, has proven to be the most persistent of these grassroots efforts. Though the effort is often trivialized by more extreme environmentalists, the recycling of trash and waste now stands as the ultimate symbol of the American environmental consciousness. Local and state initiatives for recycling mark one of the most basic ways that “green” ideas have been institutionalized.

By the 1970s and 1980s, environmental concerns had emerged as a major player in local and federal politics. As such, federal regulation and policy initiatives would vary with the political winds of the time. Each president has the ability to alter the intensity with which federal regulation of environmental factors is carried out.

The primary example of such fluctuation is the presidency of Ronald Reagan, who was first elected in 1980. Reagan and his successor George H. W. Bush each demanded that the EPA become not an advocate agency but a “neutral broker” that might actually foster development. Picking up on the “Sagebrush Rebellion” staged in a few Western states in reaction to FLPMA during the 1970s, Reagan advanced his pro-industry and antiregulation policies by installing James Watt as Secretary of the Interior and Anne Gorsuch as head of the EPA. Watt and Gorsuch had worked together in Colorado to dismantle federal policies controlling the use of federal lands. Personnel in the EPA dropped by 25 percent, and its budget was sliced by more than half. Policies were put in place to limit the number of lawsuits that could be brought and to render most other cases inactive.

Earth Day 1990 continued the environmental traditions of the past but also marked an important change in environmentalism’s scope. Worldwide, 141 nations participated in the celebration. While a global perspective seemed inherent in the web of life put forward by Rachel Carson and others, it would take global issues such as the Chernobyl nuclear accident in 1986 and shared problems such as greenhouse gases and global warming to bind the world into a common perspective. Organizations, including Greenpeace, assisted members from many nations in shaping a common stand on issues.

The United Nations presented the leading tool for facilitating global environmental efforts. With its first meeting on the environment in 1972, the global organization created its Environment Programme. This organization would sponsor the historic Rio Conference on the Environment in 1992 and the Conference on Global Warming in 1997. In response to such activities, the U.S. federal government declared the environment a genuine diplomatic concern in global affairs by creating a State Department undersecretary for the environment in 1996.

New scientific abilities enabled researchers to see the impact of specific portions of economic development, specifically the impact of global warming and the depletion of the protective ozone layer surrounding the Earth. One result was a watershed international agreement referred to as the Kyoto Protocol. Such an agreement would place caps on emissions from industrialized nations and thereby possibly limit or alter industrial development.

The embattled election of 2000 became a referendum on environmental thought. How serious a priority were Americans willing to make the environment? The Green Party and its candidate Ralph Nader offered voters the opportunity to make the environment their main voting priority. More important, though, Nader pulled many liberal votes from the Democrat, Al Gore. Gore, author of the environmental treatise Earth in the Balance, argued a moderate environmental line, including a path-breaking energy policy based on conservation and the development of alternative fuels. The Republican, George W. Bush, offered the defederalizing ideas that many anti-environmentalists had supported during the Reagan era. While Bush’s victory did not hinge entirely on these stances on the environment, his efforts in office proved a test for the environmental establishment.

Prioritizing corporate cooperation with federal environmental agencies, Bush echoed many sentiments of Reagan. Corporate leaders were placed in control of a variety of government environmental agencies, and efforts were made to backtrack on many environmental developments of the previous decade. Most important, Bush confronted energy shortages with a strategy for growth and further development of fossil fuels. This plan, of course, included drilling for natural gas and petroleum in many federally owned areas, including the ANWR. Additionally, Bush rejected the Kyoto Protocol on global warming and openly questioned the very existence of this ecological problem.

Conclusion: Political Fluctuation and Debate

After a century of consistent strengthening of the role of federal policy in regulating energy development and the use of natural resources, the twenty-first century has witnessed a consistent litany of conflicts over the extent of this mandate. Although many global issues brought nations together on matters of the environment, acting locally proved to be terrain fraught with legal fights and political disagreements. In short, by the end of the twentieth century, the nature of Americans’ view of the environment had altered to the point where there truly was an additional paradigm to economic development. Regulations and laws required that many additional issues be considered in determining a project’s economic viability. Efforts of modern environmentalists to tie this ethic into the nation’s legal framework altered the way Americans could live; however, it also set the stage for an era of ongoing political conflict.

For instance, energy development was impacted by this political dichotomy when price increases and new horizontal drilling technology enabled the development of shale gas and oil through hydraulic fracturing, or “fracking.” In states such as Texas and Pennsylvania, energy companies were able to proceed almost unfettered in their efforts to frack for natural gas. By contrast, New York passed state and local laws to forbid or limit such development. Similarly, the debate over the construction of the Keystone XL Pipeline galvanized each perspective and forced political standoffs between a Republican-controlled Congress and President Barack Obama.

Clearly, though, climate change represents the single most significant flashpoint. As climate change emerged in the early twenty-first century as one of the most debated environmental issues, federal initiatives waxed and waned with the political winds. Initiatives such as “cap and trade” represented efforts to insert the federal government into a role of stimulating utilities and others to move away from a reliance on coal and other fossil fuels. Although most of these initiatives failed, by 2015 it was clear that the reality of climate change was no longer under debate, and mitigation efforts took the form of policy on many different fronts.

Clearly, the policies to manage and regulate energy resources and the environment will continue to reflect broader priorities and preferences.

Bibliography

Andrews, Richard N. L. Managing the Environment, Managing Ourselves: A History of American Environmental Policy. 2nd ed. New Haven: Yale University Press, 2006.

Andrews, Thomas. Killing for Coal: America’s Deadliest Labor War. Cambridge, MA: Harvard University Press, 2010.

Black, Brian. Crude Reality: Petroleum in World History. Lanham, MD: Rowman & Littlefield, 2014.

Burns, Shirley S. Bringing Down the Mountains: The Impact of Mountaintop Removal on Southern West Virginia Communities. Morgantown: West Virginia University Press, 2007.

Carstensen, Vernon. The Public Lands: Studies in the History of the Public Domain. Madison: University of Wisconsin Press, 1963.

Freese, Barbara. Coal: A Human History. Cambridge, MA: Perseus, 2003.

Freudenburg, William R., and Robert Gramling. Blowout in the Gulf: The BP Oil Spill Disaster and the Future of Energy in America. Cambridge, MA: MIT Press, 2011.

Gold, Russell. The Boom: How Fracking Ignited the American Energy Revolution and Changed the World. New York: Simon & Schuster, 2014.

Graetz, Michael. The End of Energy: The Unmaking of America’s Environment, Security, and Independence. Cambridge, MA: MIT Press, 2011.

Hays, Samuel P. Beauty, Health, and Permanence: Environmental Politics in the United States, 1955–85. New York: Cambridge University Press, 1993.

Henderson, Henry L., and David B. Woolner, eds. FDR and the Environment. New York: Palgrave, 2004.

Jackson, Kenneth T. Crabgrass Frontier. New York: Oxford University Press, 1985.

LeCain, Timothy J. Mass Destruction: The Men and Giant Mines That Wired America and Scarred the Planet. New Brunswick, NJ: Rutgers University Press, 2009.

Lewis, Tom. Divided Highways. Ithaca, NY: Cornell University Press, 2013.

Maher, Neil. Nature’s New Deal: The Civilian Conservation Corps and the Roots of the American Environmental Movement. New York: Oxford University Press, 2008.

McNeill, John R. Something New under the Sun: An Environmental History of the Twentieth-Century World. New York: Norton, 2001.

Montrie, Chad. To Save the Land and People: A History of Opposition to Coal Surface Mining in Appalachia. Chapel Hill: University of North Carolina Press, 2003.

Robbins, R. M. Our Landed Heritage: The Public Domain, 1776–1970. Lincoln: University of Nebraska Press, 1976.

Sutter, Paul S. Driven Wild: How the Fight against Automobiles Launched the Modern Wilderness Movement. Seattle: University of Washington Press, 2002.

Tarr, Joel A. The Search for the Ultimate Sink: Urban Pollution in Historical Perspective. Akron, OH: University of Akron Press, 1996.

Worster, Donald. Dust Bowl: The Southern Plains in the 1930s. New York: Oxford University Press, 1979.

Worster, Donald. Nature’s Economy. New York: Cambridge University Press, 1994.

Yergin, Daniel. The Prize: The Epic Quest for Oil, Money and Power. New York: Simon & Schuster, 1992.

Notes:

(1.) Richard Andrews, Managing the Environment, Managing Ourselves: A History of American Environmental Policy, 2nd ed. (New Haven, CT: Yale University Press, 2006), 161–162.

(2.) Franklin Roosevelt, “First Inaugural Address of Franklin Roosevelt,” March 4, 1933, http://avalon.law.yale.edu/20th_century/froos1.asp

(3.) The Wilderness Act, September 3, 1964, https://www.wilderness.net/nwps/legisact

(4.) William R. Freudenburg and Robert Gramling, Blowout in the Gulf: The BP Oil Spill Disaster and the Future of Energy in America (Cambridge, MA: MIT Press, 2011), 211.

(5.) Jimmy Carter, Address to the Nation on Energy, April 18, 1977, https://millercenter.org/the-presidency/presidential-speeches/april-18-1977-address-nation-energy.