The Building of an Epistemological Edifice: 20 Decades of Inquiry Into Our Changing Climate

Eric Schupper
Aug 3, 2016


Source: http://climate.nasa.gov/climate_resources/24/

In a recent piece, Washington Post Energy and Environment writer Chris Mooney marked the 30th anniversary of the seminal U.S. Senate hearings on “Ozone Depletion, the Greenhouse Effect, and Climate Change” convened by Senate Environment and Public Works Committee member John Chafee (R-R.I.) in June 1986. In Senator Chafee’s opening statement to the Subcommittee on Environmental Pollution, he stated the reason for focusing attention on these issues:

We are doing so because there is a very real possibility that man — through ignorance or indifference, or both — is irreversibly altering the ability of our atmosphere to perform basic life support functions for the planet. … This is not a matter of Chicken Little telling us the sky is falling. The scientific evidence … is telling us we have a problem, a serious problem.

In addition to quoting from Senator Chafee’s statement, Mooney in his piece also quotes Senator Albert Gore, Jr. (D-TN), who had been elected to the Senate in the 1984 election after serving four terms in the House of Representatives: “There is no longer any significant difference of opinion within the scientific community about the fact that the greenhouse effect is real and already occurring.” Senator Gore was not a member of the Environment and Public Works Committee, but made his statement as a witness invited to testify at the hearings. Gore was already well versed in the subject of climate and concerned about it. Senator Chafee thanked Gore for his efforts in focusing congressional attention on the “greenhouse effect” problem.

The aim of Mooney’s piece is to highlight how “eerily familiar” warnings 30 years ago about the impacts on global temperature and climate of rising levels of atmospheric carbon dioxide and other greenhouse gases (GHGs) are to such warnings today.

It’s true: The gist of the warnings about rising GHG emissions and global warming hasn’t changed much since the 1980s. Decades of climate research enabled the scientific community in the 1980s to warn that we have a serious problem. Today the warnings are essentially the same, except they are more urgent and dire; they are substantiated by an even larger mountain of empirical scientific research and observation and far more powerful and “skillful” (to borrow Dr. Gavin Schmidt’s terminology) computer models; and the economic impacts of climate change, both those already occurring and those we may experience in future, are being quantified with increasing detail and sophistication by economists, companies, and groups like the Risky Business Project.

How big is the empirical mountain of climate evidence? The climate-related changes to the physical and biological systems described by the Intergovernmental Panel on Climate Change (IPCC), for example, are based on a review of more than 29,000 data series. There are more data series in the metaphorical mountain of climate science evidence than there are feet in the real Mount Everest’s height above sea level. And ninety-plus percent of those data series point in the direction of human-caused warming.

That last bit — that 90-plus percent of data series point in the same direction — is key. The consistent convergence of multiple lines of evidence toward the same conclusion is known as consilience. Consilience is what makes climate change science so robust and resilient. Vox.com writer David Roberts explained in a December 2015 piece:

The strength of consilience science does not issue from the validity of any one set of measurements or any one line of evidence. It’s not vulnerable to wholesale refutation if anomalies are found in one data set or another. Even if individual lines of evidence are weak or uncertain, their convergence, the fact that many trails keep leading to the same place, can make consilience science strong.

Consilience is what gives rise to the consensus of dozens of scientific organizations and associations around the world and more than 97% of climate scientists (see also here) that global warming is occurring, it’s caused by human activities, it’s changing Earth’s climate, and it represents a myriad of very real global risks to humans and natural systems. It’s what enables the hundreds of scientists from around the globe involved in the IPCC review process to agree to state with 95% confidence (likely a conservative value — see, e.g., here) that the warming observed over the past several decades would not have occurred but for the dominant influence of human activities. The verity of Senator Gore’s words 30 years ago, that “There is no longer any significant difference of opinion within the scientific community about the fact that the greenhouse effect is real and already occurring,” is today supported by an even more robust epistemological buttress and even greater consensus of scientific opinion.

Mooney notes later in his piece, “the fundamental understanding of the greenhouse effect, and that carbon dioxide is a greenhouse gas because of its particular properties, dates back to the 19th century … No wonder, then, that there was so much that scientists could say about it in 1986.” It was beyond the scope — and probably the word limit — of his piece to flesh out that history more fully. But indeed, even in 1986 global warming science was a mature area of scientific inquiry with respect to many of the fundamentals. The concept of changes to Earth’s temperature and climate caused by increasing GHG concentrations was not invented by the IPCC when it was established in 1988 by the World Meteorological Organization (WMO) and the United Nations Environment Program (UNEP) under the auspices of the United Nations, two years after the 1986 Senate hearings. It was not a concept invented by Al Gore in his 2006 movie An Inconvenient Truth, twenty years after those hearings. Global warming wasn’t invented by the Chinese to … whatever [insert ridiculous conspiracy theory]. It’s not a hoax. The evidence for carbon dioxide’s role in influencing global climate in the past, present, and future does not issue solely from computer models or the IPCC (the IPCC reviews the state of the science every several years; it does not conduct original research), as this video brilliantly illustrates.

Nor was climate always such a politicized issue, as evidenced by Republicans over the years, from Sen. Chafee in the 1980s to both Presidents Bush to, once upon a time, Lindsey Graham, John McCain, Newt Gingrich, and Mitt Romney, all of whom believed global warming was a real phenomenon and a risk warranting serious attention.

Not that anyone’s belief makes climate change real any more than anyone’s disbelief makes it unreal. It is occurring, as science has demonstrated. Decades of it. Its verity derives not from opinion, but from evidence. As Neil deGrasse Tyson has said (emphasis added):

Once science has been established, once a scientific truth emerges from a consensus of experiments and observations, it is the way of the world. … [W]hen different experiments give you the same result, it is no longer subject to your opinion. That’s the good thing about science: It’s true whether or not you believe in it. That’s why it works.

It is important to know there is a robust and long history of scientific inquiry on which our understanding of climate rests. Not important necessarily to know all the complexities of the science or the multitude of disciplines involved in the study of Earth’s climate system. It’s not even reasonable to expect individual climate scientists to know it all, let alone laypersons. But important to know that scientific inquiry into understanding Earth’s climate system has been going on a very long time, and that the scientific community’s projections and warnings about anthropogenic climate change issue from a robust epistemological foundation, contrary to what climate science deniers assert. Important to know Earth’s climate system is among the more thoroughly investigated areas in all of modern science. To reiterate, upwards of 29,000 data series from thousands of peer-reviewed papers were reviewed by the IPCC in the course of drafting its latest Assessment Report detailing the physical science basis of what we know about climate change. Important to know multiple lines of evidence all converge on the same conclusion regarding its human causes. Climate deniers, on the other hand, writes Michael Shermer, have been unable to “show a consistent convergence of evidence toward a different theory that explains the data.” And given that climate researchers seek to understand a phenomenon of planetary scale, Earth’s climate system, and their inquiry entails dozens of scientific disciplines, it’s important to know climate science is in its scope among the grandest areas of scientific inquiry.

It is important to know this long, rich history of climate science exists because, as Senator Chafee observed 30 years ago, it reveals that we have a serious problem. Knowing that there is a long, rich history of science informing our understanding of the problem, even if one doesn’t know the fine details about the science that built our understanding over time, can inoculate one against believing the bullshit claims and dismissive assertions of climate deniers. That too few politicians, voters, and commentators seem to know just how rich climate science history is and how robust the epistemological basis of our understanding of the climate change problem is results in repeated utterances of rank stupidity — e.g., that global warming is a hoax or a recent Chinese invention to … whatever. Worse, this persistent ignorance for too long caused us to dither in taking serious, concerted actions to address the problem.

So while it may have been beyond the scope of Mooney’s piece to include greater detail on the scientific history that informed the Senate climate hearings in 1986 and that continues to inform our understanding today, following are some highlights of the decades-long history that forms the epistemological basis of what was known then, what we know now, and how we know it.

1820s

In 1986 the scientific endeavor to understand Earth’s climate system, particularly the Greenhouse Effect produced by the heat-trapping capacity of CO2 and other greenhouse gases, spanned 17 decades. Today it’s 20. Twenty! That’s right, the mountain of science informing what we know accumulated over the course of two centuries dating back to when James Monroe, the fifth U.S. president, occupied the White House.

That mountain’s foundational roots trace to 1820s post-Napoleon France when Jean Baptiste Joseph Fourier, who had conducted science and engineering projects for the late emperor, first reasoned that the atmosphere must be contributing to the maintenance of Earth’s temperature at a warmer level than should be expected given our planet’s distance from the Sun. In effect, and without so naming it, Fourier discovered the Greenhouse Effect.

1850s-1860s

A few decades later, in the 1850s and ’60s, Irish physicist John Tyndall investigated the radiative properties of gases and experimentally demonstrated that CO2 and water vapor absorb infrared radiation — i.e., they trap heat in the atmosphere. The simple laboratory demonstration in this video shows CO2 absorbing heat:

The atmospheric physics we know today concerning the radiative forcing potential of CO2 and other GHGs traces its roots to Tyndall’s original work. We know radiation from the Sun across the electromagnetic spectrum enters the atmosphere and warms the Earth. Earth radiates some of that warmth back to space as heat in the infrared spectrum. But CO2, water vapor, and other GHGs block some of that escaping infrared radiation and re-radiate it back to Earth, thereby keeping average temperatures at the surface habitably warm. The higher the concentration of GHGs, the more infrared radiation — heat — is blocked from escaping to space and the greater the energy imbalance in the climate system grows and the warmer the average surface temperature becomes.
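That radiative bookkeeping can be made concrete with a back-of-the-envelope energy balance (a sketch using standard textbook values for solar irradiance and albedo, not figures from this article): absent the greenhouse effect, Earth’s expected temperature comes out roughly 33°C colder than the surface temperature we actually observe.

```python
# Rough energy-balance sketch (standard textbook values, not from the article):
# the Stefan-Boltzmann law gives the temperature Earth "should" have given
# its distance from the Sun, with no greenhouse gases in the picture.
SOLAR_CONSTANT = 1361.0   # W/m^2, mean solar irradiance at Earth
ALBEDO = 0.30             # fraction of sunlight reflected straight back to space
SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W/(m^2 K^4)

# Absorbed sunlight averaged over the whole spherical surface (factor of 4).
absorbed = SOLAR_CONSTANT * (1.0 - ALBEDO) / 4.0

# Temperature at which Earth's outgoing infrared emission balances that input.
t_effective = (absorbed / SIGMA) ** 0.25   # ~255 K, about -18 C

t_observed = 288.0                           # ~15 C observed mean surface temp
greenhouse_boost = t_observed - t_effective  # ~33 K supplied by GHGs
print(round(t_effective, 1), round(greenhouse_boost, 1))
```

The ~33 K gap between the two numbers is the quantity Fourier was reasoning about and Tyndall’s gas measurements began to explain.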

Tyndall’s research gave credence to Fourier’s Greenhouse Effect idea. Tyndall suggested that changes in the concentrations of GHGs could result in sufficient changes to climate that might explain the advance and retreat of Northern Hemisphere ice sheets. Today we know the glacial cycles are the product of both changes in GHG concentrations and the metronomic, multi-millennial cyclic changes in Earth’s orbit, wobble, and axial tilt (collectively known as Milankovitch Cycles).

1890s-1930s

In the 1890s, Swedish scientist Svante Arrhenius made the first calculations of global warming from a doubling of atmospheric CO2 concentration. They were crude, but not far off from the upper end of the range projected based on what we understand today. In 1931, American physicist E.O. Hulburt made similar calculations and, factoring in the additional water vapor a warmer atmosphere would contain, arrived at a temperature increase of approximately 4°C for a doubling of atmospheric CO2 concentration.

Hulburt’s 4°C estimate fell within the range an ad hoc U.S. National Academy of Sciences (NAS) study group would calculate almost 50 years later in a scientific assessment report (the Charney report) on carbon dioxide and climate issued in 1979. The NAS group, led by M.I.T. physicist and meteorologist Jule Charney, calculated a 3°C ± 1.5°C temperature rise from a doubling of CO2 concentration. That’s consistent with today’s estimates of climate sensitivity. The 2013 IPCC Fifth Assessment Report (pdf) stated, “Equilibrium climate sensitivity is likely in the range 1.5°C to 4.5°C (high confidence), extremely unlikely less than 1°C (high confidence), and very unlikely greater than 6°C (medium confidence).”
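The notion of “climate sensitivity” running through all of these estimates can be sketched with the standard logarithmic forcing approximation (an assumption of this illustration, not a formula given in the article): equilibrium warming scales with the logarithm of the CO2 ratio, so the sensitivity number is simply the warming per doubling.

```python
import math

# Sketch using the standard logarithmic approximation for CO2 forcing
# (an assumption of this illustration): equilibrium warming scales with
# the log of the CO2 concentration ratio.
def equilibrium_warming(c_new_ppm, c_old_ppm, sensitivity_per_doubling=3.0):
    """Warming in degrees C for a CO2 change, given Charney's central 3 C."""
    return sensitivity_per_doubling * math.log2(c_new_ppm / c_old_ppm)

# A full doubling reproduces the sensitivity itself.
print(equilibrium_warming(560, 280))  # 3.0

# Charney's 3 C +/- 1.5 C range for a doubling of CO2:
for s in (1.5, 3.0, 4.5):
    print(s, round(equilibrium_warming(560, 280, s), 1))
```

Because the relationship is logarithmic, each successive increment of CO2 adds somewhat less warming than the last — which is why sensitivity is conventionally quoted per doubling.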

Later in the 1930s English engineer Guy Stewart Callendar undertook to collect and analyze weather data that had been compiled by meteorologists around the world. Based on temperature measurements in those data, he observed a rise in average global temperature since the late 19th century of close to half a degree Celsius (0.5°C, or 0.9°F). He suggested it was caused by industrial CO2 emissions. In other words, Callendar intimated that human-caused global warming was already occurring and observable.

1950s-1960s

In the 1950s, concern was growing about the observable, measurable rise in industrial emissions. Remember, this was the post-WWII era before the widespread adoption of clean air legislation. Photos from that time, such as these of 1940s Pittsburgh, depict how bad air pollution was in places.

In 1955, Austrian-born American physical chemist and nuclear physicist Hans Suess identified the isotopic signature of industrial-based CO2 emissions. Suess’s carbon fingerprinting allowed us to recognize that the excess CO2 accumulating in the atmosphere was not coming from present natural sources, but from ancient carbon released through burning fossil hydrocarbons.

Canadian-born physicist and Johns Hopkins University professor Gilbert Plass was a pioneering researcher on the effects of solar and infrared radiation on atmospheric temperature and climate. He published a series of papers in the mid-1950s, perhaps the best known of which is The Carbon Dioxide Theory of Climate Change (1956), examining radiative fluxes and the role of CO2. Although Plass’s focus was narrowly on atmospheric CO2 and not a holistic analysis encompassing other feedbacks, he calculated that average surface temperature would warm 3.6°C for a doubling of CO2 concentration, and that CO2 levels would increase 30% over the 20th century with a warming of 1°C over the same period. Despite some errors in his calculations (that went both ways and effectively canceled out), in the main Plass was correct; his predictions are consistent with the IPCC’s latest climate sensitivity numbers. (For a critique of Plass’s climate sensitivity analysis and predictions, see, e.g., here).

By the late 1950s, what science was revealing about the potential global impacts on Earth’s climate of the byproducts of industrialization and modernization was becoming known beyond scientific circles. Take this clip from Frank Capra’s 1958 The Unchained Goddess, for example.

Though the film was released almost 60 years ago, its implicit warning is as “eerily familiar” today as the explicit warnings Chris Mooney highlights from the 1986 Senate climate hearings.

The growing concern about rising industrial emissions led young Scripps Institution of Oceanography scientist Charles David Keeling in 1958 to begin collecting atmospheric CO2 samples at Hawaii’s Mauna Loa Observatory. He produced data in 1960 and 1961 showing the annual variation in CO2 levels driven by the seasonal cycle of photosynthesis and respiration of plants in the Northern Hemisphere. By the mid-1960s his measurements showed atmospheric CO2 levels steadily rising over time.

Charles Keeling would assiduously continue his CO2 monitoring work at Mauna Loa over the following decades of his career, work that earned him the National Medal of Science, the highest U.S. award for lifetime scientific achievement. President George W. Bush presented Keeling with the award at a White House ceremony in 2002. Charles Keeling died in 2005, but the measuring project at Mauna Loa continues to this day under the supervision of his son Ralph, a Scripps professor like his father.

The Keelings’ observations at Mauna Loa give us the “Keeling Curve,” the longest continuous record of directly measured atmospheric CO2 levels and one of the most important and well-known graphs in climate science.

The Keeling Curve shows CO2 levels in the mid-level troposphere rising steadily from 315 parts per million in 1958 to over 400 ppm today. Today’s level, 405.37 ppm as of July 2016, is 44% higher than the 280 ppm that existed prior to the start of the Industrial Revolution. We are nearly halfway to a doubling of atmospheric CO2 above the pre-Industrial level.
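A quick check of that arithmetic, using the CO2 values quoted in the text:

```python
# Verifying the article's figures (CO2 values as quoted in the text).
pre_industrial = 280.0   # ppm, before the Industrial Revolution
july_2016 = 405.37       # ppm, Mauna Loa, July 2016
doubling = 2 * pre_industrial  # 560 ppm

pct_increase = (july_2016 - pre_industrial) / pre_industrial * 100
print(round(pct_increase, 1))  # ~44.8, the "44% higher" figure

# Fraction of the way from 280 ppm up to a doubling at 560 ppm:
fraction = (july_2016 - pre_industrial) / (doubling - pre_industrial)
print(round(fraction, 2))  # ~0.45, i.e. "nearly halfway"
```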

The CO2 concentration continues to rise at an alarming rate, as does global average surface temperature. The 3.05 ppm increase in 2015 was the largest year-to-year jump since the Mauna Loa measurements began, even as global energy sector carbon emissions have flattened since 2013 at just over 32 billion metric tons according to International Energy Agency data. I wrote about that here.

The President Acknowledges the Carbon Emissions Problem

On February 8, 1965, twenty-one years before the 1986 Senate hearings on global warming convened by Senator Chafee, President Lyndon Johnson delivered a Special Message to Congress in which he warned of the dangers of carbon dioxide pollution from fossil fuel burning:

Air pollution is no longer confined to isolated places. This generation has altered the composition of the atmosphere on a global scale through … a steady increase in carbon dioxide from the burning of fossil fuels.

It was the first time a U.S. president publicly acknowledged global warming and the problem of continuing to pollute the atmosphere with carbon emissions.

President Johnson received advice on the issue from some of the most prominent climate scientists of the day. They included Roger Revelle, Wallace Broecker, the aforementioned Charles Keeling, Harmon Craig, and Joseph Smagorinsky. In a chapter on atmospheric carbon dioxide in a President’s Science Advisory Committee (PSAC) report issued that November, Revelle and his colleagues presciently wrote:

Through his worldwide industrial civilization, Man is unwittingly conducting a vast geophysical experiment. Within a few generations he is burning the fossil fuels that slowly accumulated in the earth over the past 500 million years. … By the year 2000 the increase in atmospheric CO2 … may be sufficient to produce measurable and perhaps marked changes in climate, and will almost certainly cause significant changes in the temperature and other properties of the stratosphere. … The climatic changes that may be produced by the increased CO2 content could be deleterious from the point of view of human beings.

They concluded “with fair assurance that at the present time, fossil fuels are the only source of CO2 being added to the ocean-atmosphere-biosphere system.” Revelle et al. estimated the concentration of atmospheric CO2 could rise between 14% and 30% by the year 2000. They were correct: in 2000 it had risen by 15.5% and is more than 25% higher today than in 1965.
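How well did the PSAC estimate hold up? A rough check, using approximate Mauna Loa annual means (the 1965 and 2000 values below are my approximations, not figures from the article):

```python
# Sketch checking the PSAC estimate against observations. The 1965 and 2000
# Mauna Loa annual means below are approximate values (assumptions of this
# illustration); the 2016 figure is as quoted in the text.
co2_1965 = 320.0    # ppm, ~1965 annual mean (approximate)
co2_2000 = 369.5    # ppm, ~2000 annual mean (approximate)
co2_2016 = 405.37   # ppm, July 2016

rise_by_2000 = (co2_2000 - co2_1965) / co2_1965 * 100
print(round(rise_by_2000, 1))  # ~15.5%, inside Revelle et al.'s 14-30% window

rise_by_2016 = (co2_2016 - co2_1965) / co2_1965 * 100
print(round(rise_by_2016, 1))  # ~26.7%, "more than 25% higher today"
```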

Wallace Broecker, co-author of the PSAC report, would author a paper in 1975 entitled Climate Change: Are We on the Brink of a Pronounced Global Warming? that is credited with coining the term “global warming.” Broecker employed a computer model to predict future global temperature changes. Although the model he used was crude and simple, its prediction has fairly closely matched the global surface temperature changes observed since 1975. (See, e.g., here).

1970s and Global … Cooling? Yes, That Was A Thing

By the end of the 1960s and into the early 1970s, scientists studying global temperature trends observed a slight decline in global average temperature since 1940, mainly in the Northern Hemisphere. It was also observed in the early 1970s that atmospheric concentrations of soot and aerosols, such as sulfur dioxide (SO2), had been increasing rapidly over that same period. The cause of the rapid rise in soot and aerosol concentrations was the acceleration of industrialization and modernization that began during World War II. Some of those aerosols reflect incoming solar radiation, and the slight temperature decline observed between 1940 and 1970 was attributed in part to their rise.

In the later years of the 1960s, the first computer models were developed to simulate Earth’s climate. Scientists began employing those models to help understand the relative roles of aerosol cooling and GHG warming. There was uncertainty about the net impacts on temperature of those opposing “forcings”. A handful of scientists concluded the continued build-up of aerosols might have a counteractive effect to rising GHG levels and might even produce global cooling. It was a small handful indeed: only 7 papers in the peer-reviewed literature between 1965 and 1979 predicted cooling. (See, e.g., here).

Those predictions were based on assumptions about aerosol concentrations continuing to rise. But in the late 1960s and early ’70s, countries, including the United States, enacted clean air policies to address the problem of worsening air pollution. As a result, aerosol concentrations began to decline near the end of the 1970s, thus undercutting one of the key assumptions on which the small number of cooling predictions had been predicated.

Among the early pioneers in utilizing computer models to simulate Earth’s climate was the late Stephen Schneider. Schneider was one of the scientists who predicted cooling. In a 1971 paper he calculated that aerosols would cool more than CO2 would warm and that cooling was more likely. However, as Schneider himself recounted in his 2009 book, Science as a Contact Sport, and in a 2010 lecture at Stanford University, he discovered soon after co-authoring that 1971 paper why he was wrong and published “before anyone else did what was wrong with my own calculations — something I’m more proud of than anything I’ve ever done. That’s what we do in science.”

Schneider was just beginning his career in 1971; his work over the following decades would make him a venerated figure in the climate research and modeling community. He became not just one of the world’s foremost experts on Earth system science and a contributor to the Fourth Assessment Report, for which the IPCC shared the 2007 Nobel Peace Prize with Al Gore, but also an exemplar of science communication, especially on climate. Climate One bestows an annual award for outstanding climate science communication that bears Schneider’s name. And the American Geophysical Union features the Stephen Schneider Memorial Lecture at its annual meeting.

The “Global Cooling” Zombie

The idea that “global cooling” might be occurring and ushering the world toward a new ice age was obviously an intriguing one. Not surprisingly, it got picked up by media at the time, notably in a 1974 Time magazine piece and a 1975 Newsweek article. Not to knock the reporting itself, but those stories are a source of some irritation today. Although the cooling predictions were made by only a small handful of scientists based on assumptions we learned decades ago were incorrect; although the larger scientific consensus in the 1970s was on GHG-induced warming (see, e.g., here and here) and is overwhelmingly so today; although the warming trend resumed in the early 1970s following the slight stall between 1940 and 1970, with each decade since the 1950s warmer than its predecessor; and although the “global cooling” and impending “Ice Age” ideas essentially died almost 40 years ago, when in 1978 James Hansen and colleagues, with the aid of computer models, sorted out that GHG warming was the dominant forcing over aerosol cooling (see, e.g., here); still, like zombies that just won’t die and stay dead, those media stories from the mid-1970s continue to be resurrected today by climate-denying Dr. Frankensteins to suggest that what the climate science community says about climate is unreliable and that global warming therefore isn’t an issue to be taken seriously. Armed with the knowledge of the history of the emergent science of that time, however, when you hear those “global cooling” shibboleths trotted out, you can accord them, and those who utter them, their proper deference: none.

The volume of climate research grew through the 1970s, and scientific opinion was converging on greenhouse warming as the largest climate risk in coming decades. Advances in atmospheric science and computing power enabled scientists and modelers to develop better climate simulations. It was with the aid of such early models — though they were rudimentary compared to today’s — that, as noted above, Hansen and colleagues in 1978 resolved the question of the relative forcing potentials of GHGs and aerosols, in effect putting to rest the global cooling worry. Hansen input climate data from the 1963 eruption of Bali’s Mount Agung to determine whether his model would accurately recreate the observed effects — a technique known as hindcasting. Hansen’s study concluded, “the magnitude, sign, and time delay of the temperature changes … for both the stratosphere and troposphere are in excellent agreement with those of the temperature changes observed after … the eruption of Mount Agung.” In other words, the modeled simulation closely matched observed reality.

Sticking with Hansen to skip ahead to the 1980s for a moment: Whereas the 1978 study was a hindcast, Hansen co-authored a paper in 1981 that forecast global warming over the coming century. Hansen et al. wrote: “It is shown that the anthropogenic carbon dioxide warming should emerge from the noise level of natural climate variability by the end of the century, and there is a high probability of warming in the 1980s.” Based on the model calculations, the authors projected ~2.5°C of warming over the following century, which would exceed the temperature of the “previous interglacial period 125,000 years ago, and would approach the warmth of the Mesozoic, the age of dinosaurs.” They predicted warming at high latitudes to be much greater than the global mean; disintegration of the West Antarctic ice sheet; and opening of the Northwest Passage.

Warming in the northern latitudes has in fact been greater than the global average (see video below); the West Antarctic ice sheet recently was confirmed to be on an irreversible course to collapse; and the Northwest Passage has opened multiple times. So, as was the case with Hansen’s 1978 model and Broecker’s 1975 model, once again the predictions based on modeled simulations have been borne out by real-world observations.

Today, as Goddard Institute for Space Studies Director Dr. Gavin Schmidt explains, the capability of computer models to simulate the geospatial and temporal physics of Earth’s climate grows by an order of magnitude each decade. One order of magnitude in the spatial dimension represents “10,000 times more calculations.” The models have become quite “skillful” at simulating long-term global surface temperature changes. (See, e.g., here).

By the late 1970s, concern about the world’s growing population and increasing per capita energy use heightened the scientific community’s attention on rising CO2 levels and their influence on climate. In 1977, Roger Revelle chaired a National Research Council panel which issued a report, Energy and Climate, that called for an intensified program of research on CO2. Reflecting the growing consensus view on CO2 and global warming, in 1978 Robert White, the first administrator of the National Oceanic and Atmospheric Administration (NOAA) and later president of the National Academy of Engineering, said:

We now understand that industrial wastes, such as carbon dioxide released during the burning of fossil fuels, can have consequences for climate that pose a considerable threat to future society.

Three reports on the CO2/climate issue were issued in 1979. Concern about the issue’s implications for national and international policy planning led the Office of Science and Technology Policy (OSTP) to request the National Academy of Sciences to conduct an independent critical assessment of the state of the science and the degree of certainty that could be attached to the conclusions of the many CO2/climate investigations and modeling efforts. The product was the aforementioned Charney report, Carbon Dioxide and Climate: A Scientific Assessment.

In the Charney report’s Foreword, Verner Suomi, Chairman of the Climate Research Board, stated:

We now have incontrovertible evidence that the atmosphere is indeed changing and that we ourselves contribute to that change. Atmospheric concentrations of carbon dioxide are steadily increasing, and these changes are linked with man’s use of fossil fuels and the exploitation of the land.

Based on their review of the available evidence, Charney et al. estimated the most probable average global temperature rise from a doubling of CO2 concentration would be 3°C ± 1.5°C.

Earlier in 1979, a lesser known report arrived at a similar estimate. A group of physicists from the JASON defense advisory panel led by Gordon MacDonald authored a report entitled The Long Term Impact of Atmospheric Carbon Dioxide on Climate. Based on models* the group constructed, they arrived at an average surface temperature increase of between 2°C and 3°C for a doubling of CO2 and warming at the poles of up to 10°C to 12°C. They predicted we might reach a doubling of CO2 by the mid-2030s. (*Climate science specialists reportedly were unconvinced by JASON’s “little physics project” and criticized its model as too simple. See, e.g. here.)

Nearly four decades later, the temperature estimates calculated by the Charney and JASON groups hold up. The 2013 IPCC Fifth Assessment Report (pdf) expressed with “high confidence” that “Equilibrium climate sensitivity is likely in the range 1.5°C to 4.5°C.”

Also in 1979, the first “World Climate Conference” (WCC-1) was held. Organized by the WMO as “a world conference of experts on climate and mankind,” the conference was attended by some 350 specialists from 53 countries and 24 international organizations and from a range of disciplines including agriculture, biology, ecology, economics, energy, environment, fisheries, medicine, sociology, and water resources. WCC-1 expressed concern that “continued expansion of man’s activities on Earth may cause significant extended regional and even global changes of climate.” It called on governments of the world “to foresee and prevent potential man-made changes in climate that might be adverse to the well-being of humanity,” and for “global cooperation to explore the possible future course of global climate and to take this new understanding into account in planning for the future development of human society.”

Even Exxon Knew

Because of intrepid investigative reporting by Pulitzer Prize-winning InsideClimate News, we now know that even oil giant ExxonMobil (then Exxon Corporation) knew as far back as the late 1970s that global warming caused by CO2 emissions from fossil fuel use was real, and understood the threat that continued fossil fuel burning represented. The company’s own scientists appear to have been at the cutting edge of climate research. Based on InsideClimate News’ reporting, it is fair to count Exxon’s climate researchers among the growing majority of scientists who shared the consensus view at the time on human-caused greenhouse warming.

In its blockbuster September 2015 article, InsideClimate News describes a presentation delivered in July 1977 to Exxon management by one of its senior scientists:

At a meeting in Exxon Corporation’s headquarters, a senior company scientist named James F. Black addressed an audience of powerful oilmen. Speaking without a text as he flipped through detailed slides, Black delivered a sobering message: carbon dioxide from the world’s use of fossil fuels would warm the planet and could eventually endanger humanity.

“In the first place, there is general scientific agreement that the most likely manner in which mankind is influencing the global climate is through carbon dioxide release from the burning of fossil fuels,” Black told Exxon’s Management Committee, according to a written version he recorded later.

[…]

A year later, Black, a top technical expert in Exxon’s Research & Engineering division, … warned Exxon scientists and managers that independent researchers estimated a doubling of the carbon dioxide (CO2) concentration in the atmosphere would increase average global temperatures by 2 to 3 degrees Celsius (4 to 5 degrees Fahrenheit), and as much as 10 degrees Celsius (18 degrees Fahrenheit) at the poles.

(Note the similarity of the numbers Black cites to the JASON panel’s numbers.)

Exxon for a time took seriously what Black had said. The company budgeted annual funds for internal research aimed at assessing the impacts of carbon emissions. Its scientists also collaborated with outside researchers and modelers (emphasis added):

Exxon also hired scientists and mathematicians to develop better climate models and publish research results in peer-reviewed journals. By 1982, the company’s own scientists, collaborating with outside researchers, created rigorous climate models — computer programs that simulate the workings of the climate to assess the impact of emissions on global temperatures. They confirmed an emerging scientific consensus that warming could be even worse than Black had warned five years earlier.

Exxon could have chosen to continue funding internal cutting-edge climate research and modeling. Its stable of scientists and mathematicians could have continued to make valuable contributions to the epistemological foundation undergirding our understanding of CO2 and climate. It could have chosen to be a leader in efforts to identify and develop alternatives to fossil fuels that could have begun, almost 40 years ago, to wean the world off its petroleum addiction. Imagine how different our energy systems and transportation technologies might be today if Exxon had done those things; if it had led.

Instead, like Anakin Skywalker turning away from the Jedi order to become Darth Vader, Exxon opted for the path of the Dark Side. It became the Private Empire:

Then, toward the end of the 1980s, Exxon curtailed its carbon dioxide research. In the decades that followed, Exxon worked instead at the forefront of climate denial. It put its muscle behind efforts to manufacture doubt about the reality of global warming its own scientists had once confirmed. It lobbied to block federal and international action to control greenhouse gas emissions. It helped to erect a vast edifice of misinformation that stands to this day.

In light of InsideClimate News’ revelations about what Exxon knew and the company’s subsequent years-long efforts to sow doubt about the verity of the science and lobby against policy action to address GHG emissions, several state attorneys general have launched investigations into Exxon’s conduct — conduct that may constitute fraud and violate securities laws for misleading shareholders and investors. The U.S. Department of Justice has directed the FBI to investigate whether Exxon is susceptible to legal action under the federal Racketeer Influenced and Corrupt Organizations Act. RICO provides civil and criminal penalties for members of organizations engaged in patterns of criminal activity and illegal conspiracy. (As a lawyer who once worked on a RICO case, I will be surprised if both civil and criminal cases aren’t brought and don’t ultimately succeed.)

1980–1986

Up until the 1980s, the issue of rising CO2 levels and their impacts on climate had largely remained confined to the arena of scientific inquiry. Around the start of the 1980s, it became increasingly salient in political/policy circles. The growing volume of research and the late 1970s reports by the NRC, NAS, and WCC-1 led to legislation, the Energy Security Act of 1980, calling on the NAS to assess the issue of rising CO2 levels and their impacts. The product of that request was Changing Climate: Report of the Carbon Dioxide Assessment Committee (1983), a massive 500-page report by the NAS that was the most comprehensive review of the state of the science on the effects of human-caused increases in the levels of atmospheric CO2 and other GHGs published to that date. In the Synthesis chapter, the report states:

When it is assumed that CO2 content of the atmosphere is doubled and statistical thermal equilibrium is achieved, all models predict global surface warming, [and] [n]one of the calculations … predicts negligible warming. Calculations with the three-dimensional, time-dependent models of global atmospheric circulation indicate global warming due to a doubling of CO2 from 300 ppm to 600 ppm to be in the range between about 1.5 and 4.5°C. … Simpler models that appear to contain the main physical factors give similar results. … Warming at equilibrium would be 2–3 times as great over the polar regions as over the tropics; warming would probably be significantly greater over the Arctic than over the Antarctic.

Those predictions have been largely consistent with measurements and observations over the period from the early 1980s to the present. The 1.5°C to 4.5°C temperature change from a doubling of CO2 that the NAS predicted in 1983 is identical to the range the IPCC expressed in the 2013 Fifth Assessment Report (“Equilibrium climate sensitivity is likely in the range 1.5°C to 4.5°C (high confidence), extremely unlikely less than 1°C (high confidence), and very unlikely greater than 6°C (medium confidence).”)

In 1985 UNEP, WMO, and the International Council for Science (ICSU) jointly convened a conference held in Villach, Austria, on the “Assessment of the Role of Carbon Dioxide and of Other Greenhouse Gases in Climate Variations and Associated Impacts.” The report of the Villach conference concluded that, “as a result of the increasing greenhouse gases it is now believed that in the first half of the next century (21st century) a rise of global mean temperature could occur which is greater than any in man’s history.” Senator Gore referenced the Villach conference in his testimony at the 1986 Senate climate hearings.

The preceding overview highlights but some of the landmark and emergent developments in the 160-year history of scientific inquiry into understanding the role of CO2 and GHGs on global temperature and climate that informed the Senate climate hearings in 1986. (For a more detailed summary of the history of climate science up to the early 1980s, see Annex I of the Changing Climate report). Even this non-exhaustive overview illustrates the solid epistemological footing on which the scientific community stood at the time in describing what was known about the CO2/climate issue and in warning about the potential risks to human and natural systems of disrupting the climate system through the continued emission of GHGs and other human activities.

Yet, these highlights are only the tip of the iceberg. Over the course of sixteen decades, hundreds of scientists, including some of the 19th and 20th centuries’ foremost, investigated various aspects of Earth’s climate system, producing a veritable library’s worth of peer-reviewed scientific literature. The 9,200 peer-reviewed studies cited in the most recent IPCC scientific assessment illustrate how expansive that library has become.

On an editorial note, one of the details that stands out in this historical overview is the remarkable consistency of scientists’ various calculations and projections over time. From Hulburt’s temperature calculations in the 1930s to Plass’s temperature calculations in the 1950s to Hansen’s temperature calculations in the late 1970s and 1980s to the latest 2013 IPCC Fifth Assessment Report’s sensitivity estimate, the figures have consistently fallen within a range between about 2°C and 5°C for a doubling of atmospheric CO2. Even as atmospheric and Earth system science advanced and various uncertainties and potential feedbacks were identified, the fact that the calculations remained consistent indicates that, from the earliest days of inquiry into the impact of CO2 on climate, scientists largely understood the basic mechanisms. Further, the fact that observations — e.g., average surface temperature change, greater high-latitude warming, increased frequency and intensity of extreme weather events, etc. — have consistently and closely matched model predictions demonstrates the “skillfulness” of the models.

The History Post-1986

Since the 1986 Senate climate hearings, climate change science and modeling have continued to advance, improving and refining our understanding of the climate system and how and to what extent human activities are disrupting it. The research enterprise to understand not just the mechanisms governing Earth’s climate system but also the impacts climate change will have on natural and human systems, as well as how we might mitigate the causes of those impacts and adapt to them, burgeoned to include scientists and experts outside the realm of the physical sciences. Climate change is a multidimensional and multidisciplinary problem; in the 1980s it became widely recognized and treated as such.

The information from across scores of disciplines that has emerged over the last 30 years is too voluminous to document in a blog post whose intent is to give an historical overview of the epistemological basis on which our understanding of human-caused global warming and climate change rests. It is also unnecessary: The history since the 1980s of our increasingly multidisciplinary analysis and understanding of anthropogenic climate change and its impacts is encapsulated in the IPCC Assessment Reports. The reports are presented in parts, with sections focused on the physical science of climate change; the environmental, economic, and social impacts of climate change; and response strategies to mitigate and adapt to climate change.

This piece will continue to focus on the science. I will limit the remainder of the piece to James Hansen’s 1988 congressional testimony, the IPCC reports, and Mann et al.’s “hockey stick” graph.

Hansen’s 1988 Testimony to Congress

In 1988, two years after the 1986 Senate climate hearings, James Hansen, who since 1981 had been Director of NASA’s Goddard Institute for Space Studies (GISS), gave his landmark testimony before the Senate Energy and Natural Resources Committee. He testified: “Global warming has reached a level such that we can ascribe with a high degree of confidence a cause and effect relationship between the greenhouse effect and observed warming. It is already happening now.”

By 1988 Hansen had long since established himself as one of the foremost experts in climate science and modeling. His testimony made news headlines because it was the first time a climate scientist publicly proclaimed that the observed global warming trend was, with 99% certainty, due to man’s activities and not natural variability. Hansen’s stature as the Director of NASA-GISS and a recognized leading expert in climate science gave his statement considerable heft. What scientists had been predicting for many years was, according to one of the world’s leading climate experts, now a reality: Humans, through burning fossil fuels and other activities, had nudged Earth’s climate outside its natural bounds.

The IPCC Reports

By the late 1980s, Hansen’s work and that of the scores of his contemporary and predecessor scientists had raised the salience of human-caused climate change beyond the sphere of the physical sciences. Recognizing that climate change would have a myriad of long-term impacts on human and natural systems, scientists and experts from the social, political, decision-analytical, and other disciplines expanded the scope of inquiry into the many dimensions of the issue. Policymakers, too, began increasingly to reckon with the policy and governance implications of global warming and its impacts.

The rising worldwide recognition of the global warming problem led in 1988 to the establishment by UNEP and WMO of the Intergovernmental Panel on Climate Change. The IPCC was given the mandate to assess the state of existing knowledge about the climate system and climate change; the environmental, economic, and social impacts of climate change; and the possible response strategies. In fulfilling that mandate over the nearly 30 years since it was established, the IPCC has issued five Assessment Reports. For its Fourth Assessment Report the IPCC shared the 2007 Nobel Peace Prize with Al Gore “for their efforts to build up and disseminate greater knowledge about man-made climate change, and to lay the foundations for the measures that are needed to counteract such change.”

The IPCC Assessment Reports are the most thorough, comprehensive assessments of existing climate change research.

The First Assessment Report was issued in 1990. One hundred seventy-five scientists from 25 countries contributed to the preparation of the main Scientific Assessment. Another 200 scientists peer-reviewed the draft report.

The Executive Summary of the Policymakers Summary of the Working Group I Scientific Assessment Report states:

· We are certain of the following: there is a natural greenhouse effect, … emissions resulting from human activities are substantially increasing the atmospheric concentrations of the greenhouse gases carbon dioxide, methane, chlorofluorocarbons (CFCs) and nitrous oxide. These increases will enhance the greenhouse effect, resulting on average in additional warming of the Earth’s surface. The main greenhouse gas, water vapour, will increase in response to global warming and further enhance it.

· We calculate with confidence that: … Carbon dioxide has been responsible for over half the enhanced greenhouse effect in the past, and is likely to remain so in the future. … Continued emissions of these gases at present rates would commit us to increased concentrations for centuries ahead. … [T]he long-lived gases would require immediate reductions in emissions from human activities of over 60% to stabilize their concentrations at today’s levels.

· Based on current model results, we predict: under the Business-As-Usual emissions [scenario], a rate of increase of global mean temperature during the [21st] century of about 0.3°C per decade (with an uncertainty range of 0.2°C to 0.5°C per decade), this is greater than that seen over the past 10,000 years. This will result in a likely increase in global mean temperature of about 1°C above the present value by 2025 and 3°C before the end of the [21st] century.

· There are many uncertainties in our predictions particularly with regard to the timing, magnitude and regional patterns of climate change, due to our incomplete understanding of: sources and sinks of [GHGs], clouds, oceans, [and] polar ice sheets.

· Our judgment is that: global mean surface temperature has increased by 0.3°C to 0.6°C over the last 100 years; … The size of this warming is broadly consistent with predictions of climate models, but it is also of the same magnitude as natural climate variability. Thus, the observed increase could be largely due to this natural variability; alternatively this variability and other human factors could have offset a still larger human-induced greenhouse warming.
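The Business-As-Usual numbers quoted above are internally consistent: a central rate of 0.3°C per decade, counted from the report’s 1990 vantage point, reproduces both headline figures. Here is a back-of-the-envelope check (the strictly linear extrapolation is my simplification for illustration; the FAR’s own projections were not exactly linear):

```python
# Back-of-the-envelope check of the IPCC First Assessment Report's
# Business-As-Usual projection: ~0.3°C of warming per decade from 1990.
RATE_PER_DECADE = 0.3  # °C per decade (central estimate; quoted range 0.2-0.5)

def projected_warming(year, base_year=1990, rate=RATE_PER_DECADE):
    """Linear warming above the base year, in °C."""
    return rate * (year - base_year) / 10.0

print(projected_warming(2025))  # ~1.05°C, matching "about 1°C above the present value by 2025"
print(projected_warming(2100))  # ~3.3°C, roughly the "3°C before the end of the [21st] century"
```

Swapping in the low and high ends of the quoted 0.2°C–0.5°C range gives the corresponding uncertainty band around each figure.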

The IPCC’s statements attributing climate change to human activities grew progressively stronger with each successive report as on-going research by thousands of scientists from around the globe advanced our knowledge and understanding. Table 1 illustrates the increased confidence expressed in each iteration of the IPCC reports.

Table 1. How IPCC’s Conclusions Have Strengthened Over Time

(Hat tip to: http://www.c2es.org/science-impacts/ipcc-summaries/growing-certainty)

The Fifth Assessment Report (AR5) was released in September 2013. The full first section of the report, Climate Change 2013: The Physical Science Basis, is over 2,000 pages and cites more than 9,200 scientific studies. Its drafting involved 259 Lead Authors and Review Editors from 39 countries. An additional 600 experts from 32 countries were invited by the Lead Authors to provide specific knowledge or expertise in a given area as Contributing Authors.

AR5 contains the strongest language to date, stating it is “extremely likely” [a greater than 95% probability] that human activities are “the dominant cause of the observed warming” since the 1950s. The Third Assessment (2001) made a similar statement at the “likely” level [a greater than 66% probability]. The Fourth Assessment Report (AR4) (2007) found that “most of the observed increase in global average temperatures since the mid-20th century is very likely [90 percent confidence] due to the observed increase in anthropogenic greenhouse gas concentrations.”

There is a subtle but important difference between AR4’s and AR5’s language. Compare the above-quoted language from AR4 to the following statement from AR5 (emphasis added):

It is extremely likely [95 percent confidence] more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic increase in [GHG] concentrations and other anthropogenic forcings.

Whereas the 2007 AR4 statement focused only on anthropogenic GHG emissions, the 2013 AR5 statement references all human influences on climate. AR5 includes the cooling influence of anthropogenic aerosol emissions. Aerosols produce a cooling effect that offsets roughly 1/3 of the warming produced by GHGs. Even factoring in aerosols’ cooling influence, AR5 concludes that human drivers are the main cause of the warming observed over the past 60-plus years.

But AR5 goes further in explaining that the cause of recent global warming is anthropogenic GHGs and not external or internal natural influences such as changes in solar activity (external) or ocean cycles (internal).

The best estimate of the human-induced contribution to warming is similar to the observed warming over this period … The observed warming since 1951 can be attributed to the different natural and anthropogenic drivers and their contributions can now be quantified. Greenhouse gases contributed a global mean surface warming likely to be in the range of 0.5°C to 1.3°C over the period 1951−2010, with the contributions from other anthropogenic forcings, including the cooling effect of aerosols, likely to be in the range of −0.6°C to 0.1°C. … The contribution from natural forcings is likely to be in the range of −0.1°C to 0.1°C, and from internal variability is likely to be in the range of −0.1°C to 0.1°C.

As you can see, the contribution from natural forcings and internal variability is effectively zero. (Hat tip to this piece by Dana Nuccitelli in The Guardian)
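Taking the midpoint of each range quoted above makes the attribution bookkeeping concrete. A rough illustration only (the quoted ranges are “likely” intervals, and treating their midpoints as point estimates is my simplification):

```python
# Midpoints of the AR5 attribution ranges for 1951-2010 warming (°C),
# as quoted in the passage above. Summing them recovers roughly the
# observed warming over the period (~0.6-0.7°C).
contributions = {
    "greenhouse gases":                     (0.5, 1.3),
    "other anthropogenic (incl. aerosols)": (-0.6, 0.1),
    "natural forcings":                     (-0.1, 0.1),
    "internal variability":                 (-0.1, 0.1),
}

def midpoint(lo_hi):
    lo, hi = lo_hi
    return (lo + hi) / 2.0

total = sum(midpoint(r) for r in contributions.values())
print(round(total, 2))  # ~0.65°C, nearly all of it anthropogenic
```

Note that the natural-forcing and internal-variability midpoints are exactly zero, which is the point of the passage: the quantified natural contributions are effectively nil.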

The fact that each successive IPCC Assessment Report stated more strongly the conclusion that human activities are causing the observed global warming is one of the key take-aways of the IPCC scientific assessments. Also important to understand is that the Assessment Reports are consensus statements. First, all participating scientists in the Working Group must agree on the language and the level of confidence that can be ascribed to the statements. This makes the statements inherently conservative, as the statements reflect what is acceptable to the most cautious scientist(s). Second, the language must be agreed to by the delegates of the governmental parties to the United Nations Framework Convention on Climate Change (UNFCCC). In light of this conservative bent in the IPCC’s reports, one should appreciate the gravity of so large a group of scientists and governmental delegates agreeing to a statement at the 95% confidence level: it is about as close as science gets to saying “Yeah, we’re sure about this.”

The Hockey Stick

(Mann 1999)

The final scientific development this piece highlights is the so-called “hockey stick” graph. In the late 1990s, climate researchers Michael Mann, Raymond Bradley, and Malcolm Hughes reconstructed Northern Hemisphere temperatures over the past thousand years using tree rings, ice cores, corals, boreholes, lake-bottom sediments, and other records that act as proxies for atmospheric surface temperature. As the graph above illustrates, their paleoclimate reconstruction shows temperatures declining slightly but steadily over the past millennium up to the beginning of the 20th century. Around the turn of the 20th century, temperatures take a dramatic turn upward. The graph calls to mind a hockey stick’s long handle and sharply angled blade; hence the “hockey stick” label.

Mann et al.’s paleoclimate reconstruction shows the Northern Hemisphere warming since the beginning of the 20th century is unprecedented in the past thousand years. It suggests that but for the interruption of the long-term declining temperature trend by modern industrial carbon emissions, we would be on a long, slow slide over the next thousand to few thousand years into the next glacial period. Our understanding of Milankovitch cycles and historic climate ebbs and flows also suggests we should be nearing the inception of the next ice age. (See e.g., here). But human activities, as reflected in the blade of the hockey stick graph, almost certainly have postponed that inception. (See e.g., here). In fact, recent research suggests the abnormal warming brought on by humans may cause Earth to skip an entire glacial cycle, delaying the next ice age for 100,000 years or more.

The hockey stick graph’s depiction of a millennium-long temperature decline followed by the dramatic 20th century upturn is about as impactful as a visual representation of physical reality gets. Not surprisingly, the hockey stick graph, like the Keeling Curve, became and remains one of the more significant and well-known graphs in climate science. It featured prominently in the IPCC’s 2001 scientific assessment (AR3). AR3 stated:

It is likely that the rate and duration of the warming of the 20th century is larger than any other time during the last 1,000 years. The 1990s are likely to have been the warmest decade of the millennium in the Northern Hemisphere, and 1998 is likely to have been the warmest year.

Because the abrupt temperature uptick the hockey stick depicts is coincident with fossil hydrocarbon-based industrialization, it should also not be surprising that the paleoclimate reconstruction became a source of controversy. The controversy arose not within scientific circles but within political ones, as politicians, mostly if not entirely from the American political right, attacked the paleoclimate reconstruction and its original authors. Various business-funded groups also attacked the study and its authors. But as interesting and, frankly, maddening, as the story of that controversy is, it isn’t germane here. If you are interested in the hockey stick controversy and the years-long imbroglio in the center of which Professor Mann found himself, read his book on the matter, The Hockey Stick and the Climate Wars.

The significance of Mann et al.’s original paleoclimate research and its prominence in climate science prompted further investigation and review. In an exhaustive, independent analysis published in 2006, the National Academy of Sciences affirmed Mann et al.’s findings. The NAS stated:

The basic conclusion of Mann et al. (1998, 1999) was that the late 20th century warmth in the Northern Hemisphere was unprecedented during at least the last 1,000 years. This conclusion has subsequently been supported by an array of evidence that includes both additional large-scale surface temperature reconstructions and pronounced changes in a variety of local proxy indicators, such as melting on ice caps and the retreat of glaciers around the world.

Indeed, as the graph below illustrates, more than a dozen subsequent, independent studies affirmed and strengthened Mann et al.’s key finding. The NAS report further stated: “Surface temperature reconstructions for periods prior to the industrial era are only one of multiple lines of evidence supporting the conclusion that climatic warming is occurring in response to human activities, and they are not the primary evidence.” (The “multiple lines of evidence” language harks back to the discussion of consilience earlier in this piece.)

(Mann 2008)

In 2013, the PAGES 2k project published the findings of its undertaking to reconstruct global average temperature over the past 2,000 years using proxy indicators. It, too, confirmed the original hockey stick. (See e.g., here). In nearly all of PAGES 2k’s regional temperature reconstructions, the researchers found a long-term cooling trend that ended at the turn of the 20th century and was followed by abrupt warming over the 20th century.

The affirmation of Mann et al.’s original work by subsequent independent research exemplifies how and why science works.

Conclusion

This overview of the 20-decade history of scientific inquiry into understanding Earth’s climate and how humans are changing it reveals the iterative process of science. It illustrates how focused study over many years builds an epistemological edifice of knowledge and understanding.

The science that informed the Senate hearings 30 years ago holds up today. The science that informed President Johnson’s message to Congress 50 years ago holds up well today. Even the earlier observations and predictions offered by Callendar, Hulburt, and Arrhenius hold up quite well today. Climate science is recognized as a consilience science because multiple lines of evidence supported by thousands of data series gathered over many decades point to the same conclusion, while no alternative theory explains the totality of the data. A great many climate fundamentals are understood well enough to give rise to the scientific community’s consensus that global warming is occurring, human activities are causing it, and it poses serious global risks to human and natural systems. The consensus view is significant because it arises from the massive body of evidence that has accumulated over two centuries.

Today, it is indisputable that global warming is occurring. It is a measurable phenomenon, and we have measured it. Each successive decade since the 1950s has been warmer than the prior one. Warming global temperatures are disrupting Earth’s climate. (For those unclear about the terms “global warming” and “climate change,” climate change is a function of global warming). The impacts of climate disruption are increasingly evident and growing in severity. We see temperature records shattered. Fourteen of the 15 warmest years on record have occurred in the 21st century. 2014 was the hottest year in the instrumental record, and probably of at least the past thousand years, until 2015 exceeded it. 2016 is on pace to beat 2015 by a significant margin. The energy building up in the atmosphere and oceans is driving more powerful weather events. The warmer atmosphere — currently about 1°C warmer than the mid-20th century average — holds more water vapor, which results in more severe and anomalous precipitation and drought events. The science is telling us these trends will continue beyond the lives of even the youngest humans alive today, even if we could reduce carbon emissions to zero instantaneously. The science is telling us — e.g., here — that what humans have done in the geologically short span of 200 years will alter the course of Earth’s geologic future for 100,000 years or more.

It is true that aspects of the climate system are not well-understood and remain the subject of scientific investigation. But the fact that unknowns exist — for example, exactly when certain tipping points will be reached and what the magnitude of their impacts, synergies, and feedbacks on the climate will be — does not cause the epistemological foundation of our understanding of the overall climate picture to crumble. In Stephen Schneider’s Stanford lecture referenced earlier, Schneider mentions a “preponderance needle.” Schneider explains that the overwhelming preponderance of scientific evidence points to human activities as the cause of global warming, and that no one uncertainty or weak data point, or even a handful of uncertainties or weak data points, is sufficient to move the preponderance needle to point to a different conclusion.

When someone rejects man-made climate change, or refers to it as a hoax or a “religion,” or doubts the scientific community’s consensus, or questions the integrity of climate scientists, he is revealing that he hasn’t made a good-faith attempt to learn even the basics of the science and is unaware of the long history of the scientific endeavor to understand Earth’s climate and climate change. We know from the work of researchers like Yale’s Dan Kahan and others that what lay people say in response to polling questions about climate change reveals more about their ideology and political identity than it does about their understanding of climate science specifically or their level of science literacy generally. It is an on-going communications challenge to overcome people’s biases and ideological predispositions so that we can effectively raise their understanding of climate change and the myriad risks it poses.

While the overwhelming preponderance of scientific evidence has been telling us for decades that we have a serious problem, policy-making has unfortunately failed to keep pace with what science has been revealing about global warming risks. That policymakers, not just in the U.S. but around the world, failed for so many years — decades — to enact policies addressing the global warming problem in a serious way may fairly be regarded as the greatest risk management failure humans have ever committed. I intend in a forthcoming piece to explain the decision-analytical error that lies at the root of this risk management failure.

To conclude on a positive note, in December 2015 at the 21st Conference of the Parties (COP-21) to the UNFCCC held in Paris, France, the nations of the world, after more than two decades of trying, finally reached an accord to address climate change. The 195 countries gathered at COP-21 committed to “holding the increase in the global average temperature to well below 2°C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5°C above pre-industrial levels, recognizing that this would significantly reduce the risks and impacts of climate change.” It is true that the sum of the individual commitments countries made in the Paris Agreement is inadequate to achieve the Agreement’s goals. But the groundwork has been laid, and steps to decarbonize the global economy are already underway that may enable us to avoid some of the worst-case scenarios of global warming. There is cause for cautious hope. And we owe a debt of gratitude to the thousands of scientists who worked over the course of nearly two centuries building the epistemological edifice which informs our understanding of Earth’s climate system, climate change, and the risks human-caused global warming poses, so that we may reckon with the problem and endeavor to address it.

--

Eric Schupper

#ClimateHawk | Science Enthusiast | Lawyer | UNC, Harvard, Duke Law alum: Environmental Science, Policy, & Management; Public Health; Law.