Statistical Evidence for the Anthropogenic Causes of Recent Global Temperature Increase

Ben Goertzel, Nejc Znidar, Matt Iklé, Scheherazade Goertzel

SingularityNET · Jan 29, 2020


DECENTRALIZED AI FOR A HEALTHY PLANET: SingularityNET was founded with the goal of creating a global decentralized AI platform, using it to create powerful Artificial General Intelligence, and guiding that intelligence toward benevolent ends.

The “social good” aspect of the organization’s mission has been manifested, since the earliest days, in R&D work by the SingularityNET and Mozi.ai teams on the application of AI to the biology of human longevity. A more recent R&D thrust uses SingularityNET AI to work toward a healthy planet and global environmental sustainability.

This blog post is the first in what will be a series reporting on applications of decentralized AI to sustainability and environmental science.

TLDR

Current climate simulation models are valuable and insightful, but they are highly complicated, with numerous parameters, making it difficult to use them to assess the causal impact of various anthropogenic or natural factors on global temperature increase. So we have sought to complement the simulation approach with an investigation using statistical and machine learning prediction tools.

As the first step in this direction, we have utilized Granger Causality, a well-known method at the intersection of statistics and AI, to estimate the impact of various factors on temperature increase over the last 130 years, via direct data analytics without simulation modeling.

It turns out that, indeed, the combination of well-mixed greenhouse gas emissions, tropospheric aerosols, ozone emissions and changes in land-use patterns has had a significant causal impact on global land and sea temperatures during this time period, and even more significantly so in more recent decades.

We are now following up this preliminary work with more in-depth work leveraging AI tools to carry out predictive modeling of global temperature increase based on multivariate nonlinear time series analysis.

Where We Are Currently

The reality of global temperature increase is becoming increasingly obvious to nearly everyone who lives on this planet — as is the severe difficulty of nudging the world economy toward actions with significant odds of slowing this increase.

The situation is pushing some people to extremes, leading to a rash of apocalyptic predictions that overstate the situation for dramatic effect. From a scientific perspective, it’s hard to support statements like “The world is going to end in 12 years if we don’t address climate change” (Alexandria Ocasio-Cortez) or “Around 2030 we will be in a position to set off an irreversible chain reaction beyond human control that will lead to the end of our civilization as we know it” (Greta Thunberg).

And yet it’s easy to see why concerned individuals might choose such exaggerated phrasings. Year on year, decade on decade, we see more glaciers melting and more species going extinct due to human impacts, and those with the power to solve these problems mostly don’t seem to care that much. Even if the results of global temperature increase aren’t going to be truly apocalyptic but merely extremely damaging, arguably our species should be doing a lot more about them.

It’s not like society is ignoring the problem entirely. New industries, such as electric cars and renewable energy farms, are emerging with the aim of making our modern lifestyle more sustainable. But ambitious as these efforts are, they remain somewhat off to the side of the main global economic engine … and it has become clear lately that, at least according to our best current scientific understanding, the present level of effort isn’t going to be enough to dramatically slow the warming process.

It’s possible that, given the realities of the world political and economic order, our best approach to managing climate change will involve aggressive pursuit and application of advanced technologies. Bearing in mind the dictum, variously attributed to Einstein and Ram Dass, that “The significant problems we have cannot be solved at the same level of thinking with which we created them,” one may suppose that a technological approach to climate change probably requires technologies radically more advanced and qualitatively different from the industrial-era technologies that have created the problem.

Geo-engineering solutions are one conceivable approach, though obviously with significant risks along with the exciting potential rewards. It is also interesting to explore how artificial intelligence technology may apply.

Very broadly, there are two sorts of applications of AI to climate change: first, to help us understand what is going on with the global climate; and second, to help us do something about it. At SingularityNET we are working along both vectors.

Regarding “doing something about it”, for instance, a future blog post here will cover the use of AI to estimate the amount of carbon sequestered in a given farm or forest from aerial photographs (which is useful, among other purposes, for inexpensively determining the amount of carbon credits a farm or forest owner should receive).

The present blog post describes some first steps in the “understand what’s going on” direction.

In SingularityNET’s Hong Kong headquarters, as you read these words, an effort is underway to use OpenCog’s MOSES machine learning technology to learn accurate predictive models forecasting global temperature increase based on variables describing relevant human activities. The present post summarizes some preliminary work that has been done to pave the way for these efforts in predictive modeling — aimed at understanding which combinations of variables have a significant causal impact on global temperature, and hence should be considered as input features for machine learning-based predictive models.

Food For Thought for Genuine Skeptics of Anthropogenic Global Warming

As well as providing guidance for our machine learning work on climate change, it is our hope that these simple investigations using Granger causality may help slightly to defuse some of the controversy that still surrounds the climate crisis. Many people and political entities still reject the current consensus on global climate change. The USA, for example, recently exited the Paris Climate Accord, through which many other countries, both developing and developed, agreed to lower their emissions and make their policies more eco-friendly. In those cases where “climate change skepticism” represents genuine skepticism rather than ideological bias or economic opportunism, the investigations given here may have some value for bridging the gap between skeptics and the majority view.

A good example of a genuinely skeptical view is that of the late, legendary physicist Freeman Dyson. Dyson did not doubt the reality of recent global temperature increase, nor did he doubt that greenhouse gas emissions have played some role in it. However, he was deeply skeptical of climate simulation models, arguing that their parameters tend to be overfitted to historical data and that they leave out many aspects of the environment known to be critical, such as the complex impacts of biological organisms on climate evolution. James Lovelock, creator of the “Gaia hypothesis” (which describes the ecosystem as a sort of global bio-chemical intelligent organism) and for a time extraordinarily concerned about climate change as an extinction risk, more recently became skeptical of modern climate simulation models for similar reasons.

The overfitting of aspects of climate models to historical data is hard to deny: typically, to get a climate model to accurately explain data over a certain historical period, the researcher needs to tinker with the parameters of the model based on observed simulation results over that period. This is just fine for exploratory research, but it’s not correct statistical methodology if one aims to make accurate predictions of the future.

A recent meta-analysis shows that many historical climate simulation models would have given reasonably accurate year-by-year predictions of global temperature increase if they had been fed accurate data on factors such as CO2 emissions. Many of the incorrect predictions published based on these models were wrong due to pessimistic assumptions about future human activities (e.g. overestimating future CO2 emissions), rather than due to problems in the simulation models themselves.

For instance, in 1970, if one wanted to use a simulation model to predict global temperature forward to 2007, one needed not only the model but also assumptions about CO2 emissions and other human activities for every year from 1970 to 2007. Feeding wrong assumptions about future human activities into a reasonably accurate simulation model can still yield terrible predictions. Re-running models from 1970 using historically accurate data about human activities from 1970–2007 often yields much more accurate predictions for this period.

This same meta-analysis, however, also reveals that more modern climate simulation models often yield less accurate predictions than older ones. This is concerning, and it is very likely due to the increased complexity of the more recent models: adding more parameters without commensurately more data to tune them results in increased overfitting and thus decreased predictive accuracy.

To avoid the limitations and difficulties inherent in predicting based on complex simulation models in the presence of relatively scant data, we decided to take a more direct number-crunching approach to exploring the key question on which climate-change skeptics differ from the scientific mainstream: the importance of anthropogenic factors in causing global temperature increase.

Instead of using simulation models, in the work described here we used a statistical hypothesis test called multivariate Granger Causality, which determines whether the past values of a certain set of time series significantly improve prediction of another time series, beyond what that series’ own past allows. In other words, we used it to look for a cause-and-effect relationship between various possibly-relevant variables (in this case, variables representing different climate-affecting factors, called “forcings” in climate-science lingo) and the time series of global temperature increase.

Previous climate studies have used this method, but in a more limited way, basically just looking at one forcing or two variations on the same forcing. Our aim here was to look for the causal impact of multiple forcings considered together. AI tools are especially good at finding patterns combining multiple input variables, so if such combined-factor causality exists here, that’s a good indication that further in-depth work using AI techniques may be valuable.

Our idea here was: If anthropogenic factors really played a major role in causing global temperature increase during the last century or so, these statistical techniques should be able to at least find some hint of this causality. Advanced machine learning tools might be needed to understand more about the causal pathways, but statistical methods like Granger Causality should give some first clues, which can then be followed up with subtler AI analysis.
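To make the method concrete, here is a minimal sketch of a multivariate Granger causality test in Python using statsmodels’ vector autoregression tools. The CSV file and column names are hypothetical placeholders for illustration, not the study’s actual pipeline.

```python
# Minimal sketch of a multivariate Granger causality test, assuming annual
# data with a temperature column and candidate forcing columns. The CSV file
# and column names are hypothetical placeholders.
import pandas as pd
from statsmodels.tsa.api import VAR

data = pd.read_csv("forcings_and_temperature.csv", index_col="year")
cols = ["temp_anomaly", "wmghg", "ozone", "trop_aer_ind"]

# Fit a vector autoregression with up to 3 lags, matching the
# "up to 3 years into the future" horizon used in the study.
results = VAR(data[cols]).fit(maxlags=3)

# Test whether the forcings jointly Granger-cause the temperature series,
# i.e. whether their past values improve its prediction.
test = results.test_causality(
    caused="temp_anomaly",
    causing=["wmghg", "ozone", "trop_aer_ind"],
    kind="f",
)
print(test.summary())  # reports the F statistic and p-value
```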

The Data

We searched NASA’s databanks for factors that affect the global climate and found seven main influences, each with data on how strongly it heats the Earth per square meter (its radiative forcing, in watts per square meter):

Three natural influences:

  • Solar Irradiance
    How much heat the Earth receives from the sun
  • Stratospheric Aerosols
    Often sulfuric acid and water, largely produced by volcanoes
  • Orbital Forcings
    Alterations in the Earth’s orbit shape and the tilt of the Earth’s axis

And four anthropogenic (human-caused) influences:

  • Well-Mixed Greenhouse Gases
    Carbon dioxide, methane, nitrous oxide, etc.
  • Changes in Land Use
    Mainly the conversion of land from natural vegetation, such as forest, to farmland or pastureland
  • Tropospheric Aerosols
    Sulfate, nitrate, sea salt, black carbon, etc., which act both directly and through their effects on cloud formation
  • Ozone
    O3, occurring both naturally and through human activity in the stratosphere and troposphere; tropospheric ozone usually forms when sunlight acts on anthropogenic emissions

By incorporating all of these factors, we looked at many more human and natural forcings than previous Granger causality studies. We also considered the influence these individual variables have when acting jointly, which is important for understanding what specific combinations of factors might be driving the sharp increase in global temperature seen in recent years.
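As a rough illustration of what considering variables jointly means computationally, the sketch below sweeps over every subset of forcings and records the joint Granger causality p-value for each set. It reuses the same hypothetical CSV and column names as the earlier sketch; the forcing labels are illustrative assumptions.

```python
# Sweep over every non-empty subset of forcings and record the p-value of
# the joint Granger causality test for that subset. Uses the same
# hypothetical CSV and column names as the earlier sketch.
from itertools import combinations

import pandas as pd
from statsmodels.tsa.api import VAR

data = pd.read_csv("forcings_and_temperature.csv", index_col="year")
forcings = ["solar", "strat_aer", "orbital",           # natural
            "wmghg", "land_use", "trop_aer", "ozone"]  # anthropogenic

p_values = {}
for k in range(1, len(forcings) + 1):
    for subset in combinations(forcings, k):
        results = VAR(data[["temp_anomaly", *subset]]).fit(maxlags=3)
        test = results.test_causality(caused="temp_anomaly",
                                      causing=list(subset), kind="f")
        p_values[subset] = test.pvalue

# Keep only the forcing sets significant at the 5% level.
significant = {s: p for s, p in p_values.items() if p <= 0.05}
```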

Climate data itself has its own accuracy issues, so to increase confidence in our findings we used two different types of climate data and analyzed two separate time periods. It is important to include detailed climate data from as early as possible, but early climate data is less accurate than recent measurements, so we looked at two time periods separately: 1880 to 2012 and 1958 to 2012.

For much of the period between 1880 and 1958 there were no instruments capable of accurately measuring forcings and temperatures, which means the data is often approximate. The data we used comes from NASA; many researchers have worked hard to make it as accurate as possible, and it is commonly used in research, but the fact that these are approximations might still affect the final results of any study based on them.

The temperature data for the second time period is more accurate because of the availability of more modern tools. For instance, much of this data was acquired by satellites, which are generally more accurate than ground measuring stations, whose readings can be distorted by local influences such as the urban heat island effect.

There is also the question of whether to look at land or sea temperatures. Because land surface temperature, unlike ocean surface temperature, can be shifted by relatively small amounts of energy, we analyzed two targets: ocean temperature alone, and the average of land surface and ocean surface temperatures.
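For concreteness, here is a small sketch of preparing those two temperature targets with pandas. The file names, column labels, and the simple unweighted averaging are all assumptions made for illustration (NOAA also publishes a combined land-and-ocean index directly).

```python
# Sketch of building the two temperature targets: ocean alone, and the
# average of land and ocean. File and column names are hypothetical.
import pandas as pd

ocean = pd.read_csv("noaa_ocean_annual.csv", index_col="year")["anomaly"]
land = pd.read_csv("noaa_land_annual.csv", index_col="year")["anomaly"]

targets = pd.DataFrame({
    "ocean": ocean,
    "land_ocean": (land + ocean) / 2,  # simple unweighted average
}).dropna()

# The two study periods, analyzed separately.
period_full = targets.loc[1880:2012]
period_modern = targets.loc[1958:2012]
```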

The graph below shows the development of the different forcings and their varying effects from 1850 until 2012. From this figure it appears that anthropogenic forcings have had a much bigger impact on overall temperature increase than natural forcings throughout these 162 years. This is one hint that anthropogenic factors might have something to do with global warming, but it is by no means proof: a forcing can affect global temperature in the short term while having limited impact on long-term warming, so we analyzed the data more closely.

Source: Miller et al. (2014)

We can see in the graphs above that natural forcings occasionally show huge short-term variations, mainly caused by volcanic aerosols. For example, Krakatoa’s eruption in 1883 sent large amounts of volcanic dust into the air and consequently affected the climate for the following few years. But volcanic eruptions have always happened regularly, and whatever effect they have on the climate tends to be short-lived. Nevertheless, like all of our other forcings, we tested whether or not they had a significant effect within a three-year horizon.

Granger Causality Analysis

And so: Analyzing data from 1880 until recent times, we looked at how various combinations of forcings affected the temperature up to 3 years into the future.

Along with testing each individual forcing, we ran some tests looking only at natural forcings (stratospheric aerosol, solar and orbital forcings), and some testing different sets of anthropogenic forcings (well-mixed greenhouse gases, land use, tropospheric aerosols in both their direct and indirect effects, and ozone forcing).

Our aim was to see if we could find any sets of forcings that significantly affect ocean temperature and the combination of global surface and ocean temperature.

For the statistics geeks in the audience, it’s worth highlighting that for all of the different forcings and combinations of forcings, we looked at multiple different models, some of which were less applicable to our time series than others. Like any statistical methodology, Granger causality analysis requires some mathematical assumptions regarding the quantities being modeled, and we wanted to make sure our results did not depend too sensitively on such assumptions. The modeling variations we looked at were:

  • “Trend” and “No Trend”: the former assumes there is a constant trend in the temperature, the latter does not
  • “Constant” and “No Constant”: the former assumes the temperature is stable, the latter does not

Whether to include a constant depends on the stationarity of the time series in question, and whether to include a trend depends on whether there is a linear trend in that series. What to include ultimately depends on the time series, but we decided to run all possible tests anyway. This way we were able to determine, after looking at the results, which models made the most sense to focus our conclusions on. Ultimately, we determined it made sense to pay the most attention to the models with a constant, a trend, or both a trend and a constant.
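For concreteness, here is how those variants might map onto statsmodels’ VAR fitting options in the hypothetical setup sketched earlier. Note that statsmodels’ built-in trend options cover no-constant (‘n’), constant (‘c’), and trend-plus-constant (‘ct’) directly; a trend without a constant would require adding the trend regressor by hand, so only the three built-in variants are shown.

```python
# Re-run the same joint causality test under different deterministic-term
# assumptions. statsmodels' built-in trend options cover three of the four
# variants; a trend without a constant would need a hand-built regressor.
import pandas as pd
from statsmodels.tsa.api import VAR

data = pd.read_csv("forcings_and_temperature.csv", index_col="year")
cols = ["temp_anomaly", "wmghg", "ozone", "trop_aer_ind"]

variants = {"no constant": "n", "constant": "c", "trend and constant": "ct"}

for label, trend_spec in variants.items():
    results = VAR(data[cols]).fit(maxlags=3, trend=trend_spec)
    test = results.test_causality(caused="temp_anomaly",
                                  causing=cols[1:], kind="f")
    print(f"{label}: p = {test.pvalue:.4f}")
```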

Results! Results! Results!

So what was the outcome of all this analysis?

The following results are presented as p-values, which represent the statistical significance of each variable or set of variables. All results with p-values of 5% or under are considered statistically significant.
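As a small illustration of how such a table might be color-coded programmatically, the helper below classifies p-values into bands. The 5% cutoff is the one stated above; the 1% boundary for the “green” band is our assumption about where the tables draw the line.

```python
# Classify a p-value into the color bands used in the tables below.
# The 1% threshold for "green" is an assumed band boundary; 5% is the
# significance cutoff stated in the text.
def significance_band(p: float) -> str:
    if p <= 0.01:
        return "green (highly significant)"
    if p <= 0.05:
        return "red (significant, but less strongly)"
    return "white (not significant)"

print(significance_band(0.003))  # green (highly significant)
print(significance_band(0.07))   # white (not significant)
```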

Natural Forcings (1880–2012):

On the left-hand side, the results are for the average of land surface and ocean temperature; on the right-hand side are the results for ocean temperature alone.

The table highlights the statistically significant results. Different colors represent different degrees of significance: green marks the most significant values and red the least significant of the statistically significant values; all values in white are not statistically significant.

Overall: it is quite clear that several sets of variables show statistical significance in some models but not in others. The significance of some variable-sets disappears if we observe only ocean temperatures, and of others if we look at only one of the time periods.

Natural Forcings (1958–2012)

Over the shorter time span we get a few more statistically significant cause-and-effect relationships between variables (causalities), but, again, some disappear when it comes to ocean temperatures alone. Less significant causalities are harder to detect in the ocean measurements because of how slowly ocean temperatures change.

We get quite a different picture when we look at the results for anthropogenic forcings.

Anthropogenic Forcings (1880–2012):

The results are pretty damn clear: The frequency of statistically significant sets of anthropogenic forcings is much higher than it is for natural sets.

One can see that the “constant” model has many significant sets of forcings. But there are still fewer significant sets of forcings when it comes to detecting causality in ocean temperatures alone.

When we look at the shorter time span (1958 to 2012), we get similar results. There are fewer significant causalities in the short time span than in the long one, but still far more than there were for the natural sets of forcings.

Anthropogenic Forcings (1958–2012):

By testing both the longer and shorter time spans, and temperatures from the combination of land surface and ocean surface as well as the ocean surface alone, we were able to see whether some sets of forcings are always significant. It turns out there is such a set, consisting of three variables: {WMGHG, Ozone, TropAerInd}.

That is: well-mixed greenhouse gases, ozone, and tropospheric aerosols (indirect effect) were found to be significant in all tested models, for both time spans, and for both types of temperature measurements.

This is a strong indicator that those three factors together are very likely to affect global temperatures within the span of 3 years.

There are also some other joint sets of factors found to cause significant temperature changes in almost all models and time frames, one of which is the joint combination of well-mixed greenhouse gases, land use, ozone, and tropospheric aerosols. This combination lacks sufficient statistical significance only in the case of ocean temperature under the “trend and constant” model, and even there it is significant at the 10% level.

While finding sets of anthropogenic factors that are always significant was easy, the same clearly cannot be said for natural factors. With them, we still get causalities in some models, but these do not persist across all of the other models. In particular, we see causalities in models with constants, but even just looking at a different time span or temperature source can eliminate their statistical significance.

One of the most important things to notice in the natural forcings data is that the more statistically significant sets of natural forcings tend to include orbital forcings, and the more factors we add on top of orbital forcings, the weaker the causality gets. While not conclusive, this suggests that orbital forcings might be responsible for some degree of the long-term temperature change that has been happening over the last few hundred years. Their effect on climate seems relatively small compared to the recent sudden temperature increase, so they are unlikely to be much of an influence on the past 130 years of climate change, but it is intriguing that they might mildly affect long-term temperature change.

These results are not far from what previous studies tend to show, but they are more specific and clearly answer some important questions. In particular, they provide clear evidence not only that human influence has likely been the driving force of recent climate change, but also that naturally occurring climate-influencing factors have not. We did not test the exact contribution of each factor, because the Granger causality method is not ideally suited for this, but we will dig into this and other aspects in further studies.

What’s Next?

What we have shown here is that, setting aside the complexities of climate simulation modeling and looking at things on a pure number-crunching basis, there is a statistically very clear causal pattern connecting certain combinations of anthropogenic forcings with global temperature increase.

Based on our results, it seems reasonable to conclude that tropospheric aerosols, well-mixed greenhouse gases, ozone, and, to some degree, land use patterns are the leading causes of climate change.

These conclusions should not strike anyone as especially shocking. But we have not previously seen any similarly clear, statistically rigorous demonstration of the causal impact of the combination of these factors on global temperature increase.

The next, fairly small step in this “explanatory” portion of our AI/sustainability research program will be to leverage SingularityNET AI technologies to do some in-depth predictive modeling. How accurately can AI predictive modeling, looking at the factors the Granger causality analysis has identified as relevant, predict global temperature going forward?

Beyond this, it may be interesting to look at the potential synergistic use of simulation modeling and time series analysis methods to understand climate change. Perhaps time series analysis methods can be used to help automate the parameter tuning process for complex climate models, thus reducing the degree of overfitting and creating a more powerful framework for both understanding and prediction, leveraging both simulation modeling and the statistical/ML approach.

Ideally, we would like to be able to use climate simulation models and time-series analytics together to answer practical questions like: if we increased the amount of naturally sequestered carbon in the US by a factor of 1.5 over the next 10 years, what impact would this have on global temperature during the following 10 years? We are currently very far from being able to carry out this sort of analysis in a rigorous way, yet it is not clear that it would be infeasible with a sufficiently sophisticated approach. Perhaps more biological aspects would need to be incorporated into simulations, or other radical improvements would need to be made; our hope is that a focus on rigorous predictive analytics can help direct research along impactful paths.

Appendix

The datasets we used in the study described here can be found at:

Forcings: http://data.giss.nasa.gov (CMIP5 model forcings).

Temperature: https://www.ncdc.noaa.gov/cag/

The Granger causality tools we used for this study can be found on GitHub.

The service will be accessible via the SingularityNET decentralized AI platform soon!

Join Us

SingularityNET plans to reinforce and expand its collaborations to shape the coming AI Singularity into a positive one, for all. To read more about our recent news, click here.

And if you have an idea for AI tools or services you’d like to see appear on the platform, you can request them on the Request for AI Portal.
