The carbon detectives

by Stephen Battersby

Even if countries agree to reduce greenhouse gas emissions, researchers face the monumental task of precisely monitoring the amounts of gases that are being emitted, and where. New tech will help, but the complications are many.

As shown in this artist’s conception, NASA’s GeoCarb mission, planned for a 2021 launch, will map concentrations of carbon gases from a high, geostationary orbit, allowing it to scan across an area the size of the continental United States every 2 hours or so. Image courtesy of NASA/Lockheed Martin/University of Oklahoma.

To escape the worst ravages of climate change, humans have a steep path to climb. The Paris climate accord negotiated in 2015 aims to limit global warming to below 2 °C, and ideally no more than 1.5 °C, requiring rapid and deep cuts in greenhouse gas emissions. Even assuming the big emitters make the necessary commitments, researchers and policymakers will need to monitor emissions closely to catch any stumble. If countries stray too far from their targets, there may not be time to make up ground before the Earth hits catastrophic climate tipping points.

At the moment, national emissions are calculated by accountants, based on the activity in different economic sectors. Coal-fired power stations emit close to 1 kilogram of CO2 per kilowatt hour of electricity generated, so multiply that emission factor by the total energy generation and you get overall emissions. Doing the same for manufacturers, farms, and other emission sources gives the national total. In developed nations, this bottom-up accounting process estimates CO2 totals with claimed uncertainties of about 5%, says Riley Duren at the Jet Propulsion Laboratory (JPL) in Pasadena, CA, who leads the Megacities Carbon Project.
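The arithmetic behind this bottom-up accounting is simple to sketch. In the toy example below, only the roughly 1 kilogram of CO2 per kilowatt hour for coal power comes from the text; the other sectors, factors, and activity levels are invented for illustration.

```python
# Illustrative bottom-up accounting: emission factor x activity level,
# summed across sectors. Only the coal-power factor (~1 kg CO2/kWh)
# comes from the article; everything else is hypothetical.
EMISSION_FACTORS = {          # kg CO2 per unit of activity
    "coal_power": 1.0,        # per kWh generated
    "cement": 900.0,          # per tonne of cement (hypothetical)
    "road_transport": 2.3,    # per litre of fuel (hypothetical)
}

def national_total(activity):
    """Sum sector emissions: factor * activity, in kg CO2."""
    return sum(EMISSION_FACTORS[s] * a for s, a in activity.items())

activity = {                  # hypothetical annual activity levels
    "coal_power": 5e11,       # kWh
    "cement": 8e7,            # tonnes
    "road_transport": 4e10,   # litres
}

total_kg = national_total(activity)
print(f"Estimated emissions: {total_kg / 1e12:.2f} Gt CO2")
```

The claimed 5% uncertainty applies to the emission factors and activity data feeding this sum, which is why disagreements over methodology, as Savaresi notes below, matter so much.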

This doesn’t sound so bad. But the approach has several drawbacks. “Methodologies are tricky to agree on,” says Annalisa Savaresi, specialist in environmental law at the University of Stirling in Scotland. “For example, the debate on how to measure land use emissions within the EU is incredibly contentious.”

“It’s hard to get right,” agrees Duren. “How confident can we be in those 5% numbers? There are examples where a country reported so much emission at the national level, but then researchers took data from the provinces and found that the total was much higher.”

In the future, the accounting system may be open to outright fraud. As carbon trading grows, there will be a powerful motive to cook the books rather than have to buy expensive carbon credits. Developing a check on accounts is vital, says Savaresi. An allegation of misreported emissions will be harder to deny if it is backed up by a network of independent measurements.

And maybe 5% is not accurate enough. There is a lot of inertia in the system, from slow technological and political change to the sluggish response of the climate, so we need early warning of any problem. “It’s like steering an ocean liner — you can’t start the turn 10 feet before the exit,” says Duren.

Meanwhile the uncertainties over methane emissions are much worse than for CO2 — often 50% or more. Methane from natural gas wells, livestock farming, rice paddies, and other sources accounts for nearly a quarter of human-made global warming. In March, a National Academies of Sciences, Engineering, and Medicine consensus study report highlighted the importance of tracking US methane emissions.

Can we solve all these problems using a top-down monitoring network to detect and measure emissions — something akin to a weather service for greenhouse gases? Maybe, but researchers and engineers have work to do. Detecting the faint wisp of fresh greenhouse gases in the vast and turbulent atmosphere of Earth turns out to be extraordinarily difficult.

Deploying all manner of approaches, researchers are developing a squadron of new satellites, trying to fund more sensors on the ground, and refining the mathematical models needed to make sense of it all.

Measurements from Above

Four carbon-tracking satellites are already in orbit: Japan’s Greenhouse Gases Observing Satellite launched in 2009; NASA’s Orbiting Carbon Observatory-2 (OCO-2) in 2014; China’s TanSat in 2016; and the European Space Agency’s Sentinel-5P, which went up in October 2017.

All these missions measure the spectrum of sunlight that has bounced off the Earth. CO2 and other greenhouse gases absorb light at distinctive frequencies, and so in principle, measuring the amount of light absorbed can tell you how much of each gas is present. Allowing for the angle of observation, researchers can work out the average concentration of the gas along a line from top to bottom of the atmosphere. They can then see whether that concentration is enhanced relative to the background level.

This is difficult because it means detecting small changes in a large number. CO2 lingers in the atmosphere, and accumulated emissions have raised its global average concentration to around 410 parts per million. Emissions from a city might lift the local level by only a few parts per million above that background. The task is not quite as tricky for methane, partly because it is removed from the atmosphere more quickly. Hence, its background level is not as high relative to local enhancements.

OCO-2 has the tough task of detecting CO2, which it does using the molecule’s spectral signature at near-infrared frequencies. It was designed mainly to monitor the natural seasonal cycle of CO2. “The satellite has now operated for 3.5 years of a 2-year mission and exceeded most of its design requirements,” says David Crisp at JPL, the mission’s science leader. The original objective was a precision of 1 part per million, but the satellite actually has a precision of 0.5 parts per million, allowing the researchers to measure much smaller changes than originally planned. Data from OCO-2 already reveal emission hotspots in the Middle East that are not covered in existing inventories, pointing to gaps in the bottom-up accounting process.

This approach can point out sources of CO2 with ease but can only put an actual number on the rate of emissions in ideal circumstances. “If we see a plume of CO2, say from a single power plant, then we can nail it,” says Crisp. In this simple situation, the faster the wind speed, the higher the emissions needed to create a given enhancement in CO2 downwind. “Given the wind speed, we can quantify flux to about 20%.”
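The single-plume case Crisp describes amounts to a simple mass balance: the emission rate is whatever flux of gas the wind must carry past a downwind cross-section to sustain the observed enhancement. A toy sketch, with all numbers invented:

```python
# Toy mass-balance flux estimate for an isolated plume. For a fixed
# emission rate, a faster wind dilutes the plume, so a given downwind
# enhancement implies a higher flux when the wind is stronger.
# All numbers are illustrative, not from any mission.
def plume_flux(enhancement_kg_m2, wind_speed_m_s, plume_width_m):
    """Emission rate (kg/s) = column enhancement x wind speed x width."""
    return enhancement_kg_m2 * wind_speed_m_s * plume_width_m

# A ~2 g/m^2 column enhancement, a 5 m/s wind, and a 2-km-wide plume:
flux = plume_flux(0.002, 5.0, 2000.0)
print(f"{flux:.0f} kg CO2/s")
```

The 20% figure Crisp quotes reflects how well inputs like the wind speed and the plume geometry can be constrained in this best case.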

Usually the wind makes things fiendishly complicated, stirring up and mixing gases from a range of sources. Researchers run weather models to simulate the winds and trace the transport of greenhouse gases, but when observations are sparse, this is difficult. “Without prior information, there are countless emission distributions that could theoretically explain the atmospheric variations of CO2,” says modeler Paul Palmer at the University of Edinburgh in the United Kingdom. Researchers can get around this by nudging things in the right direction. Start with a first estimate of where emissions are coming from, based on the location of power plants, and so on. Run that through a weather model to see where they end up, and finally adjust your assumed emissions to fit the actual observations of CO2.
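The adjustment loop Palmer describes can be caricatured by replacing the full weather model with a linear "transport" matrix that maps source emissions to downwind enhancements. The matrix, the prior guess, and the observations below are all made up for illustration.

```python
import numpy as np

# Minimal sketch of the nudging step: start from a prior guess of
# source emissions, map them to concentration enhancements with a
# (hypothetical) transport matrix H, then adjust the emissions to
# better fit the observations. Real inversions use full weather
# models and Bayesian weighting of the uncertainties.
H = np.array([[0.8, 0.1],     # contribution of each of 2 sources
              [0.2, 0.7],     # to each of 3 observation sites
              [0.5, 0.5]])    # (toy values)
x_prior = np.array([100.0, 50.0])       # first-guess emissions
y_obs = np.array([95.0, 60.0, 80.0])    # observed enhancements

# One least-squares update toward the observations:
x_post, *_ = np.linalg.lstsq(H, y_obs, rcond=None)
print("prior:", x_prior, "-> posterior:", x_post)
```

Without the prior to anchor it, this system would be underdetermined, which is exactly Palmer's point about countless emission distributions fitting the same data.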

To pin down emissions, researchers need to identify very slight differences in CO2, so even small disagreements between different models add a lot of uncertainty to the process. “Model error is a massive problem,” says Palmer.

Models are improving, if slowly. Brute-force increases in computing power will help, by providing higher-resolution wind patterns. Meanwhile Inez Fung and her team at the University of California, Berkeley, are trying to perfect a new approach. They run many versions of their model at once to forecast changes in weather and CO2 every 6 hours and then adjust the forecast in light of both weather and CO2 observations. By mashing imperfect model forecasts with incomplete observations of both CO2 and weather, they aim to get the best approximation to the real atmosphere and a better idea of the uncertainties in CO2.
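The forecast-then-adjust cycle that Fung's team runs is in the spirit of an ensemble Kalman filter: many model versions forecast the state, and the spread of their disagreement sets how strongly each observation pulls the forecast. A scalar toy version, with invented numbers:

```python
# Crude sketch of a 6-hourly forecast/adjust cycle in the spirit of an
# ensemble Kalman filter. The ensemble spread estimates the forecast
# uncertainty; the update nudges each member toward the observation in
# proportion to the relative uncertainties. Numbers are illustrative.
def analysis_update(forecasts, observation, obs_error_var):
    """Blend an ensemble forecast with one observation (scalar case)."""
    mean = sum(forecasts) / len(forecasts)
    var = sum((f - mean) ** 2 for f in forecasts) / (len(forecasts) - 1)
    gain = var / (var + obs_error_var)          # Kalman gain
    return [f + gain * (observation - f) for f in forecasts]

ensemble = [408.0, 410.5, 409.2, 411.0]         # forecast CO2 (ppm)
updated = analysis_update(ensemble, 410.0, 0.25)
print(updated)
```

A real system updates weather and CO2 fields jointly, so a wind observation can correct the CO2 forecast and vice versa.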

NASA’s OCO-2 reports global average CO2 concentrations, here for the time period of June 1–15, 2015. Image courtesy of NASA/JPL-Caltech.

Satellite Upgrade

At the moment, models can’t fill in the gaps between CO2 observations effectively because the gaps are just too big. “The current generation of satellites [was] not designed for this question,” says Duren. It doesn’t “provide wall-to-wall coverage.” OCO-2 for example scans only along a narrow line as it orbits. “It only covers about 7% of Earth’s surface every month,” says Crisp. The same will be true of that mission’s successor, OCO-3, which is due to be deployed in 2019.

Soon the window on methane emissions should be much clearer, thanks to the recent launch of Sentinel-5P — part of the European Space Agency’s Copernicus Earth-observing program. This satellite carries an instrument called Tropomi, which detects methane, nitrogen dioxide, and other trace gases in much the same way that OCO-2 detects CO2. Crucially, the satellite can take data simultaneously across a broad swath 2,600 kilometers wide. The low orbit of Sentinel-5P takes it over the poles, so as Earth spins, the satellite’s swath sweeps over the whole planet’s surface, amassing 20 million measurements with a resolution of 7 kilometers. Early signs are that it is working well. “I’m convinced that we will be able to verify and improve on existing methane emission inventories,” says mission manager Klaus Zehner.

The Copernicus program is also developing CO2 satellite missions, although the signal-to-noise ratio is much worse than it is for methane. To discern slight increases in gas concentration, a satellite has to devote more time to each observation. That makes such a wide swath impractical, so the aim is to get a swath about 200 kilometers wide, with a resolution of 2 kilometers. “Instead of one satellite, you will need at least three, maybe six, to cover the Earth,” says Zehner. The first of these, according to Zehner, could go up as soon as 2025.

Meanwhile, NASA is planning to give its GeoCarb mission a much loftier viewpoint. In 2021, GeoCarb is due to piggyback on a communications satellite, 36,000 kilometers up in geostationary orbit, giving it a constant view of the Americas. GeoCarb should be able to scan across an area the size of the continental United States every 2 hours or so, with a resolution of better than 10 kilometers, detecting CO2, methane, and NO2. With smaller gaps to fill between observations, the wind has less chance to muddle things, and modeling should be somewhat less important. “It will be almost at in situ mapping level,” says Berrien Moore, who leads the GeoCarb project. “We will certainly know whether countries — even cities — are meeting their targets.” Adding two more geostationary satellites could cover most of the rest of the industrial world.

A shortcoming of all these satellites: they use reflected sunlight and can monitor only in daytime. One possible solution is mounting a laser on the satellite to probe the atmosphere. The European Space Agency is planning to do this for methane with the Merlin mission in 2020 or 2021. NASA’s proposed Ascends mission would do the same for CO2. “We will be carrying our own lightbulb,” says Ascends Principal Investigator Kenneth Jucks, based at NASA headquarters in Washington, DC.

But Jucks isn’t sure it’s feasible to use such instruments for blanket coverage. “You can’t measure everywhere, only along a line where the laser orbits Earth,” he says. “For broader coverage you would need a lot more lasers.” Zehner points out that active detection should give high-quality data — partly because you get extra information from back-scattered laser light — so even sparse observations would be valuable to validate other satellite measurements.

Ground Truth?

Regardless, the power of satellites may be overrated, according to Pieter Tans of the National Oceanic and Atmospheric Administration. Tans is head of the Carbon Cycle Greenhouse Gases group in Boulder, CO, which gathers and coordinates greenhouse gas data based on sampling from towers, balloons, and aircraft. The amount of CO2 in such samples can be measured directly, whereas satellites record a spectrum of light that has bounced off the surface of Earth and must then work out the quantity of CO2 using an algorithm to portray the transfer of radiation through the atmosphere. Smog, clouds, and ground features can all bias the results, says Tans. Aerosols in particular can scatter light, increasing the path it takes through the atmosphere and so mimicking a higher concentration of CO2.

The satellite teams do have ways to reduce these biases. To start with, they detect absorption of light by oxygen molecules. The amount of absorption lets researchers work out how much total oxygen is along the path of their observation, and because we know the concentration of oxygen in the atmosphere, that gives the path length — revealing the effect of scattering by aerosols. But this isn’t perfect. Oxygen and CO2 absorb at different wavelengths, which small-particle aerosols scatter to different degrees.
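The oxygen trick works because O2's mixing ratio in dry air, about 20.95%, is essentially constant, so the retrieved O2 column doubles as a path-length yardstick. A sketch with hypothetical column values:

```python
# Sketch of the O2 path-length correction. Because O2's mixing ratio
# (~20.95% of dry air) is known and nearly constant, the total O2 seen
# along the line of sight reveals the effective path length, including
# any lengthening from aerosol scattering. Column values are illustrative.
O2_MIXING_RATIO = 0.2095

def effective_airmass(retrieved_o2_column, vertical_o2_column):
    """Ratio > 1 means the light path is longer than a straight vertical
    column, e.g. from slant viewing or aerosol scattering."""
    return retrieved_o2_column / vertical_o2_column

def co2_mole_fraction_ppm(retrieved_co2_column, retrieved_o2_column):
    """Express CO2 as a dry-air mole fraction by normalising against O2,
    cancelling the shared path-length factor."""
    dry_air_column = retrieved_o2_column / O2_MIXING_RATIO
    return retrieved_co2_column / dry_air_column * 1e6
```

Because CO2 and O2 share (nearly) the same light path, dividing one column by the other cancels most of the path-length error — leaving only the residual that the text notes, from the two gases absorbing at different wavelengths.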

That’s where ground support comes in. Paul Wennberg, based at the California Institute of Technology in Pasadena, CA, runs a network of ground-based instruments called the Total Carbon Column Observing Network (TCCON). These detect CO2 and methane in the same way as the satellites, with one crucial difference. “We are looking directly at the sun, so the path is known perfectly,” says Wennberg. That means a major source of uncertainty is removed, so the satellite groups can compare their data with TCCON results to look for biases.

Sometimes OCO-2 data seem to show a CO2 variation that is not there in TCCON. In one case, the team saw what looked like anomalous CO2 in a band from the tip of South Africa around to Australia. They traced the signal to thin aerosols at altitudes of 15–20 kilometers, supplied by volcanoes and fires. “You can’t see these with the naked eye,” Crisp says. “They are almost impossible to detect with instruments, but they corrupt our data.”

The team uses this information to develop a bias-correction algorithm, tweaking it until the satellite results match those from TCCON. The upshot, says Crisp, is that they have already achieved a mean global error relative to TCCON of 0.4 parts per million and hope to get that down to 0.2 parts per million.

Expanding the TCCON network should help, especially in East Asia where stations are scarce, emissions are high, and smog is often dense. “We need to see if satellites are working in that kind of soup,” says Wennberg. Some new TCCON sites are being set up, such as Hefei in China and Burgos in the Philippines. Laser-armed satellites could play a role here, revealing small errors related to observing geometry.

But checking one remote observation against another has limitations, says Tans. To link satellite observations with calibrated in situ measurements, researchers can use balloons to take air samples as the balloons descend near a TCCON station, but that can help only up to a point. “We can never quite make an apples-to-apples comparison,” says Tans. “The descent takes about 25 minutes, it is not quite along the beam path of TCCON, and it lands perhaps a few kilometers away.”

Taking Flight

Aircraft could be the solution, Tans says. He already has a program using light aircraft at a few sites around North America. Each pilot receives what Tans calls a “James-Bond case”: a suitcase with automatic valves that take regular air samples as the plane descends, giving a full profile through the atmosphere.

Tans has been trying to scale up the program by persuading airlines to carry greenhouse gas detectors. “Something like 100 commercial airliners carrying calibrated instruments over North America would give us 700–800 vertical profiles per day,” he explains. “This would nail the transport models, at least over this continent, and it would also have the density to diagnose very small satellite biases. Two birds with one stone.”

“It’s not too big a stretch,” Crisp adds. “Sensors are small, and many aircraft already have high-quality water and temperature sensors that report to the weather industry.”

Adding more fixed measuring sites on the ground will also be important. Duren’s Megacities project, running in Los Angeles and Paris, is based on in situ sampling from radio towers and tall buildings. Because cities pump out about 70% of global fossil-fuel CO2, monitoring them along with power plants and other hotspots could be a vital check on bottom-up accounting.

“This is one of those ‘everything and the kitchen sink’ situations,” says Duren. “You need satellite observations, surface observations, other datasets, and modeling.”

The coming generation of satellites and other technologies will be able to do much better than the accountants in monitoring methane emissions and identifying CO2 hotspots. The harder task is providing a direct measurement of national CO2 emissions in light of measurement biases and modeling errors. “I’d be really surprised if we can get national emissions down to 5% and have the proof,” says Palmer. “But I’d also question whether you can really reach 5% with bottom-up accounting. It’s still important to try to reconcile the two.” In the short term, accounting in developed nations will remain the more accurate approach, but that means it can be used to verify satellite and other measurements, which can then be trained on nations where accounting is less reliable.

Spying on emissions certainly isn’t easy. Even by combining all the instruments at our disposal the system won’t be perfect. But given the myriad new tools coming online, researchers and policymakers should at least be able to tackle the climate path with one eye open.

Published under the PNAS license. Originally published on July 3, 2018 at pnas.org.
