The IT Landscape for Climate Services

Overcoming the challenges of big data and interoperability

David Huard
birdhouse
4 min read · Apr 6, 2018

Climate services deliver information that improves people’s understanding of climate change and its impact on both the natural environment and society. An objective of these services is to support a proactive, evidence-based adaptation to climate change to avoid reactive, unplanned responses to natural catastrophes. Climate services can touch health, engineering, forestry, agriculture, tourism, finance, energy or any sector exposed to weather and climate conditions.

Climate services are typically created by teams of climate scientists, sector specialists, and facilitators. Such collaboration facilitates the development of services that are scientifically defensible, understandable and useful. Decision-makers may then rely on them to mitigate future risks or invest in new economic opportunities. Offering climate services is, however, a balancing act: on the one hand, climate scientists rightfully ask that climate services reflect the latest developments in climate science; on the other hand, decision-makers want information that has been thoroughly vetted and verified, yet simple enough to be understood by their constituents.

As the need for climate change adaptation becomes more urgent, countries are increasingly setting up climate service centers to provide sector-specific impact scenarios and respond to decision-makers' questions and requests for climate information. These efforts are in line with international initiatives such as the U.N.'s Paris Agreement and the WMO's Global Framework for Climate Services. One of the main logistical challenges faced by these climate centers is to connect climate science, sectoral knowledge and stakeholder needs.

Challenges

Climate service specialists face a number of technological challenges. One is the sheer volume of data to be summarized. Already, the data generated by all the climate models are too large to be hosted by a single modeling centre and must be distributed globally. This problem will only get worse with the next wave of satellites and models operating at higher temporal and spatial resolutions. Climate services are becoming a big data problem. A second challenge is the inescapable transdisciplinary nature of climate services. By definition, climate impacts sit at the intersection of climate and sectoral sciences, and integrating climate data with sectoral impact models and economic analyses is fraught with logistical and technical difficulties. Finally, climate services should not be restricted to countries with high bandwidth and large computing infrastructures.

Solution

Part of the solution to the big data problem and bandwidth limitations is to run the analytics (models and analyses) on servers collocated with the data archive. A standard exists for this purpose: the Web Processing Service (WPS), which defines a language for geospatial processing requests. External users can, for example, request a map of future sea level rise by posting a URL that encodes the service name and its arguments. The server processes the request against locally stored data and returns to the user a link to a map created in near real time. The value of using the WPS standard instead of a custom interface is that servers from independent organizations can more easily interact with each other, and scientists can combine various services to conduct complex, transdisciplinary analyses.
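As a concrete sketch of such a request, the snippet below builds a WPS Execute URL using key-value-pair encoding. The endpoint, process identifier and inputs are hypothetical, chosen only to illustrate the shape of the request:

```python
from urllib.parse import urlencode

# Hypothetical WPS endpoint; a real deployment would publish its own URL.
base_url = "https://example.org/wps"

# A WPS 1.0.0 Execute request encoded as key-value pairs.
# "identifier" names the server-side process; "datainputs" carries its
# arguments as name=value pairs (the process name and inputs here are
# invented for illustration).
params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "sea_level_rise_map",
    "datainputs": "region=atlantic_coast;scenario=rcp85;year=2050",
}

request_url = f"{base_url}?{urlencode(params)}"
print(request_url)
```

Posting this URL asks the server to run the named process next to its data archive; the response then points the user to the (much smaller) result, such as a rendered map.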

Climate services IT infrastructure.

Collaboration

There are now many research groups developing climate services based on the WPS standard. This service delivery approach, presented in figure 2, relieves users of maintaining the underlying datasets, processing software and computing infrastructure, which for complex algorithms can be a significant burden. Because the algorithm's performance and validity are public, service providers have an incentive to thoroughly test, validate and improve it. In fact, the service model creates a pipeline for innovation, where experts can develop new and improved algorithms for an existing community of users.

Vision for development and cooperation

Because the data products transmitted to users are typically orders of magnitude smaller than the raw data needed to produce them, this service model provides low-bandwidth countries the same services as high-bandwidth countries, leveling the playing field for climate impacts and adaptation resources.

Our vision for the future is one where national and private meteorological and climate service centers expose specialized services through WPS servers. This would promote international collaborations, improve the quality of services offered, and facilitate the creation of new climate service centers in countries that lack robust technological infrastructures. Users that currently have no access to climate services could take advantage of publicly available servers from other regions. Also, as data volumes and computational requirements grow, this model insulates users from the scientific and technical complexity underlying these tools, while specialists can focus on developing new algorithms. Our hope is that over time this will give rise to a competitive market for climate services mixing both the public and private sectors.

Current Initiatives

Consistent with this vision, there have been a number of open source initiatives in recent years to develop globally accessible, public climate services. Some of the organizations involved in these efforts are Ouranos and the Computer Research Institute of Montreal (CRIM) in Canada; the German Climate Computing Centre (DKRZ); the Institut Pierre Simon Laplace (IPSL) Climate Modelling Centre in France; and the National Oceanic and Atmospheric Administration (NOAA) in the United States.
