Product Update: Planetary Variables in Sentinel Hub

New datasets available in Sentinel Hub for Land Surface Temperature, Soil Water Content, Crop Biomass, and Forest Carbon.

Planet Stories
Dec 15, 2023



  • Planetary Variables are analysis-ready data feeds that measure important conditions on the surface of the Earth, including Soil Water Content, Land Surface Temperature, Crop Biomass, and Forest Structure and Carbon.
  • Planetary Variables draw on observations from numerous public and commercial satellite missions to generate datasets that have broad, continuous geographic coverage and high-temporal resolution.
  • Sentinel Hub enables you to perform planetary-scale analyses with multi-temporal statistical analysis of earth observation data.
  • Now, Planetary Variables are supported in Sentinel Hub, enabling you to perform analysis on this data to generate insights.

Note: This blog has been cross-posted from Sentinel Hub’s Medium. Earlier this year, Sinergise was acquired by Planet. Planet is excited to publish more content here about the progress we are making to integrate our products with Sentinel Hub and Area Monitoring as we build an Earth Data Platform. If you are interested in technical information and updates on Planet products, make sure to follow along here on the Sentinel Hub blog and on Planet Pulse.

Planetary Variables and Sentinel Hub

Planetary Variables are now a supported data layer in Sentinel Hub. You can order Planetary Variables to be delivered to your Sentinel Hub collections and benefit from all the analysis capabilities of Sentinel Hub APIs.

A common type of analysis for Planetary Variables is to look at how much current-year values deviate from historical means, commonly referred to as the anomaly. Sentinel Hub’s multi-temporal statistical analysis tools allow you to efficiently extract statistics from deep time stacks of data using cloud processing. For example, the chart below shows how the current year’s Land Surface Temperature compares with the average of the prior 4 years. The Land Surface Temperature was colder than usual at the end of February and beginning of March, compared to the prior years. For agriculture, soil temperature is an important variable for crop emergence.

Land Surface Temperature near Hanover, Germany. Current year (teal) is compared to the mean (black) and range (grey) for 2019–2022.

How does it work?

In this blog, we’ll explore how Planetary Variables work in Sentinel Hub through creating the visualization shown for Land Surface Temperature. We’ll look at:

  1. Creating a Planetary Variables subscription
  2. Visualizing Land Surface Temperature with Sentinel Hub evalscripts
  3. Analyzing time series using the Statistics API and Sentinel Hub Python SDK
  4. Performing regional analysis with the Process API

Creating a Planetary Variables subscription

A subscription is how Planet delivers data to your cloud storage. You provide an area of interest, time of interest, and data layer; in turn, Planet delivers all the data that matches the criteria. With this update, you can create subscriptions that deliver data into a Sentinel Hub collection when you use the Third Party Data Import API in Sentinel Hub.

Content-Type: application/json
Authorization: Bearer [Insert Sentinel Hub Token]

{
  "input": {
    "provider": "PLANET",
    "planetApiKey": "INSERT PLANET API KEY",
    "bounds": {
      "bbox": ["INSERT MIN X", "INSERT MIN Y", "INSERT MAX X", "INSERT MAX Y"]
    },
    "data": [
      {
        "dataFilter": {
          "timeRange": {
            "from": "2019-01-01T00:00:00Z",
            "to": "2023-12-31T23:59:59Z"
          }
        },
        "type": "land_surface_temperature",
        "id": "LST-AMSR2_V1.0_100"
      }
    ]
  },
  "name": "Land Surface Temperature Subscription"
}
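If you prefer scripting this request over using a REST client, it can be sent with a few lines of Python. The sketch below uses only the standard library; the subscriptions endpoint URL and the response shape are assumptions to verify against the Third Party Data Import API documentation.

```python
import json
import urllib.request

# Assumed endpoint; check the Third Party Data Import API documentation
# for the authoritative URL.
TPDI_SUBSCRIPTIONS_URL = "https://services.sentinel-hub.com/api/v1/dataimport/subscriptions"


def build_subscription(planet_api_key, bbox, time_from, time_to):
    """Build the Land Surface Temperature subscription payload shown above."""
    return {
        "input": {
            "provider": "PLANET",
            "planetApiKey": planet_api_key,
            "bounds": {"bbox": bbox},
            "data": [{
                "dataFilter": {"timeRange": {"from": time_from, "to": time_to}},
                "type": "land_surface_temperature",
                "id": "LST-AMSR2_V1.0_100",
            }],
        },
        "name": "Land Surface Temperature Subscription",
    }


def create_subscription(token, payload):
    """POST the subscription and return the API's JSON response."""
    request = urllib.request.Request(
        TPDI_SUBSCRIPTIONS_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())
```

With a valid Sentinel Hub token and Planet API key, `create_subscription(token, build_subscription(...))` submits the same request shown above.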

After creating this subscription, Planet will fulfill the order by delivering data to your Sentinel Hub collection. All of the data is hosted in the Sentinel Hub cloud, which means you don’t need to worry about your own cloud storage to use Planetary Variables. You will be able to use Sentinel Hub APIs like the OGC, Process, or Statistical API to visualize and analyze the data.

Visualizing Land Surface Temperature with Sentinel Hub evalscripts

To create visualizations and perform analysis on your Planetary Variables in Sentinel Hub, you need to create an evalscript. Below is an example evalscript that applies a color ramp to the Land Surface Temperature data so it can be added to maps. When used with the Statistical API, it also returns the average LST for an input area of interest. More example evalscripts can be found in the Custom Scripts GitHub Repository.


//VERSION=3
// To set custom max and min values:
// Set defaultVis to false and choose your max and min values.
// The color map will then be scaled to those max and min values.
// Two LST observations (variable) are available:
// 1. "lst_1330" at 13:30 solar local time ("flagged_lst_1330" are pixels masked due to critical quality)
// 2. "lst_0130" at 01:30 solar local time ("flagged_lst_0130" are pixels masked due to critical quality)

const defaultVis = true
const min = 290 // default min: 263
const max = 330 // default max: 340
const variable = "LST"
const sensing_time = "1330" // "1330" or "0130"

function setup() {
  return {
    input: [variable, "dataMask"],
    output: [
      { id: "default", bands: 4 },
      { id: "index", bands: 1, sampleType: "FLOAT32" },
      { id: "eobrowserStats", bands: 2, sampleType: "FLOAT32" },
      { id: "dataMask", bands: 1 }
    ],
    mosaicking: "TILE"
  };
}

// Keep only scenes whose sensing time matches the specified sensing time,
// then sort the tiles so the most recent observation comes first
function preProcessScenes(collections) {
  collections.scenes.tiles = collections.scenes.tiles.filter(function (tile) {
    return tile.dataPath.includes("T" + sensing_time);
  });
  collections.scenes.tiles.sort((a, b) => new Date(b.date) - new Date(a.date));
  return collections;
}

// Update the colormap for the specified range of values
function updateCMap(min, max) {
  const numIntervals = cmap.length;
  const intervalLength = (max - min) / (numIntervals - 1);
  for (let i = 0; i < numIntervals; i++) {
    cmap[i][0] = min + intervalLength * i;
  }
}

// Colormap for visualizing Land Surface Temperature
const cmap = [
  [263, 0x000004], [266, 0x06051a], [270, 0x140e36], [274, 0x251255],
  [278, 0x3b0f70], [282, 0x51127c], [286, 0x641a80], [289, 0x782281],
  [293, 0x8c2981], [297, 0xa1307e], [301, 0xb73779], [305, 0xca3e72],
  [309, 0xde4968], [313, 0xed5a5f], [316, 0xf7705c], [320, 0xfc8961],
  [324, 0xfe9f6d], [328, 0xfeb77e], [332, 0xfecf92], [336, 0xfde7a9],
  [340, 0xfcfdbf]
];

// Update the colormap
if (!defaultVis) updateCMap(min, max);
const visualizer = new ColorRampVisualizer(cmap);

// Evaluate each pixel
function evaluatePixel(samples) {
  // Read the data mask from the most recent sample
  let datamask = samples[0].dataMask;

  // The index value is the most recent observation, rescaled to Kelvin
  const scaleFactor = 100;
  let indexVal = samples[0][variable] / scaleFactor;

  // Return an RGBA representation using the colormap visualizer
  let [r, g, b] = visualizer.process(indexVal);
  let imgVals = [r, g, b, datamask];

  return { default: imgVals, index: [indexVal], eobrowserStats: [indexVal, datamask], dataMask: [datamask] };
}

To test this evalscript, we can use EO Browser or Requests Builder. In the screenshot below, we see the Land Surface Temperature over a region near Hanover, Germany. The time series shows the Land Surface Temperature for the area outlined in blue. You can see how it varies through the year, from cold in the winter to warmer in the summer (the units are in Kelvin, but can be converted to Celsius in the evalscript).
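The Kelvin-to-Celsius conversion mentioned above is a simple offset, shown here in Python for reference:

```python
def kelvin_to_celsius(kelvin):
    """Convert a Land Surface Temperature value from Kelvin to degrees Celsius."""
    return kelvin - 273.15
```

In the evalscript itself, the equivalent change is subtracting 273.15 from the LST value before visualization (and shifting the colormap breakpoints to match).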

Land Surface Temperature evalscript tested in EO Browser.

You can use Sentinel Hub APIs to integrate these analytics into your workflow, whether that’s building an EO-powered solution, doing data science in Python, or integrating with a GIS platform. Let’s now look at how to analyze this dataset further to identify areas with statistically significant deviations in Land Surface Temperature from a regional mean.

Analyzing time series using the Statistics API and Sentinel Hub Python SDK

To analyze Land Surface Temperature anomalies, we need the average Land Surface Temperature for every day in the past 5 years for our area of interest. To do this, we can use the Statistics API and the Sentinel Hub Python SDK: we provide an area of interest, a collection ID, and a time of interest, then load the response into a Pandas dataframe to create charts.


from sentinelhub import (
    CRS,
    DataCollection,
    Geometry,
    SentinelHubDownloadClient,
    SentinelHubStatistical,
    SHConfig,
)
from shapely.geometry import shape
import pandas as pd
import json

# Provide credentials to connect to Sentinel Hub
config = SHConfig.load()

# Define your Sentinel Hub Collection
collection_id = "INSERT COLLECTION ID" # Replace with a collection ID
data_collection = DataCollection.define_byoc(collection_id)
input_data = SentinelHubStatistical.input_data(data_collection)

# Define your area of interest (AOI) as GeoJSON in EPSG:3857
area_of_interest = '''{
    "type": "Polygon",
    "coordinates": [[
        [1051125.720214, 6853649.652372],
        [1052959.708825, 6886058.991507],
        [1073197.592251, 6886670.492844],
        [1075642.947505, 6853038.287526],
        [1051125.720214, 6853649.652372]
    ]]
}'''
area_of_interest = shape(json.loads(area_of_interest))

# Specify your time of interest (TOI)
time_of_interest = "2019-07-01", "2023-10-01"

# Specify a resolution in meters
resx = 100
resy = 100

# Provide an evalscript
time_series_evalscript_path = "LST_Time_Series.js" # the evalscript shown earlier
with open(time_series_evalscript_path, 'r') as file:
    time_series_evalscript = file.read()

# Create the requests
aggregation = SentinelHubStatistical.aggregation(
    evalscript=time_series_evalscript,
    time_interval=time_of_interest,
    aggregation_interval="P1D",
    resolution=(resx, resy),
)

request = SentinelHubStatistical(
    aggregation=aggregation,
    input_data=[input_data],
    geometry=Geometry(area_of_interest, crs=CRS("EPSG:3857")),
    config=config,
)

# Post the requests
download_requests = [request.download_list[0]]
client = SentinelHubDownloadClient(config=config)
stats_response = client.download(download_requests)

# Parse the response and load it into a pandas dataframe
series = pd.json_normalize(stats_response[0]["data"])

# Clean up columns in the dataframe by selecting ones to remove
keep_cols = [
    "interval.from",
    "outputs.eobrowserStats.bands.B0.stats.min",
    "outputs.eobrowserStats.bands.B0.stats.max",
    "outputs.eobrowserStats.bands.B0.stats.mean",
]
del_cols = [i for i in list(series) if i not in keep_cols]
# Drop unused columns and rename remaining columns
series = series.drop(columns=del_cols).rename(columns={
    'interval.from': 'date',
    'outputs.eobrowserStats.bands.B0.stats.min': 'minimum_lst',
    'outputs.eobrowserStats.bands.B0.stats.max': 'maximum_lst',
    'outputs.eobrowserStats.bands.B0.stats.mean': 'mean_lst',
})
# Calculate new columns
series["mean_lst"] = series["mean_lst"].astype(float)
series["date"] = pd.to_datetime(series['date'])
series["day_of_year"] = series.apply(lambda row: row["date"].dayofyear, axis=1)
series["year"] = series.apply(lambda row: row["date"].year, axis=1)
CPU times: total: 391 ms
Wall time: 28.1 s

date minimum_lst maximum_lst mean_lst day_of_year year
0 2019-07-01 295.079987 316.829987 301.833363 182 2019
1 2019-07-02 292.799988 312.269989 299.066470 183 2019
2 2019-07-03 295.329987 317.220001 302.064182 184 2019
3 2019-07-04 295.480011 315.109985 301.684987 185 2019
4 2019-07-05 295.209991 314.200012 301.149056 186 2019
... ... ... ... ... ... ...
1548 2023-09-26 293.160004 312.660004 301.140149 269 2023
1549 2023-09-27 294.010010 313.989990 302.586067 270 2023
1550 2023-09-28 292.859985 312.920013 301.139032 271 2023
1551 2023-09-29 289.239990 304.160004 295.336886 272 2023
1552 2023-09-30 289.839996 303.559998 295.905555 273 2023

1553 rows × 6 columns

In less than 30 seconds, this analyzed more than 1,500 days of observations across 250 square kilometers at 100 meter resolution, all without downloading any imagery. With the time series data, you can create charts to show the climate anomaly. We can calculate the mean and range for the prior 4 years, shown below as the black line and grey area, respectively. The teal line is a 10-day average of the current year’s Land Surface Temperature.
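As a sketch, the chart can be derived from the `series` dataframe built above with a groupby and a rolling mean (column names follow the earlier snippet; the 10-day window comes from the text):

```python
import pandas as pd


def climatology_and_anomaly(series, current_year=2023):
    """Compare the current year's LST against prior years' daily statistics."""
    prior = series[series["year"] < current_year]
    current = series[series["year"] == current_year]

    # Mean and range of prior-year LST per day of year (black line, grey band)
    clim = prior.groupby("day_of_year")["mean_lst"].agg(["mean", "min", "max"])

    # 10-day rolling average of the current year's LST (teal line)
    smoothed = (
        current.sort_values("day_of_year")
        .set_index("day_of_year")["mean_lst"]
        .rolling(10, min_periods=1)
        .mean()
        .rename("current_lst")
    )

    out = clim.join(smoothed)
    out["anomaly"] = out["current_lst"] - out["mean"]
    return out
```

The resulting dataframe has one row per day of year and can be plotted directly with any charting library.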

Land Surface Temperature near Hanover, Germany. Current year (teal) is compared to the mean (black) and range (grey) for 2019–2022.

Performing regional analysis with the Process API

In the above example, we looked at how to create a time series for a known area of interest. But if you are analyzing a larger region, you may need a map instead of a time series to highlight the locations with the most significant anomalies. To do this, we can use the Process API. We can adjust our evalscript to take a given date and calculate the climate anomaly for the prior 14 days. Our new evalscript works like this:

  • Filter for tiles in the prior 14 days, plus tiles from the same 14-day period in each of the prior 4 years.
  • Calculate the daily average Land Surface Temperature across the prior 4 years.
  • Calculate the difference between current-year values and the prior-years average for each day in the 14-day window.
  • Return the cumulative 14-day climate anomaly.
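The per-pixel arithmetic in the steps above can be sketched outside an evalscript as a plain Python function (the names are illustrative, not part of the Sentinel Hub API):

```python
def cumulative_anomaly(current_window, prior_windows):
    """Cumulative 14-day LST anomaly for one pixel.

    current_window: daily LST values for the prior 14 days of the current year.
    prior_windows: one 14-value sequence per prior year, covering the same
    14-day period.
    """
    # Daily average LST across the prior years
    daily_means = [sum(day) / len(day) for day in zip(*prior_windows)]
    # Sum of current-minus-historical differences over the window
    return sum(c - m for c, m in zip(current_window, daily_means))
```

A positive result means the window was warmer than the historical mean (red in the map below); a negative result means it was colder (blue).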

When this analysis is run for every day over a given year, we can create a time lapse which shows areas with the lowest and highest LST anomaly. Blue represents areas that have an average LST below the historical mean, while red represents areas with an average LST above the historical mean. The time series corresponds with the area outlined in black.

Time series plot (left) for the area of interest outlined on the map (right). The map shows the temperature anomaly on a given day for that location compared to prior 4 years.

How to get started?

If you already use Planetary Variables from Planet, you can sign up for a 30-day free trial of Sentinel Hub here. You can find information in our documentation on how to use EO Browser and Sentinel Hub APIs with Planetary Variables.

Interested in Planetary Variables? Contact Planet to learn more.