Spatially align a time-series stack of ICEYE SAR images with a dockerized ESA SNAP routine

Arnaud Dupeyrat
ICEYE Analytics
6 min read · Jan 11, 2022

This is the third blog of a five-part series on the AI4SAR project. If you haven’t heard about AI4SAR yet, now is a good time to see our project page!

After the release of the icecube toolkit, we decided to develop a dockerized Sentinel Application Platform (SNAP) coregistration routine to reduce the burden of spatially aligning a stack of time-series synthetic aperture radar (SAR) images.

In this blog, we will see how you can coregister ICEYE images using SNAP without the burden of setting up your environment and installing additional software. At the end of this blog, you’ll learn how to get a georeferenced product with a single line of code.

AI4SAR, a project sponsored by ESA Φ-lab, is an attempt to lower the entry barrier to SAR-based machine learning applications.

Image by Ajda ATZ on Unsplash

But what is coregistration?

Coregistration is a time-consuming but foundational pre-processing step that you must perform when you work with time-series stacks of SAR images. It ensures that features in one image overlap with those in the other images that will be stacked together.

Precise coregistration of SAR images is a nontrivial task because a change in the acquisition geometry generates image shifts that, in principle, depend on the topography.

While we were preparing to publish this blog, we took the opportunity to release a time-series stack on the ICEYE website so that our community can have access to a real dataset to test the icecube toolkit.

To illustrate this shift, let’s use two images from this public dataset that we released in November 2021. Image ID 71820 is the primary or baseline image and image ID 73908 is the secondary image. If you try to overlay the different rasters on top of each other, you will observe some interesting effects.

For example, if you build a two-band composite where the first layer is the primary image (green channel) and the second layer is the secondary image (red channel), you will notice a 3D effect around the edges. This effect indicates that the primary image is not aligned with the secondary image.

(Image by ICEYE) Illustrates the 3D effect you notice around the edges if you overlay two rasters without spatially aligning them. The green tinge marks the edges of the first image and the red tinge marks those of the second image.
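Such a composite is straightforward to build once the two rasters are in memory as arrays (read with rasterio or a similar library — not shown here). The sketch below uses a hypothetical helper, `two_band_composite`, with NumPy only; misaligned edges show up as the red/green fringes described above.

```python
import numpy as np

def two_band_composite(primary: np.ndarray, secondary: np.ndarray) -> np.ndarray:
    """Build an RGB composite: secondary -> red, primary -> green, blue empty.

    Both arrays must share the same shape. Red/green fringes in the
    result indicate misalignment between the two acquisitions.
    """
    def normalize(band):
        band = band.astype(np.float32)
        lo, hi = np.percentile(band, (2, 98))  # clip outliers for display
        return np.clip((band - lo) / (hi - lo + 1e-9), 0.0, 1.0)

    rgb = np.zeros((*primary.shape, 3), dtype=np.float32)
    rgb[..., 0] = normalize(secondary)  # red channel: secondary image
    rgb[..., 1] = normalize(primary)    # green channel: primary image
    return rgb
```
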

But after coregistration, the edges align with each other and the new composite of the primary image (green channel) and the secondary (coregistered) image (red channel) is clear.

(Image by ICEYE) Illustrates the sharpness of the edges after spatially aligning the two images.

Hopefully, these two images helped you grasp why coregistration is a necessary step in a time-series analysis. In fact, even a slight shift of a few meters between two rasters can ruin your analysis because the pixels won’t represent the same object on the ground.

Broadly speaking, coregistration involves identifying common features, such as road junctions or stream crossings, in the primary image and then warping the other images (that is, those to be coregistered) so that their features match those of the primary image.
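To get an intuition for the offset-estimation part of this, here is a minimal NumPy sketch of phase correlation, which recovers a single global, integer pixel shift. This is only an illustration: real SAR coregistration (as in SNAP) estimates many local tie points and warps the image, accounting for topography. The helper name `estimate_shift` is ours.

```python
import numpy as np

def estimate_shift(primary: np.ndarray, secondary: np.ndarray) -> tuple:
    """Estimate the integer (row, col) displacement of `secondary`
    relative to `primary` using phase correlation."""
    f1 = np.fft.fft2(primary)
    f2 = np.fft.fft2(secondary)
    # Normalized cross-power spectrum; its inverse FFT peaks at the offset
    cross_power = f2 * np.conj(f1)
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Offsets beyond half the image size wrap around to negative shifts
    return tuple(int(p) if p <= s // 2 else int(p - s)
                 for p, s in zip(peak, corr.shape))
```
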

Dockerizing SNAP

SNAP is an open-source, common architecture for all ESA toolboxes that is ideal for the exploitation of Earth Observation (EO) data. While SNAP is primarily used to manipulate Sentinel data, you can use it for ICEYE data as well.

Developing a layer on top of SNAP to coregister ICEYE images was a natural progression of our ongoing effort to leverage the existing open-source tools and strengthen our collaboration with ESA Φ-lab.

But why are we dockerizing the solution if you can download SNAP and run the coregistration function from its UI?

The primary motivation of the AI4SAR project is to lower the entry barrier. We are aware that not all icecube users/ML practitioners are familiar with remote sensing products and pre-processing steps, such as coregistration, so we wanted to reduce the number of operations that they must perform before they get to the icecube toolkit.

In the same vein is our need to provide an end-to-end solution. The dockerized routine includes a module that maps the metadata of the coregistered images so that it complies with the ICEYE product guidelines. This way, our users don't have to deal with the intricacies of proprietary data. Finally, we want the coregistration step to be integrated into different pipelines, and that's easily achieved through a command line interface (CLI).
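As a sketch of that kind of pipeline integration, the CLI call could be wrapped from Python. The helper name `coregister` and the file names are placeholders, and the command mirrors the `docker run` invocation documented in the steps further down; it assumes Docker and the pulled image are available locally.

```python
import subprocess
from pathlib import Path

def coregister(primary: str, secondary: str, output: str,
               workdir: str = ".", run: bool = True) -> list:
    """Build (and optionally execute) the dockerized SNAP coregistration call.

    File names are taken relative to `workdir`, which is bind-mounted
    into the container at /workspace.
    """
    cmd = [
        "docker", "run",
        "-v", f"{Path(workdir).resolve()}:/workspace",
        "iceyeltd/snap_coregister:latest",
        "-pp", f"workspace/{primary}",
        "-sp", f"workspace/{secondary}",
        "-op", f"workspace/{output}",
    ]
    if run:
        subprocess.run(cmd, check=True)  # raises if the container exits nonzero
    return cmd
```
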

You can use any method to coregister the images, and you will still be able to use your coregistered images with the icecube toolkit. The idea with this dockerized routine is to open-source the complete processing chain (coregistration -> stacking -> tiling) so that anyone can analyze ICEYE time-series data (and you have a free dataset to play with!).

Let’s now look at what’s happening behind the scenes…

The Docker image uses the SNAP 8.0 graph builder to coregister two images. You can choose one of three coregistration methods (dem_assisted, cross_corr, or stack), each represented by a different SNAP execution graph inside the container.

When coregistration is done, a metadata wrapper assigns the correct metadata to the coregistered images. The following image illustrates this process.

(Image by ICEYE) Illustrates the dockerized coregistration process, including the georeferenced output that you can load into the icecube toolkit or your geospatial application of choice.

Coregistering two images

Perform the following steps to coregister two images:

DEM-assisted is the default coregistration method. For more information on this method, see this IEEE journal paper.

  1. Download the publicly available dataset comprising four ICEYE images taken within 12 days.
  2. Select any two GRD images from the dataset and move them to one folder. For example, ICEYE_GRD_SM_71820_20210722T055012 and ICEYE_GRD_SM_73908_20210727T055021.
  3. Pull the snap_coregister image from Docker Hub. The Docker image requires 16 GB of RAM. (Change the image tag from latest to gpt27go if you have 32 GB of RAM.)

```shell
docker pull iceyeltd/snap_coregister:latest
```

4. Type the following commands to coregister the GRD images.

```shell
cd path/to/your/images
docker run -v $(pwd):/workspace iceyeltd/snap_coregister:latest \
    -pp workspace/name_file_primary \
    -sp workspace/name_file_secondary \
    -op workspace/name_output.tif
```

-pp: primary path
-sp: secondary path
-op: output path

Note: You can also write --primary_path instead of -pp, --secondary_path instead of -sp, and --output_path instead of -op.

Extra options:

-ct, --coregistration_type = [stack|dem_assisted|cross_corr]
-cp, --config_path = local path to the config yaml

5. (Optional) Use coregistration_type to change the default coregistration method; you can set it to cross_corr or stack. Additionally, you can use config_path (a local file path) to supply an edited config.yaml file that contains the parameters associated with each method.
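For example, a custom config.yaml could be generated programmatically before mounting it into the container. This is only a sketch: the key names mirror the defaults listed below, and we assume BICUBIC_INTERPOLATION is among the resampling methods your SNAP version accepts.

```python
# Sketch: write a config.yaml that overrides the DEM resampling method
# for the dem_assisted coregistration graph.
config_text = """\
dem_assisted_params:
  demResamplingMethod: BICUBIC_INTERPOLATION
  resamplingType: BILINEAR_INTERPOLATION
"""

with open("config.yaml", "w") as fh:
    fh.write(config_text)
```

You would then pass the file to the container with `-cp` / `--config_path`.
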

Here are the default parameters for each coregistration method.

For more information on each parameter, see the SNAP help content.

(GRD and SLC) For the DEM-assisted method, you can modify the following parameters.

```yaml
dem_assisted_params:
  demResamplingMethod: BILINEAR_INTERPOLATION
  resamplingType: BILINEAR_INTERPOLATION
```

(GRD) For the cross-correlation method, you can modify the following parameters.

```yaml
cross_correlation_params:
  nb_gcp: 8000
  coarseRegistrationWindowWidth: 128
  coarseRegistrationWindowHeight: 128
  rowInterpFactor: 4
  columnInterpFactor: 4
  maxIteration: 2
  onlyGCPsOnLand: false
  computeOffset: true
  gcpTolerance: 0.25

wrap_params:
  rmsThreshold: 0.05
  warpPolynomialOrder: 1
  interpolationMethod: Cubic convolution (6 points)
```

(SLC) For the cross-correlation method, you can modify the following parameters.

```yaml
cross_correlation_params:
  nb_gcp: 10000
  coarseRegistrationWindowWidth: 128
  coarseRegistrationWindowHeight: 128
  rowInterpFactor: 4
  columnInterpFactor: 4
  maxIteration: 2
  onlyGCPsOnLand: false
  computeOffset: true
  gcpTolerance: 0.5
  fineRegistrationWindowWidth: 32
  fineRegistrationWindowHeight: 32
  fineRegistrationWindowAccAzimuth: 16
  fineRegistrationWindowAccRange: 16
  fineRegistrationOversampling: 16
  applyFineRegistration: true

wrap_params:
  rmsThreshold: 0.05
  warpPolynomialOrder: 2
  interpolationMethod: Cubic convolution (6 points)
```

Wrapping up

We hope that the dockerized SNAP coregistration routine helps you get a georeferenced output with a single line of code and without installing additional dependencies. Note, however, that the routine is a wrapper around SNAP: its intent is to make the process a bit easier, not to replace SNAP.

If this blog piqued your interest, jump right to this detailed notebook to build your first ICEcube now!
