Planet, People and Pixels: A Data Pipeline to link Planet API to Google Earth Engine
Every once in a while, an idea takes root and becomes a tool for change. One of the most powerful consequences of Planet’s decision to image the whole world with a constellation of satellites was an unprecedented capability for analysis. As Planet opened up its data through the Open California program, the Ambassadors program, and now the Education and Research Program, there was a need for platforms capable of handling these analyses.
To handle such global-scale and often massive data sets, Google was already building and hosting a platform called Google Earth Engine (GEE), designed to analyze geospatial data.
The connection between the Planet API and GEE was an easy one to make. As a researcher and developer, my question was simple: would it be possible to build a pipeline that allows users to batch-download data sets and upload them to GEE for analysis? The following is a tool developed as a result of that inquiry, along with steps on how to use it.
If you ever hear the word CLI in relation to a programming language or tool, you can almost always assume it refers to a Command Line Interface, and I will use the term throughout this article. A CLI is simply a program you can run in a UNIX terminal or Windows command prompt, and it can easily be embedded in or called by other programs.
The things you need first (Housekeeping & Setup)
- You need a Planet account, which gives you access to your very own API key. If in doubt, use the link to go to your accounts page. (The free account gives you access to California data under a CC BY-SA 4.0 license.) You also need a Google Earth Engine account with access to the API.
- You need the Planet API and the Google Earth Engine API installed on your system.
To install the Python client library for the Planet API, you can simply type:
pip install planet
Since the Google Earth Engine API has a slightly longer install process and is updated frequently, you may want to install it by following the official instructions.
- The only dependency that does not install automatically using requirements.txt is GDAL. To install GDAL on Ubuntu:
sudo add-apt-repository ppa:ubuntugis/ppa && sudo apt-get update
sudo apt-get install gdal-bin
For Windows, I found a helpful guide from UCLA.
- To install the tool itself, go to the GitHub page for Planet-GEE-Pipeline-CLI. As always, two of my favorite operating systems are Windows and Linux. To install on Linux:
git clone https://github.com/samapriya/Planet-GEE-Pipeline-CLI.git
cd Planet-GEE-Pipeline-CLI && sudo python setup.py install
pip install -r requirements.txt
For Windows, download the zip, extract it, open a command prompt in the folder containing “setup.py”, and type:
python setup.py install
pip install -r requirements.txt
Extra goodies: I have also included an executable installer for Windows within the GitHub repository, so users who have native Python installed (tested on 2.7.13) should be able to just double-click and install the CLI.
You can also use the tool without any installation by simply going to the ppipe folder and typing:

python ppipe.py

If you have installed everything and your dependencies are met, you should be able to type ppipe in a command prompt and get a list of all the tools. Since I have designed the tool-set to include powerful add-on tools for Earth Engine as well, I will only cover the tools responsible for processing and uploading Planet assets, but feel free to explore the others.
Registering your Planet and Earth Engine API
The tool is designed to save your API key so that you don’t have to enter it again and again; entering a new key overrides the saved one.
To set a Planet key, enter your API key when asked for a password.
This setup also assumes you have registered your Google Earth Engine API by running “earthengine authenticate” in your command line or terminal window.
With everything set up, let’s get started. You only need to provide your Planet API key once, since it is saved for future use.
Getting the Area of Interest (AOI) Ready
The area of interest is often the first question you ask yourself when you want to observe something. It is guided by your own interest and expertise in the subject, the process you want to observe, and the time and methods needed to analyze such datasets. If you are interested in creating an initial area-of-interest file, you can simply go to geojson.io and, once you have defined your area of interest, click Save as GeoJSON (a map.geojson file is created).
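For reference, the file geojson.io saves is a standard GeoJSON FeatureCollection. A minimal example with a single rectangular polygon (the coordinates here are purely illustrative, not the exact AOI used below):

```json
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": {},
      "geometry": {
        "type": "Polygon",
        "coordinates": [[[-121.6, 36.6], [-121.4, 36.6],
                         [-121.4, 36.8], [-121.6, 36.8],
                         [-121.6, 36.6]]]
      }
    }
  ]
}
```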
For ease of use, I am choosing an area within California, since this is covered under the Open California license and should be easily accessible to anyone. I am providing both the map.geojson and the aoi.json files so that you can compare the file structures and replicate the same process. It is a good idea to go into Planet Explorer first and check your available area, so you can be sure that you have access to the area you are trying to download.
The aoijson tool within the ppipe bundle converts any existing KML, zipped shapefile, GeoJSON, WKT, or even Landsat tile reference into a structured GeoJSON file that the Planet API can read, adding filters such as cloud cover and a range of dates for use with the Planet Data API.
Let us say we want to convert this map.geojson to a structured aoi.json for June 1, 2017 to June 30, 2017, with a maximum cloud cover of 15%. We would pass the following command:
ppipe aoijson --start "2017-06-01" --end "2017-06-30" --cloud "0.15" --inputfile "GJSON" --geo "local path to map.geojson file" --loc "path where aoi.json output file will be created"
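The structured filter that aoijson writes resembles a Planet Data API v1 search filter. The sketch below, which follows the public Data API filter schema (ppipe's exact output layout may differ), shows how the geometry, date range, and cloud-cover constraints combine:

```python
# Build a Planet Data API v1 style search filter from an AOI geometry,
# a date range, and a maximum cloud cover. Field names follow the public
# Data API schema; this is an illustration, not ppipe's actual code.
def build_planet_filter(geometry, start, end, max_cloud):
    """Combine geometry, date-range and cloud-cover filters with AND."""
    return {
        "type": "AndFilter",
        "config": [
            {"type": "GeometryFilter", "field_name": "geometry",
             "config": geometry},
            {"type": "DateRangeFilter", "field_name": "acquired",
             "config": {"gte": start + "T00:00:00Z",
                        "lte": end + "T23:59:59Z"}},
            {"type": "RangeFilter", "field_name": "cloud_cover",
             "config": {"lte": max_cloud}},
        ],
    }

# e.g. build_planet_filter(aoi_geometry, "2017-06-01", "2017-06-30", 0.15)
```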
Activation and Download
The Data API activates assets only when requested by a user, rather than keeping all assets activated all the time. So, for the area of interest we created, the next tool we use is activatepl, which allows you to activate, or check the activation status of, the assets within our area of interest. You can request activation for any Planet asset; for now I am interested in just the PSOrthoTile analytic asset.
The setup for asset activation for aoi.json is:
ppipe activatepl --aoi "local path where you created the aoi.json file" --action activate --asst "PSOrthoTile analytic"
You can then periodically check the progress of activation:
ppipe activatepl --aoi "local path where you created the aoi.json file" --action check --asst "PSOrthoTile analytic"
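Under the hood, activation works against the public Planet Data API v1: each item exposes an assets endpoint, and POSTing to an asset's activation link moves it from inactive to activating to active. A minimal sketch (the helper names and the commented item id are illustrative, and ppipe's internal logic may differ):

```python
# Sketch of asset activation against the Planet Data API v1.
# The endpoint layout below is from the public Data API docs.
PLANET_API = "https://api.planet.com/data/v1"

def assets_url(item_type, item_id):
    """Build the per-item assets endpoint."""
    return "{}/item-types/{}/items/{}/assets".format(
        PLANET_API, item_type, item_id)

def activate_asset(session, item_type, item_id, asset_type="analytic"):
    """Request activation of one asset and return its current status
    (inactive -> activating -> active). `session` is an authenticated
    requests.Session (Planet API key as basic-auth username)."""
    assets = session.get(assets_url(item_type, item_id)).json()
    asset = assets[asset_type]
    if asset["status"] == "inactive":
        # POSTing to the activation link kicks off activation
        session.post(asset["_links"]["activate"])
    return asset["status"]

# Usage (requires a valid key and a real item id):
# import requests
# s = requests.Session(); s.auth = ("YOUR_PLANET_API_KEY", "")
# print(activate_asset(s, "PSOrthoTile", "some_item_id"))
```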
The next tool in our toolbelt is downloadpl, which downloads the activated assets; the tool takes a local path where your assets will be saved. Planet assets vary in size, so do make sure you have enough space available on your drive. The setup is:
ppipe downloadpl --aoi "local path where you created the aoi.json file" --asst "PSOrthoTile analytic" --pathway "local path to save your assets"
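Conceptually, the download step is simple: once an asset's status is "active", its Data API JSON carries a temporary "location" URL that can be streamed to disk. A Python 3 sketch (function and argument names are illustrative, not ppipe's actual API):

```python
# Stream one activated Planet asset to a local file.
# `location_url` is the temporary download URL an active asset exposes.
import os
import urllib.request

def download_asset(location_url, out_dir, filename):
    """Download a single asset and return the local file path."""
    os.makedirs(out_dir, exist_ok=True)
    out_path = os.path.join(out_dir, filename)
    urllib.request.urlretrieve(location_url, out_path)
    return out_path
```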
Let us Parse
Google Earth Engine requires the metadata files to be parsed so that they can be associated with the imagery and used to filter, or used during, analysis. The metadata tool allows you to parse any Planet asset, for now excluding Landsat and Sentinel imagery. The output is a CSV file to be used during uploading.
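As a rough sketch of what this parsing step involves, each downloaded *_metadata.json can be flattened into one CSV row. The property names below follow the Planet Data API item schema; the actual columns ppipe writes for Earth Engine may differ:

```python
# Flatten Planet *_metadata.json files in a folder into a CSV manifest.
# Field names follow the Data API item schema; illustrative only.
import csv
import glob
import json
import os

FIELDS = ["id_no", "acquired", "cloud_cover", "satellite_id"]

def metadata_rows(folder):
    """Yield one flat dict per *_metadata.json file in `folder`."""
    for path in sorted(glob.glob(os.path.join(folder, "*_metadata.json"))):
        with open(path) as f:
            item = json.load(f)
        props = item.get("properties", {})
        yield {"id_no": item.get("id", ""),
               "acquired": props.get("acquired", ""),
               "cloud_cover": props.get("cloud_cover", ""),
               "satellite_id": props.get("satellite_id", "")}

def write_manifest(folder, out_csv):
    """Write all parsed metadata rows to `out_csv`."""
    with open(out_csv, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(metadata_rows(folder))
```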
Uploading to Earth Engine
The last step pushes these assets to Google Earth Engine using the upload tool, whose setup brings together all the components we created so far:
ppipe upload --user "email@example.com" --source "Local path with tif files" --dest "users/assetlocation/collectionName" -m "metadata file.csv" --nodata 0 (nodata is optional)
Future projects and plans
I would like to be able to estimate the size of assets before downloading, along with the space left on your disk, to make sure a download will not fail; this will be included in a future iteration of the tool. For the area of interest we created, the estimated size of assets to be downloaded is about 10.57 GB.
Auto-suggest with tab completion for all tools will let you use them without needing to call the help command.
Building a Graphical User Interface (GUI) that wraps the CLI is another interesting application that many users may enjoy. You can test the GUI, which is available as a release.
Our experiments, our need to learn, and our drive to make things simpler and build upon what we know inspire the next step: the need to ask the big questions. What do you do with all the data? How do you play? How do pixels connect to people?
You can cite both tools using:
- Samapriya Roy. (2017, June 19). samapriya/Planet-GEE-Pipeline-CLI: Planet-GEE-Pipeline-CLI. Zenodo. http://doi.org/10.5281/zenodo.810548
- Samapriya Roy. (2017, June 25). samapriya/Planet-GEE-Pipeline-GUI: Planet-GEE-Pipeline-GUI. Zenodo. http://doi.org/10.5281/zenodo.817739
Thanks to Joseph Mascaro and Dana Bauer at Planet for comments & support. I am further grateful to Doug Edmonds at Indiana University for support and the Planet Ambassadors program for letting me play around with the data and the opportunity to learn. A portion of the work is supported by JetStream Grant TG-GEO160014.
For more questions about this project, and for further discussion, you
can email me. You can also reach out to Dana Bauer, who is part of
Planet’s Public API team, at firstname.lastname@example.org, or the Planet support
team at email@example.com.