On November 5–7 around 40 people gathered in Arlington, VA, with another 20 or so participating online, for a joint sprint on the SpatioTemporal Asset Catalog (STAC) and OGC API — Features (OAFeat) specifications. It was our fifth STAC sprint, and the second one we’ve done jointly with OAFeat (formerly WFS 3). I’m pleased to report it was a big success; to me it felt like the most productive one we’ve had yet. It was awesome to see everyone working away on so many diverse parts of the ecosystem. In this post I’ll give a brief overview of all that happened.
Though before I get into it, I do want to thank all our generous sponsors for their support — the event would not have been possible without them. IQT CosmiQ Works hosted us at their awesome office, Planet came in as a convening sponsor, OGC sponsored and helped out with logistics, Element 84 hosted the happy hour, Azavea was a supporting sponsor, and Radiant Earth Foundation helped with logistics.
Introductions and Kick-off
We kicked off the first day with a round of introductions, rotating between everyone gathered in person and those joining remotely. We had a number of people fly in from Europe and Canada, and remote participants joined from as far away as the Philippines, where it was 10:00 pm. It was great to see so many people from different backgrounds coming together to work on common standards, and it’s nice to see the communities grow with each sprint, welcoming in new faces.
The focus for the day was to kick off the joint work between STAC and Features API, particularly a powerful Filter language that both could use, and tackling a number of other parts of a full-fledged Query, like sorting, paging, requesting specific fields and more.
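To make the query pieces on the table concrete, here is a rough sketch of what a request combining paging, sorting, and field selection might look like. The endpoint URL and the exact parameter names (`limit`, `sortby`, `fields`) are assumptions based on the draft extensions under discussion, not final spec:

```python
from urllib.parse import urlencode

# Illustrative query combining the capabilities discussed at the sprint:
# paging, sorting, and requesting specific fields. Parameter names follow
# the draft extensions and may differ from what was finally adopted.
params = urlencode({
    "limit": 10,            # paging: items per page
    "sortby": "-datetime",  # sorting: newest first
    "fields": "id,geometry,properties.datetime",  # trim the response
})
# Hypothetical endpoint, for illustration only.
url = "https://example.com/collections/imagery/items?" + params
print(url)
```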
While sprints are a time for the specification experts to go deep on major topics together, they are also a prime opportunity to welcome new people into the community. Building on some ad-hoc sessions at the last STAC sprint, we decided to have four explicit ‘Beginner Sessions’, so those who are newer could get up to speed and ask all their newbie questions in a friendly environment. We kicked off with a walkthrough and Q&A of STAC, and then Rob Emanuele gave a great introduction to creating and working with STAC catalogs using PySTAC, so participants didn’t just get the theory of STAC but also some practical hands-on experience. We streamed out all the sessions, but only managed to record the PySTAC one, which you can see on YouTube.
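For readers who couldn’t attend the beginner sessions, the core idea is small: a STAC Item is just a GeoJSON Feature with a few extra fields. Below is a minimal sketch using only the standard library; the id, bbox, and asset URL are made up for illustration, and the `stac_version` shown is roughly what was current at the time of the sprint:

```python
import json
from datetime import datetime, timezone

# A minimal STAC Item: a GeoJSON Feature plus a handful of STAC fields.
# All specific values here (id, coordinates, asset href) are invented.
item = {
    "type": "Feature",
    "stac_version": "0.8.1",  # approximately current at sprint time
    "id": "example-scene-001",
    "geometry": {
        "type": "Polygon",
        "coordinates": [[
            [-77.1, 38.8], [-77.0, 38.8], [-77.0, 38.9],
            [-77.1, 38.9], [-77.1, 38.8],
        ]],
    },
    "bbox": [-77.1, 38.8, -77.0, 38.9],
    "properties": {
        # datetime is the one required property
        "datetime": datetime(2019, 11, 5, tzinfo=timezone.utc).isoformat(),
    },
    "assets": {
        "visual": {
            "href": "https://example.com/scenes/001.tif",
            "type": "image/tiff; application=geotiff; profile=cloud-optimized",
        }
    },
    "links": [],
}

print(json.dumps(item, indent=2))
```

Tools like PySTAC wrap this structure in Python classes and handle the catalog and link bookkeeping for you.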
On day two we repeated the structure of walkthrough and then hands-on, but this time Peter Vretanos gave the overview of OGC API — Features, and Tom Kralidis did a great introduction to pygeoapi, a leading OAFeat implementation (which actually implemented STAC during the sprint!). The sessions overall seemed to work quite well, and hopefully at the next sprint we’ll have even more sessions diving deep into the various tools people can use.
Another new tradition that started at the last sprint and continued this time was a happy hour, hosted by Element 84 at their office in Alexandria. And in our effort to make these events more remote-friendly, this time we managed to record the lightning talks! The video mostly shows the presenter, so I’ve included a link to the slide decks where possible, so you can follow along.
- Aaron Su talked about all the machine learning projects at Azavea and how they use STAC at the core (slides).
- Matt Hanson gave an overview of the core STAC ecosystem projects he and others work on at Element84.
- Tim Schaub presented on the support of OGC API — Features at Planet, particularly in Analytic Feeds, and the funding of GDAL and QGIS plugins (slides)
- James Banting talked about putting Radarsat data into STAC (available at radarstac.com) and other STAC work SparkGeo is doing (slides).
- Janne Heikkilä shared the Java OGC API — Features implementation that the National Land Survey of Finland has built.
- Renee Pieschke shared the latest on Landsat’s adoption of STAC, how they are migrating to the cloud and will put the entire Landsat collection in STAC, including past missions as well as the newest products like surface reflectance and surface temperature.
- Tom Kralidis presented on pygeoapi and its implementation of OGC API standards, plus its use at the Meteorological Service of Canada (slides — he gave the top-level ones for this talk).
- Alexandra Kirk talked about using STAC for collaboration around agriculture use cases that Climate Corporation has been tackling.
- Alexander Frank gave an overview of use of STAC at Maxar to centralize their inventory of disparate catalogs.
It was amazing to see the diversity of perspectives coming together around standards, and how these two very young specifications are already tackling major real-world problems.
We also joined the GeoDC Meetup on the next night, sharing about STAC and OGC API. It was great to interact with the local geospatial community, and we hope to repeat that at future events.
It would take a lot of text to recap all that happened at the sprint, with so many great discussions throughout. I think the major results will speak for themselves, in terms of improvements to the STAC and Features API specifications and an ever wider ecosystem of implementations. But I’ll quickly go through some of the highlights. There are rough notes on everything we talked about, with lots of links to the work that was done. And we also recorded the full session (though we unfortunately missed the very beginning: an update from Peter Vretanos on work he did on his server and the OGC API — Features specification); you can watch it on YouTube.
pygeoapi — As mentioned above, pygeoapi implemented STAC support during the sprint, which was awesome to see come together. Several other features were also worked on, including cross-collection search, many-to-many feature-to-collection connections, improvements to the PostgreSQL provider, and more. Tom also implemented one of the things I’ve wanted to see: automatically generating an OGC API — Catalog from the collections’ metadata.
Ordnance Survey put together a Java Spring Boot server implementing OGC API — Features that serves up OS data. A prototype is up at os-ogc-features-api.azurewebsites.net/ and Ordnance Survey seems quite committed to continuing in this direction. It was awesome to see that the spec is approachable enough that a complete server could be built from scratch in less than three days.
Franklin, from an awesome team at Azavea, blew me away with the completeness and polish that emerged during the sprint. The team leveraged GeoTrellis and PostGIS to create a full dynamic server that can serve up static STAC catalogs as both STAC and OGC API — Features. Before the sprint they put up a great post on their STAC work, and I look forward to an update detailing all they did at this latest sprint.
STAC Validation — James Banting and Alex Kirk made a number of great improvements to the STAC validation tools and hooked them up to the continuous integration in the specification.
sat-api-pg was a project Development Seed had been working on, and they used the sprint to get the code all ready to release as open source! It provides a full STAC implementation backed by PostGIS instead of Elasticsearch, which many of the first STAC servers used. You can read more at https://medium.com/devseed/sat-api-pg-a-postgres-stac-api-af605cafd88d
Astraea stood up MODIS data as Cloud Optimized GeoTIFFs as part of Earth on AWS, and indexed it into the Astraea public STAC API. Their API also contains Landsat-8 and Sentinel-2 L1C and L2A, and drives Earth OnDemand.
Resto and Rocket — The biggest (and most awesome) surprise of the sprint for me was the amazing work by Jerome Gasperi of SnapPlanet. Rocket is one of the coolest STAC clients I’ve seen, and almost every service at the sprint was tested with it — not just STAC but also pure Features API services. And Resto is a robust metadata server with a wide user base that recently got STAC and OGC API — Features support.
Spacebel showed off their awesome work on aligning various specifications in their server, with a great presentation on future work for even better alignment. Their server at databio.spacebel.be/eo-features/ implements OGC API — Features and OpenSearch for EO, and they added STAC during the sprint and got it working with various clients.
MLHub Catalogs — Kevin Booth from Radiant Earth Foundation stood up two static catalogs with their MLHub training data, one on OSM Generated Training Data and the other Landcover Classification / Building Footprints / African Crops.
GDAL — Even Rouault advanced the GDAL OGC API — Features driver to support the new specification ideas discussed and implemented during the sprint, including using XML Schema or JSON Schema to get the structure of the data, using a Queryables endpoint, and using CQL as a filter language to filter on the server side.
nls-fi Features server implemented support for the two filter-language variants discussed: json-filter-expr and cql-json-array. You can see a sample request: tieviiva (road link) features in bbox 24.00,66.00,24.05,66.05 with kohdeluokka >= 121111 and kohdeluokka <= 12132.
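To give a feel for what such a filtered request looks like on the wire, here is a sketch that builds the bbox-plus-attribute query above. The base URL, the `filter`/`filter-lang` parameter names, and the exact JSON encoding of the expression are assumptions for illustration; the real nls-fi endpoint and the two variants’ syntax may differ:

```python
import json
from urllib.parse import urlencode

# Hypothetical items endpoint for the tieviiva (road link) collection.
base = "https://example.com/ogcapi/collections/tieviiva/items"

# bbox is a standard OGC API - Features query parameter.
bbox = [24.00, 66.00, 24.05, 66.05]

# One illustrative JSON encoding of
# "kohdeluokka >= 121111 AND kohdeluokka <= 12132";
# not the exact syntax of either filter-language variant.
filter_expr = {
    "and": [
        {"gte": [{"property": "kohdeluokka"}, 121111]},
        {"lte": [{"property": "kohdeluokka"}, 12132]},
    ]
}

params = urlencode({
    "bbox": ",".join(str(v) for v in bbox),
    "filter-lang": "json-filter-expr",
    "filter": json.dumps(filter_expr),
})
url = f"{base}?{params}"
print(url)
```

The server evaluates the filter and returns only the matching features, so the client never has to download and sift the whole collection.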
Phew, that was a lot of amazing implementation work, and this post is getting quite long. I think we’ll hold off on the specification work, and will hopefully detail it in posts for the next releases of the standards. Working on the OAFeat and STAC specifications in conjunction was really great, and nearly all the core things we set out to do saw huge progress. STAC had a ton of great discussion and mapped a path to get to 1.0-beta early next year. And the next set of powerful extensions for OAFeat is really coming together. We even snuck in some productive conversations about OGC API — Catalog, which is starting up in earnest. We’ll share more about all the specification improvements soon.