Boosted by the European Space Agency: from idea to MVP (part 2)

Daniel Meppiel · Published in Wegaw · 5 min read · Dec 20, 2018

After being awarded an ESA contract for a Feasibility Study (read more on how we got there in part 1), we baptized the project and the derived product "DeFROST" and started the hard work: proving the technical and, most importantly, the economic viability of such a product.

To do that, an ESA Feasibility Study typically structures the work into four major blocks: analysing and defining the user requirements, defining the required system and service, evaluating the economic viability by producing a business model and, finally, developing a prototype.

What do users of DeFROST need?

Within the feasibility study, we targeted two user segments that could benefit from accurate data on terrain conditions, particularly snow (snow cover, snow depth and the related avalanche risk; read below on why we focused on snow first): tourism offices, also known as Destination Marketing Organizations (DMOs), and Digital Outdoor Platforms (DOPs), i.e. providers of B2C digital products for outdoor activity planning (e.g. FATMAP, Komoot or AllTrails).

Use case of DeFROST for tourism offices

After several user interview sessions, in which we applied the well-known Lean Startup methodology combined with Ash Maurya's frameworks, we got a good first picture of what would deliver the most value for a reasonable effort. In short, users need:

  • Frequently updated, high-resolution data on multiple terrain conditions
  • An easy way for non-experts to interpret such data, and
  • Smart integrations, based on that data, with their existing operational processes and products.

The system and service foundation

As we learned what users cared about, we designed a system and service architecture optimized for extensibility, scalability and maintainability: at the early stage of a new venture, requirements can change quickly!

The designed system focuses on aggregating data from three primary sources (a code sketch follows the list):

  • Remote-sensing raw data (satellite, drones, or any other aerial source)
  • Ground data from ground sensors
  • Meteorological data on past, current and future weather
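
Here is a minimal Python sketch of what such an aggregation layer could look like; all names, types and grid shapes below are our own illustrative assumptions, not the actual DeFROST code:

```python
from dataclasses import dataclass
from datetime import date

import numpy as np

# Illustrative bounding box type: (min_lon, min_lat, max_lon, max_lat).
BBox = tuple[float, float, float, float]


@dataclass
class Layer:
    """One gridded observation layer for a given area and day."""
    name: str
    grid: np.ndarray  # H x W raster; np.nan marks data gaps


def fetch_satellite(bbox: BBox, day: date) -> Layer:
    # Placeholder: a real implementation would query an imagery archive
    # (e.g. Sentinel-2 scenes) and derive a snow mask from it.
    return Layer("satellite_snow", np.random.rand(512, 512))


def fetch_ground(bbox: BBox, day: date) -> Layer:
    # Placeholder: interpolate point measurements from ground sensors
    # onto the same grid.
    return Layer("ground_snow_depth", np.random.rand(512, 512))


def fetch_weather(bbox: BBox, day: date) -> Layer:
    # Placeholder: resample the output of a numerical weather model.
    return Layer("weather_temperature", np.random.rand(512, 512))


def aggregate(bbox: BBox, day: date) -> dict[str, np.ndarray]:
    """Collect the three primary sources on a common grid, ready for fusion."""
    layers = (fetch_satellite(bbox, day),
              fetch_ground(bbox, day),
              fetch_weather(bbox, day))
    return {layer.name: layer.grid for layer in layers}
```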

This raw data needs further processing and merging in order to extract meaningful value from it and, ultimately, integrate it into specific use cases. In particular, we identified Machine Learning as very promising for coping with gaps in the raw data and for complementing traditional analysis techniques on Earth Observation data (e.g. examples from Amazon Web Services, or Sinergise's eo-learn, both following an open source strategy).
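
As a toy example of the gap-filling idea (not our production pipeline), one can train a regressor on the pixels that are observed and use it to predict the pixels hidden by clouds; the synthetic layers and the model choice below are assumptions made purely for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Build synthetic co-located layers; in practice these would come from
# the aggregation step above.
rng = np.random.default_rng(42)
elevation = rng.random((128, 128))
temperature = rng.random((128, 128))
snow = 0.7 * elevation - 0.3 * temperature + 0.05 * rng.random((128, 128))
snow[rng.random((128, 128)) < 0.3] = np.nan  # simulate cloud-induced gaps

# Flatten rasters into a per-pixel feature matrix and target vector.
features = np.stack([elevation.ravel(), temperature.ravel()], axis=1)
target = snow.ravel().copy()
observed = ~np.isnan(target)

# Train on observed pixels, then predict the gap pixels.
model = RandomForestRegressor(n_estimators=50, n_jobs=-1)
model.fit(features[observed], target[observed])
target[~observed] = model.predict(features[~observed])
gapless_snow = target.reshape(snow.shape)
```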

Service & system overview: blocks marked in bold reflect high added value areas for users of DeFROST

With so much raw data freely available and more and more open source tools for its processing, the industry is booming!

The business model

Thanks to early-stage engagements with potential customers willing to take part in the Feasibility Study (a key aspect of this ESA programme), we were able to iterate on and test the riskiest assumptions of our business model, in the "leanest" style possible.

Of course, the journey included a couple of pivots and many more iterations! In essence, the product(s) developed on top of the common technology stack are to be served via an API for advanced customers (charged per use), with ready-to-go "plug-in" integrations, as well as a standalone, cloud-based service that works out of the box (subscription model).

The developed prototype

During the study, we developed a prototype intended to prove the most delicate assumptions behind the future system and service. For this prototype, we focused on a single terrain condition of great interest to the tourism industry in our home country, Switzerland: snow (its extent, its depth and the related avalanche risk). In technical terms, the main challenges were processing the large amounts of data involved, achieving the desired spatio-temporal resolution and overcoming data gaps.

When we started our endeavour, many voices strongly doubted the technical viability of producing snow cover extent data at 30 m resolution, daily, based solely on Earth Observation data. During the feasibility study, the widely known challenges involved were each given a plausible solution.

These challenges were specifically analysed by the team at ExoLabs GmbH, whom we subcontracted for the Earth Observation study. Kudos to the team! Lessons learned: combine multiple sources of data, treat special cases separately and fill data gaps (and tackle complex issues in the data) with machine learning... and you're good to go!
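
In code, the "combine multiple sources and treat special cases separately" part of that recipe can start as simply as a per-pixel fallback rule; here is a hedged sketch, where the sensor roles and the open-water special case are illustrative choices of ours:

```python
import numpy as np

def merge_snow_cover(primary: np.ndarray,
                     secondary: np.ndarray,
                     water_mask: np.ndarray) -> np.ndarray:
    """Per-pixel fusion of two snow-cover rasters on a common grid:
    prefer the primary (finer) sensor, fall back to the secondary one
    in its gaps, and treat open water as a special case (never snow).
    Pixels still set to np.nan afterwards go to the ML gap-filling step."""
    merged = np.where(np.isnan(primary), secondary, primary)
    merged = np.where(water_mask, 0.0, merged)
    return merged

# e.g. merged = merge_snow_cover(fine_sensor_mask, coarse_sensor_mask, water)
```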

In the case of avalanche risk and snow depth, we partnered with one of the global leaders in the field, the WSL Institute for Snow and Avalanche Research (SLF), based in Davos, Switzerland. With access to their 200 monitoring ground sensors in the Alps, we were able to identify and define specific work packages to increase the resolution of both SLF datasets and, eventually, open them programmatically to commercial services via an API.
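
To give a feel for what "programmatically via an API" could mean for a customer, here is a purely hypothetical client sketch; the endpoint, parameters and response schema are invented for illustration and do not correspond to a published DeFROST or SLF API:

```python
import requests

# Hypothetical query for snow depth and avalanche risk at a point in Davos.
response = requests.get(
    "https://api.example.com/v1/snow-depth",
    params={"lat": 46.80, "lon": 9.84, "date": "2018-12-20"},
    headers={"Authorization": "Bearer <your-token>"},
    timeout=10,
)
response.raise_for_status()
print(response.json())  # e.g. {"snow_depth_cm": ..., "avalanche_risk": ...}
```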

https://cloud.google.com/developers/startups/

The entire stack was set to run in the cloud, specifically on a Kubernetes cluster hosted on Google Cloud Platform, which accepted WeGaw into its startup SURGE programme. Thanks to GCP and Kubernetes, the stack demonstrated the feasibility of processing the data volumes involved with a smart use of IT resources (OPEX optimization), and it is ready to scale.
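
One way to translate "smart use of IT resources" into practice on Kubernetes is to run batch workloads as short-lived Jobs rather than always-on services, so compute is only paid for while processing runs. A minimal sketch using the official Kubernetes Python client follows; the cluster setup, image name and arguments are our own illustrative assumptions, not our actual deployment:

```python
from kubernetes import client, config

# Load credentials from the local kubeconfig (e.g. a GKE cluster context).
config.load_kube_config()

# A short-lived batch Job for one day's snow processing.
job = client.V1Job(
    api_version="batch/v1",
    kind="Job",
    metadata=client.V1ObjectMeta(name="snow-processing-2018-12-20"),
    spec=client.V1JobSpec(
        template=client.V1PodTemplateSpec(
            spec=client.V1PodSpec(
                restart_policy="Never",
                containers=[client.V1Container(
                    name="processor",
                    image="gcr.io/example-project/snow-processor:latest",
                    args=["--date", "2018-12-20"],
                )],
            ),
        ),
        backoff_limit=2,  # retry a couple of times on failure
    ),
)

client.BatchV1Api().create_namespaced_job(namespace="default", body=job)
```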

The way forward is Big Data, ML & Open Source

The sheer amount of remote sensing, ground and weather data now available is starting to enable new business applications, of which DeFROST is just one of dozens yet to come.

As “Space Data becomes the new Big Data”, with ESA equipping Earth Observation satellites with AI processors, and more and more open source remote sensing and ML algorithms and libraries being released to squeeze value out of it (e.g. Let It Snow by CESBIO, s2cloudless by Sinergise, Raster Vision by Azavea and many more), it is clear the industry is set to go large thanks to the increasing ease of access to, and treatment of, high-quality data.

At WeGaw we are convinced that the value for our customers lies in the use cases the data is applied to in an integrated manner (domain expertise), as well as in the provision of such capabilities "as a service". As we move from MVP to Operational Product, we will support the remote-sensing and machine-learning open-source communities by building on top of open-sourced libraries and algorithms and contributing to their improvement as much as possible. Stay tuned for our upcoming releases!

Daniel Meppiel · CTO & Co-Founder at WeGaw · Building products powered by Satellite Data