Once I had a dream of unlimited access to satellite network data. At first, I thought it was something out of science fiction or Tron, but then I realized it could become a reality, and that's why we started Starmesh.
“I tried to picture clusters of information as they moved through the computer”
My first dream took shape in 2016 as ☄️ a fireball monitoring system for Earth. Quite unrealistic at the time: I didn't find any open data available, and the customer base was narrow, essentially a B2G model.
Later, I found that ESA was up to something, and others are working on it nowadays: ☄️ Fireball and Bolide Data from NASA. ESA even set up a dedicated asteroid deflection mission called Hera, which I consider highly crucial for Earth's safety (Payam Banazadeh). I realized we have no mitigation system for asteroids, nor a good global warning system if anything goes wrong tomorrow.
The ultimate vision of Starmesh is to democratize access to cryptographically verifiable earth observation data. We are aiming to unlock satellite data access to the decentralized financial networks of the future.
Can you imagine trading tokenized oil futures based on earth observation data, with the extracted value natively interoperable with the smart-contract platform of your choice?
Technical Reality Check
When the idea came back to me four years later, I set out to explore what had changed in the technology since then. An ecosystem of builders had emerged on top of Substrate. In parallel, IPFS had developed rapidly, and Filecoin had launched its first testnet. The idea could become real. You may ask why?
To build a protocol that can handle massive geo datasets, we need a fast consensus layer and an efficient storage layer. The emerging Substrate ecosystem fascinated us with its great development tooling and scalability potential. Substrate-based parachains offer fast smart-contract execution and the ability to interact with external software via off-chain workers.
Substrate fits the consensus layer perfectly, with all its bells and whistles. We love it. In the meantime we built a frontend, KodaDot, which will be reused as our entry interface for accessing the web3 world through Starmesh.
IPFS is excellent; it has all the features we need: content-addressable storage, data structures (IPLD), and, for bigger datasets, deduplication. Deduplication matters because you don't want to push 7 GB of raw data around the globe if you can save bandwidth and store it closer to the customer, with the location determined by the network itself.
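IPFS itself uses content-defined chunking and Merkle DAGs; the toy sketch below simplifies that to fixed-size chunks and plain SHA-256 (my simplification, not the real chunker) just to show why deduplication lets a shared region of two scenes be stored only once.

```python
import hashlib

CHUNK_SIZE = 4  # tiny chunks for illustration; real chunks are hundreds of KiB

def chunk_hashes(data: bytes) -> list[str]:
    """Split data into fixed-size chunks and hash each one."""
    return [
        hashlib.sha256(data[i:i + CHUNK_SIZE]).hexdigest()
        for i in range(0, len(data), CHUNK_SIZE)
    ]

# Two "scenes" that share most of their raw payload.
scene_a = b"AAAABBBBCCCCDDDD"
scene_b = b"AAAABBBBCCCCEEEE"  # differs only in the last chunk

# A content-addressed store keeps each unique chunk once.
store = set(chunk_hashes(scene_a)) | set(chunk_hashes(scene_b))
print(len(store))  # 5 unique chunks stored instead of 8
```

The same principle is what makes shipping only the changed tiles of a large archive possible.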
IPFS is excellent, but it is missing an incentive layer. Numerous centralized services like Pinata, Temporal, and others do a great job of storing data in a distributed way across various data centres, but like any centralized infrastructure, they lack transparency and suffer from incentive asymmetry. Some sort of crypto-economics is missing. We wanted storage management to be better and more distributed.
We went for Filecoin and its Storage Market Deals. It is quite an impressive piece of software, with deal phases of Discovery, Negotiation, Publishing, and Handoff. It's still amazing despite the missing order-book part, which we aim to fill with Substrate smart contracts and potentially automated market makers.
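The real deal flow lives in Filecoin's markets modules; the sketch below is only a toy state machine over the four phase names mentioned above, showing how a deal progresses through them in strict order.

```python
from enum import Enum

class DealPhase(Enum):
    DISCOVERY = 1    # client finds candidate storage providers
    NEGOTIATION = 2  # client and provider agree on price and terms
    PUBLISHING = 3   # the signed deal is published on-chain
    HANDOFF = 4      # data is handed to the provider for sealing

class StorageDeal:
    """Toy model: a deal advances through the four phases in order."""
    def __init__(self) -> None:
        self.phase = DealPhase.DISCOVERY

    def advance(self) -> DealPhase:
        if self.phase == DealPhase.HANDOFF:
            raise ValueError("deal already handed off")
        self.phase = DealPhase(self.phase.value + 1)
        return self.phase

deal = StorageDeal()
deal.advance()  # NEGOTIATION
deal.advance()  # PUBLISHING
deal.advance()  # HANDOFF
print(deal.phase.name)  # HANDOFF
```

An order book or market maker would slot in at the Negotiation step, matching storage asks with bids before the deal is published.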
We see the light at the end of the tunnel. We see a combination of Substrate, IPFS and Filecoin as the best match for the mission of Starmesh.
Open Satellite Markets
What aligns with our motivation is open-by-default data from providers like ESA, NASA, and JAXA. That shift began in 2008 and steepened in 2012. As we noticed, the datasets themselves are not scarce; rather, processing them and extracting value from them creates the value-added assets that are scarce afterwards.
It is not enough to find objects in images. The real value comes from finding items in the world at a known point in time.
This is the part where Starmesh fits in. We've noticed that many GIS studios doing tailored case studies for customers follow essentially the same machine-learning workflow. We would like to make this process easier and scale it to more customers. By leveraging web3 properties (various types of monetization and incentives), this should go smoothly, without Starmesh owning any infrastructure.
Satellite market economics has changed
What changed the game was cheaper access to satellites, both in getting data from orbit and in getting into orbit. With Satellites as a Service (SataaS) (Loft Orbital Series A $13M), you don't need to be a billionaire or a government agency to launch your own space fleet; you can simply rent some of theirs. We see this as part of a more significant trend, as we will be shifting to off-world technologies. With numerous companies in orbit, the market becomes liquid with more supply than ever, and we see huge potential for extracting value from high-resolution data, literally by anyone.
Nowadays, images are mostly processed from archives, which is much cheaper than getting a realtime streamed response. But it comes at a cost: massive archives.
Archive storage can quickly exceed 1 PB, with 90% of this data unused. Public Earth Observation archives are generally growing by around 20 TB daily, with ESA's Sentinel-1 and Sentinel-2 combined adding 5 TB per day. Speaking of mirrors, there are plenty of national mirrors, but who will run them long-term if they are not sustainable?
In terms of protocols, the geospatial community needs to collaborate more and adopt today's web standards and practices. We see the massive effort behind WFS 3.0, and the protocol could see wide industry use soon. We are excited about this progress, and we sense benefits could emerge for machine learning as well.
Now switch to realtime demand. Could you submit a request and get data in a matter of hours or minutes? With options like price variance from the cheapest provider at an exact location? What if you could buy and sell slots per request? At some point, there will be a queue of demand for satellite imagery. This could be resolved by surging the price per slot or deploying more satellites; markets will decide which is more feasible in the long run.
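As a sketch of the surge idea, here is a hypothetical pricing rule (the function name, base price, and exponent are all made up for illustration) where the price per tasking slot rises once the request queue exceeds the constellation's capacity:

```python
def slot_price(base_price: float, queue_depth: int, capacity: int,
               surge_exponent: float = 2.0) -> float:
    """Toy surge-pricing rule: the price grows polynomially once demand
    outstrips tasking capacity. All parameters are illustrative."""
    if queue_depth <= capacity:
        return base_price
    utilization = queue_depth / capacity
    return base_price * utilization ** surge_exponent

# Idle constellation: base price holds.
print(slot_price(100.0, queue_depth=5, capacity=10))   # 100.0
# Queue at twice the capacity: price quadruples.
print(slot_price(100.0, queue_depth=20, capacity=10))  # 400.0
```

In a live market, the surge signal itself tells operators when deploying more satellites beats raising prices.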
Submitting an order and getting an answer.
Satellite revisit period: the time elapsed between successive observations of the same point on Earth by a satellite.
Right now, gathering realtime information about change depends on revisit time. I think it's not feasible for one company alone to run a satellite network with a high revisit rate. However, Capella Space plans to reach a 1-hour revisit time with 36 satellites in 2023. ICEYE is aiming at 18 satellites with a 3-hour average revisit time around the globe. Other providers will eventually show up in orbit as well.
But what if you could combine them? Could you get some sort of video of an Area of Interest? The best fit from our perspective could be Livepeer, which offers a template for crypto-incentivizing at the protocol level with probabilistic micropayments (credit goes to Chris Hobcroft).
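A back-of-the-envelope way to see the value of combining constellations: treat each provider's revisit time as a pass rate and sum the rates. This ignores orbit geometry entirely, so it is only an optimistic estimate, not a real mission-planning model.

```python
def combined_revisit_hours(revisit_times: list[float]) -> float:
    """Naive estimate: convert each constellation's revisit time into a
    pass rate (passes per hour), sum the rates, and invert. Ignores
    orbit geometry, so treat it as an optimistic lower bound."""
    total_rate = sum(1.0 / t for t in revisit_times)
    return 1.0 / total_rate

# Capella's planned 1 h revisit combined with ICEYE's ~3 h average:
print(combined_revisit_hours([1.0, 3.0]))  # 0.75
```

Even under this crude model, pooling the two constellations from the text would cut the effective revisit gap to about 45 minutes.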
Sub-meter resolution and cheap launches
With upcoming constellation launches from Capella Space, Umbra, ICEYE and many others, all of them aiming to offer sub-meter resolutions, down to 25 cm, we sense a window of opportunity: there will be plenty of supply, and we expect demand for extracting information from particular, specific places. It will be the cornerstone for high-value insights, e.g. for retail investors. Neural markets could emerge, focused on hunting at the edge of information. We are looking at a change where traders could gain an advantage.
What do we want to bring to the market?
With growing numbers of satellites in orbit, the cost of accessing data should go down. We would like to capture this and pass it on to end customers. At some point, the sharing of data assets could happen.
Neural geospatial markets
We've learned there is a great movement called mlhub.earth from Radiant.earth. We would like to create anyone-can-submit markets where you pay for processing on a node with the help of an off-chain worker. We think distributing ML algorithms and processing could make federated learning more accessible, with a plug-and-play approach. Impactful applications will emerge, and incentive markets for digging out valuable information from detected changes could start their race.
Across all the data archives and mirrors, traceability between copies is missing. With it, we could guarantee that a particular tile was not malformed and that you are 100% sure you are looking at one specific piece of data. Once you are secure on this point, you can build applications on top of it, like DeFi apps for Earth Observation data.
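A minimal sketch of that traceability guarantee: fingerprint each tile with a content hash, register the fingerprint once, and verify any mirror's copy against it. The registry itself is assumed here (e.g. an on-chain mapping); function names are illustrative.

```python
import hashlib

def tile_fingerprint(tile_bytes: bytes) -> str:
    """Content hash of a tile; a registry entry would pin this value."""
    return hashlib.sha256(tile_bytes).hexdigest()

def verify_tile(tile_bytes: bytes, registered: str) -> bool:
    """True only if a mirror's copy matches the registered fingerprint."""
    return tile_fingerprint(tile_bytes) == registered

original = b"\x00\x01raw-sentinel-tile-bytes"
registered = tile_fingerprint(original)

print(verify_tile(original, registered))              # True
print(verify_tile(original + b"tamper", registered))  # False
```

Any downstream application, DeFi or otherwise, only needs the fingerprint to know it is looking at the exact tile that was registered.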
Right now, there are not many such platforms powered by geospatial data that deliver critical insights for a particular industry, though a few exist that provide edge information for commodity traders. From there, you could grow data mining and build the next market predictions based on real-world data.
Markets with high-value insights
We often see that not everyone is granted access to high-value insights: someone has to pay for processing combined with industry-specific know-how. We would like to change this so that the underrepresented can benefit from these insights too.
Our favourite next move will be data rental. Imagine you can get data for your machine learning process for a fraction of the cost.
Where is the catch? Simply put, you book a request for a tile and share the data with someone else; afterwards, access to your requested data is no longer allowed. That's fine, because you don't need that data anymore.
This is where blockchain comes in: it can protect the ownership of shared data, provide accountability for item use, and maintain the integrity of the resource pool.
Incentives in community collaborations
Once you have a transparent and accountable layer, incentives can be set by anyone. That could spark a network effect of usage. For instance, someone could set a bounty to extract valuable information from the maritime transportation segment.
What we want to be, and why?
We've realized that training a model on geodata requires labelling in the first place. For labelling, you need to go through a lot of images to mark a tanker as a tanker and not as submerged rocks. Then you need to put the training pipeline in place.
Trained models are the alpha for a lot of remote sensing studios. Counting companies selling solar data, providing an information edge for oil traders, monitoring stockpiles, or offering property risk intelligence, they are opening the floodgates to a whole new level of business opportunities.
The popular revenue model in this industry is a subscription or usage fee, the most common model in web2.
We want to leverage web3 revenue primitives such as fee sharing, token-curated registries, decentralized autonomous organizations, and on-chain proposals.
We perceive Starmesh as something spanning Data-as-a-Service (DaaS), where we provide smooth, timely and protected access to raw or processed Earth Observation data (space-borne, air-borne, in-situ); Platform-as-a-Service (PaaS), where we provide users with the essential environment, including tools and software, to discover, visualize and process Earth Observation data; and Information-as-a-Service (IaaS), where the output is reports, maps or business intelligence extracted from the analysis of Earth Observation data.
Internet of value-added geospatial information.
Our primary mission is to be something like the Internet of value-added geospatial information.
From there, products could be built: aggregating satellite data with other information sources, being the place for making various premium products, and offering rich spatial data from multiple sources, all in one convenient location.
Labelling interfaces with incentives will create another self-sovereign market.
Training your machine learning models with off-chain workers and shared scripts, then distributing them to clients.
Sharing trained models, buying and trading trained models. Building analytical insights to gain a new information edge from data we already have but whose value-added assets are not yet extracted.
Distribution of datasets is crucial for building and extracting new value. This is where we want to create a paradigm shift from centralized data centres (AWS, GCP, Azure, DO) to distributed networks. We are starting by leveraging IPFS and Filecoin for all this. We plan to introduce an economic model with incentives so that storing datasets is beneficial for every participant.
Analytical outputs can be descriptive, i.e. describing the present situation, or predictive, i.e. forecasting the future state.
From there, we have a data layer for building various forecasts and predictions. Even prediction markets would find an impressive fit here.
We want to set a new game theory for extracting value from remotely sensed data. We want to become a next-generation Earth Observation meta-vehicle.
We are at the beginning of our adventure. We've just moved from the phase of ideation and research to setting up the project and starting to build. We've already started building our Proof of Concept with Substrate and Filecoin.
If you like what we want to build and want to stay updated, feel free to follow us at twitter.com/Starmesh1
This article was assembled early in Q1 2020. Since my first ideation back in October 2019, we've surfed through a lot of problems in the Earth Observation industry. We also applied to the ESA incubator, but we lacked a business model at the time. Of course, that is a somewhat legacy industry, and we want to aim for web3 waters, where we see an emerging space for us.
We've had reviews from venture capital and hedge funds; the collective verdict was that we are too early to market with this.
The question is, how would you detect the right timing?
I would like to thank anyone who participated in the project, namely Alexandra, Arseny, Denis, Hitesh, Martin & Viktor.
Meanwhile, we've pivoted to building an interface for neural markets dedicated to Earth Observation imagery.