Introducing Phase 2 — Managed dAPIs

Ugur Mersinlioglu
API3
Aug 16, 2023

After a long wait, we are excited to announce the launch of managed dAPIs — the second part of our four-phased dAPI rollout. This article dives into all of the nuances of managed dAPIs, so there is quite a lot to unpack. In case you don’t have the time for all of the details or simply want the gist of this announcement, here is a short TL;DR:

Managed dAPIs are now available across all mainnets on the API3 Market. Self-funded dAPIs can now be upgraded to on-chain aggregated, transparent, and verifiable managed dAPIs by making payments in the native currency of the chain that the respective dAPI operates on.

On-chain values are maintained by the API3 DAO with additional API providers acting as fallbacks, allowing developers to consume values directly without the need to worry about gas costs or running and maintaining any additional oracle-related infrastructure.

All dAPIs serve data directly from reputable API providers operating first-party oracle nodes that cryptographically sign their responses at the source level. This means that on-chain values can only be updated in a tamper-proof way with valid signatures from the source of the data, providing verifiable source level decentralisation.

With that out of the way, let’s dive into the technical breakdown of managed dAPIs.

What is a dAPI?

It has been quite a while since the release of the initial dAPIs article by Burak in early 2022, so let me start off with a little refresher on what a dAPI actually is. In essence, a dAPI is nothing more than a pointer: just as an ENS name points to a certain Ethereum address, a dAPI like ‘ETH/USD’ points to a certain data feed.

There are two types of data feeds that a dAPI can point towards — either a beacon or a beacon set. A beacon is a data feed served with certain parameters by a singular API provider running a first-party oracle through hosting an Airnode. The beacon is addressed by a Data Feed ID, calculated as the hash of the airnodeAddress and the request parameters (like ETH/USD), referred to as the templateID. This means that each asset pair, like ETH/USD, for each provider has a definitive Data Feed ID that it can be addressed by, and dAPIs can be pointed to specific providers if need be.
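The derivation above can be sketched in a few lines of Python. Note that the actual protocol derives beacon IDs on-chain with keccak256 over ABI-encoded values; the snippet below uses `hashlib.sha3_256` as a dependency-free stand-in purely to illustrate the determinism, and the addresses are hypothetical.

```python
import hashlib

def derive_beacon_id(airnode_address: str, template_id: str) -> str:
    """Illustrative stand-in: the real protocol uses
    keccak256(abi.encodePacked(airnode, templateId)); sha3_256 is
    used here only to show that the ID is a deterministic hash."""
    preimage = bytes.fromhex(airnode_address[2:]) + bytes.fromhex(template_id[2:])
    return "0x" + hashlib.sha3_256(preimage).hexdigest()

# Hypothetical values for illustration only
airnode = "0x" + "ab" * 20           # an Airnode's address
eth_usd_template = "0x" + "cd" * 32  # templateID encoding the ETH/USD request

beacon_id = derive_beacon_id(airnode, eth_usd_template)
```

The same (airnodeAddress, templateID) pair always hashes to the same Data Feed ID, which is why each provider's ETH/USD beacon is definitively addressable.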

This isn’t how hashing works, but a simple explainer.
ETH/USD dAPI pointing to Nodary

dAPIs can also point to aggregated data feeds, which are called beacon sets and are also addressed by a Data Feed ID. In contrast to singular beacons, the Data Feed ID of a beacon set is calculated as the hash of its underlying beacons’ IDs. That’s a bit of a mouthful, so let me try in simpler terms:

For instance, to build an ETH/USD beacon set, we can utilise the following beacons:

  • Nodary’s ETH/USD Beacon
  • TwelveData’s ETH/USD Beacon
  • NewChangeFX’s ETH/USD Beacon
  • dxFeed’s ETH/USD Beacon
  • Kaiko’s ETH/USD Beacon
  • Finage’s ETH/USD Beacon
  • Coinpaprika’s ETH/USD Beacon

The resulting Data Feed ID represents the aggregated median value of the ETH/USD data feeds. Importantly, this aggregated Data Feed ID can only be updated through the values from its underlying beacons. If we then point the ETH/USD dAPI towards this aggregated Data Feed ID, users will read the median value for ETH/USD as derived from all seven sources.
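The median aggregation can be sketched in a few lines. The price values below are hypothetical and only illustrate why a beacon set is robust to a single outlier:

```python
from statistics import median

# Hypothetical ETH/USD readings from the seven underlying beacons
beacon_values = {
    "Nodary": 1845.10,
    "TwelveData": 1844.95,
    "NewChangeFX": 1845.30,
    "dxFeed": 1846.02,
    "Kaiko": 1844.80,
    "Finage": 1845.55,
    "Coinpaprika": 1845.25,
}

# The beacon set's value is the median of its underlying beacons,
# so a single misreporting (or offline) provider cannot move the feed far.
aggregated = median(beacon_values.values())
```

With seven sources, the median is simply the fourth-highest reported value, so up to three providers would have to misreport in the same direction before the aggregate is meaningfully affected.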

ETH/USD dAPI pointing to a Beacon Set

The core benefit of having dAPIs is that adjustments to data feeds can be made without requiring any action from consumers. For instance, if one of the oracle nodes is giving severely wrong answers or is completely offline for a significant amount of time, a new beacon set without this specific provider can be created and the dAPI simply pointed towards the Data Feed ID of this new aggregation. Since users read the feed through the dAPI, they will automatically consume the new aggregation once the dAPI is set to the new Data Feed ID.
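This pointer indirection can be sketched with a simple mapping. The registry and feed IDs below are hypothetical stand-ins for the on-chain dAPI name mapping, not API3's actual contract interface:

```python
# Hypothetical Data Feed IDs for illustration
old_beacon_set = "0x" + "aa" * 32
new_beacon_set = "0x" + "bb" * 32  # re-aggregation without the faulty provider

# A dAPI is a human-readable name mapped to a Data Feed ID
dapi_registry = {"ETH/USD": old_beacon_set}

def read_dapi(name: str) -> str:
    # Consumers always resolve through the name, never a hard-coded feed ID
    return dapi_registry[name]

# If one provider misbehaves, the dAPI is repointed to the new aggregation...
dapi_registry["ETH/USD"] = new_beacon_set

# ...and every consumer picks it up automatically on their next read
current_feed = read_dapi("ETH/USD")
```

Because consumers hold the name rather than the feed ID, the swap is invisible to them — exactly the ENS-style indirection described above.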

Self-funded vs Managed dAPIs

Now that we’ve gone over what exactly a dAPI is, it’s time to dive deeper into the types of dAPIs. Back in March we launched self-funded dAPIs, which are dAPIs that point to a singular beacon. Anyone can bring these to life by depositing funds into a dAPI-specific sponsor wallet, and anyone using these data feeds is also solely responsible for keeping the wallet topped up so that sufficient funds are available for the dAPI to be continuously updated.

While this product makes it very easy for anyone to have seamless access to oracle services, it comes with two distinct disadvantages. First, self-funded dAPIs are pointing towards a singular beacon and are hence susceptible to potential downtime or misreporting of a single entity. Second, the gas management overhead lies with the consumer, which can sometimes be challenging as they might forget to keep dedicated wallets topped up, despite the availability of tools like Gelato Network and respective guides.

Managed dAPIs are designed to tackle these pain points. Instead of pointing towards a singular beacon, they point to a beacon set and are thus less susceptible to individual misreporting as well as downtimes. Additionally, the gas management overhead previously handled by the consumer is taken over by the API3 DAO as a managed service. To allow for this, a payment in the native currency of the chain that the dAPI is operated on is required to upgrade a self-funded dAPI into a managed one for a certain period of time.

How are values maintained on-chain?

In recent months there have been significant debates around oracle designs and some interesting discussions in a lot of governance forums (e.g. AAVE) — be it push vs. pull, bridge vs. no bridge, or first-party vs. third-party. While this article isn’t really the place to continue this discussion, it’s important to highlight what we categorise managed dAPIs as:

First-party oracles that transparently maintain data directly on-chain according to parameters like deviation and heartbeat by utilising tamper-proof signed data.

Each API provider is running multiple instances of Airnode, the API3-native first-party oracle node. Airnode comes with a signed HTTP gateway that allows anyone with access to receive tamper-proof data. Another tool, called Airseeker, fetches signed data from the HTTP gateways of the first-party oracles used for a specific data feed and compares these values with the current on-chain values. If certain conditions are met, such as a 1% deviation between the two values, an update is triggered.
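The update condition can be sketched as a simple predicate. This is an illustrative model of an Airseeker-style check, combining the deviation threshold mentioned above with a heartbeat interval (a common feed parameter); the function and parameter names are my own, not Airseeker's actual configuration keys:

```python
def needs_update(on_chain: float, off_chain: float,
                 last_update_ts: int, now_ts: int,
                 deviation_threshold: float = 0.01,
                 heartbeat_s: int = 24 * 60 * 60) -> bool:
    """Return True if the signed off-chain value should be pushed on-chain:
    either the value deviates >= 1% from the current on-chain value, or the
    heartbeat interval has elapsed since the last update."""
    if now_ts - last_update_ts >= heartbeat_s:
        return True  # heartbeat expired: refresh even without deviation
    deviation = abs(off_chain - on_chain) / abs(on_chain)
    return deviation >= deviation_threshold

# ~1.06% deviation within the heartbeat window -> update
hit = needs_update(on_chain=1800.0, off_chain=1819.0,
                   last_update_ts=0, now_ts=3600)
```

Gas is only spent when one of the conditions fires, which is what makes the per-chain gas cost of a managed dAPI predictable enough to price.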

Airseekers are run by multiple entities in order to maintain on-chain values according to the parameters purchased by consumers. At the first level, API3’s dAPIs team runs Airseekers as the primary method of updating data feeds. As a secondary, fallback layer, select API providers run Airseekers that trigger updates in case the primary method fails. Additionally, we’re in advanced talks with Gelato Network to introduce their Web3 Functions as a tertiary layer, which will further decentralise our utilised cloud and RPC infrastructure.

The benefit of this design is that consumers of API3’s decentralised data feeds and chains that want oracle services are not in any way required to run infrastructure for it. Multiple entities are ensuring that all consumers have to do is simply read data that is directly available on-chain.

Don’t trust, verify.

If you’ve been around in crypto long enough, this is probably an expression that you’ve heard before. The beauty of blockchains lies in their transparency — if you have the skills to navigate explorers and contracts, you don’t need to rely on others. Everything that you need to know is at your fingertips. Somehow this ethos is thrown out of the window when it comes to oracles, though. Be it upgradable proxy contracts, unknown and overly powerful multi-sigs, or opaque data sourcing, the most commonly adopted solutions nowadays are nothing more than what I’d define as “trust-traps”:

  1. You trust that contracts don’t get upgraded to do something malicious.
  2. You trust that multi-sigs don’t get abused or compromised.
  3. And most importantly, you trust that the data sources that are claimed to be used are actually used.

The last one is the one that baffles me the most. While upgradable contracts and overly dangerous multi-sigs are somewhat known, “accepted”, and “visible” on-chain, nobody can actually verify that oracle services are using the data sources they claim they are. If an oracle project tells you that they are using at least five data sources, how can you confirm this independently without trusting their word? This is where the blockchain industry suddenly turns from “Don’t trust, verify” into “Trust, don’t verify”.

The recent case of Multichain is a brilliant example of this, where claims were made that numerous aspects of the bridging service were decentralised, while it was effectively in the control of a single person. Many, including the likes of Andre Cronje, admitted that it was on them for not verifying, and not being able to verify, these claims of ‘decentralisation’. — Cointelegraph

https://forum.fantom.network/t/andre-cronje-infinite-ama/158/327

This incident and the numerous others in the past years have proven that nothing is too big to fail and that you can trust but should always be able to verify. That’s what blockchains are here for. This is why API3 is setting a new standard for how dApps access data on-chain. You can, at any given point and independently of API3, prove where the data you are consuming is coming from. You don’t need to blindly trust any claims we’re making about data origin, since all the information needed is provided to you on-chain as well as by the first-party oracles directly.

A dAPI always points to a beacon or a beacon set. As stated previously, these are maintained through first-party oracles, which means that the oracle is the data source itself. The first-party oracle is identifiable through their airnodeAddress. All that is needed to prove data origin is to confirm who this airnodeAddress belongs to. The API providers that API3 works together with prove ownership of their airnodeAddress by creating an entry in their DNS records with the details of their airnodeAddress.

Twelvedata Airnode Verification with DNS record from DigitalOcean

There is a guide available that runs you through the process of verifying an Airnode address as well as another guide that explains how to verify the sources used within a specific dAPI. With these things in place anyone can, at any given time, verify data sourcing, with the additional benefit of also being able to verify the decentralisation of data sourcing.

What do data feed services cost?

This is a topic that people know very little about and that rarely gets talked about. One of the reasons for this is that in the past there was nothing to talk about, since data feeds were heavily subsidised by oracle projects to gain adoption — but there has been a significant pivot in recent months. The costs of operating data feeds don’t simply disappear, and selling huge amounts of your own token to subsidise data feed services across the ever-growing landscape of L2 and alternative L1 networks just isn’t viable anymore. As such, you can now observe that chains desiring data feed services, or dApps that want to deploy on certain chains, are being asked to pay up.

Oracle projects are approaching this challenge in various ways, which could be debated at length, but let’s talk about what API3 is doing with managed dAPIs. If you’ve been following API3 you know that we’ve been heavily invested in Oracle Extractable Value (OEV) and that we’re building towards a solution that will allow dApps to tap into the value they are currently leaking to more strategic third-parties due to current oracle designs. This service is called OEV-Share and is going to be the method of monetisation for API3, which we will explore in future articles. What this means for chains and dApps wanting data feed services from us is that we do not plan on making any money through our data feed offerings.

However, this doesn’t necessarily mean that managed dAPIs are free. One unavoidable expense is the gas costs associated with putting data on-chain. Our goal with managed dAPIs is to charge as much as is required to operate the dAPIs at the desired specifications, or in other words, we simply want to break even. To accomplish this, we’re carefully tracking and analyzing gas consumption and tailoring our approach for each individual chain and dAPI.

The result is a managed service that allows anyone to upgrade a dAPI for a specific duration by making payments to API3 in the native gas token of the desired chain. How much we are charging depends on the chain as well as the dAPI itself, but in general takes into account the subscription length, the typical asset volatility, and the gas price volatility that API3 takes on for the time of the subscription.

To minimize risk and also charge consumers as little as possible (since the goal is to break even), prices are recalculated on a bi-weekly basis to reflect asset and chain conditions in near real time. Subscription times are also fixed and differ on a chain-by-chain basis depending on how risky the chain is considered to be for API3 to operate on. For instance, on chains like Polygon we’re comfortable offering subscription times of 6 months, whereas on Polygon’s zkEVM it will only be possible for 3 months at a time.

One important factor to consider in all of this is that we only charge once per dAPI for the specific subscription time. Our contracts do not have any access control associated with them, which means that once an upgrade to a managed dAPI is completed and data is maintained on-chain, anyone can read it for free. This approach makes it possible for chains to take over data feed costs and offer all developers building on their chain access to free data feed services in order to promote the growth and development of their DeFi ecosystem. It also allows dApps to come together and share costs for specific data feeds they require.

How can you use managed dAPIs?

All of the dAPIs that API3 offers can be found on the API3 Market. As a base state, every dAPI is available as a self-funded feed but is upgradable to a managed data feed directly on the API3 Market. To accomplish this, you simply go to the dAPI that you want, select your desired specifications, and add it to your cart. You can continue adding as many dAPIs on the same network as you like, but note that we don’t allow mixing dAPIs from different networks in the same order.

Once you’re happy with your cart, you can proceed to payment, which will create an order. Orders are visible to everyone in the Orders tab and are also payable by anyone: you can pay the order yourself or send a link to anyone responsible for payment. Once paid, API3 will upgrade the dAPI within 5 business days. If you’ve been consuming the respective dAPI in its self-funded state, you will automatically benefit from the upgrade without any required action or associated costs.

We’ve created a tutorial that runs through the entire upgrade process:

Once your desired dAPI is available in a managed state, you consume it in the same way that self-funded dAPIs are consumed. We recommend reading dAPIs through proxy contracts, which you can deploy directly on the API3 Market. There is a guide available in our docs that walks you through all of the aspects of reading dAPIs. Proxies are deployed per dAPI per chain, which means that if there is already a proxy deployed, there is no need to deploy another. The API3 Market will display whether a proxy is already available for your specific dAPI.
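Reading through a proxy boils down to fetching the latest value and its timestamp, and sanity-checking freshness before use. The sketch below mocks the proxy's read interface in plain Python rather than making an on-chain call; the class, the fixed-point value, and the staleness window are all illustrative assumptions, not API3's actual contract code:

```python
from dataclasses import dataclass

@dataclass
class MockDapiProxy:
    """In-memory stand-in for a dAPI proxy contract's read interface,
    which returns the latest value and the timestamp it was set at."""
    value: int       # price as a fixed-point integer (18 decimals on-chain)
    timestamp: int   # unix time of the last feed update

    def read(self) -> tuple[int, int]:
        return self.value, self.timestamp

def read_fresh(proxy: MockDapiProxy, now: int, max_age_s: int = 3600) -> int:
    value, updated_at = proxy.read()
    # Defensive check: reject values older than the accepted staleness window
    if now - updated_at > max_age_s:
        raise ValueError("stale dAPI value")
    return value

proxy = MockDapiProxy(value=1845_250000000000000000, timestamp=1_700_000_000)
price = read_fresh(proxy, now=1_700_000_600)  # read 10 minutes later: fresh
```

Checking the returned timestamp against an application-specific staleness bound is a common defensive pattern for any push-style feed, managed dAPIs included.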

Proxy Contract Addresses as shown on the API3 Market

Once in a managed state, a dAPI is also displayed differently on the API3 Market. You will be able to confirm its state, see all providers powering the feed and the parameters for each provider, check the expiry date, and view a graph of historic values.

Managed dAPI on the API3 Market

Efficiency, Transparency, and Verifiability.

Managed dAPIs give chains and developers the ability to make use of data feed services that are efficient, transparent, and verifiable. This is mainly enabled through our first-party architecture, which allows us to cut out rent-seeking middlemen and deliver the same experience that developers are used to in a much cheaper and more transparent way. Our architectural choices also enable the verifiability of data sourcing, which ensures that service consumption is based not on paper promises and trust but rather on hard, provable facts. This verifiability also gives developers, chains, and users the ability to ensure that the service they are consuming is truly decentralised and doesn’t only claim to be.

Managed dAPIs are currently available across 11 mainnets, and if a specific dAPI you desire isn’t available yet, you can simply request it. Chains that want our managed dAPIs can reach out to us at any time over our Discord. While we’ve already got quite the pipeline of upcoming chain deployments lined up, we’ll make sure to include you as soon as we can.

For anyone wanting to follow the progress of managed dAPIs, all orders for them can be tracked on the Orders tab of the API3 Market or the dAPI monitoring page. For more information about API3 and managed dAPIs, visit our website, the API3 Market, follow us on Twitter, or explore our technical documentation.
