Q1 Airbloc Technology Development Update

Easier Integration with Enterprise Data Systems and Improved On-Chain Scalability of the Data Exchange Process.

Development updates can be dry so here’s what you need to know about Q1’s tech update in 30 seconds!

  1. Airbloc’s data handling processes have been revamped to significantly improve the scalability of data registration and data exchange, powered by Merkle User Dataset (MUD) technology. Cheers for data registration and exchange scalability!
  2. Airbloc’s revised data architecture and back-end data pipelines can now be integrated with enterprise data systems to facilitate enterprise-to-enterprise data exchange at scale. Cheers for ease-of-adoption!
  3. We have redesigned the way we collect and categorize data so that enterprises can register more granular types of user data. Airbloc now groups incoming data under specific user profiles in 3 broad categories (Profile Data, User Events Data, and Identity Data), rather than as randomized, uncategorized data types. More data types = more Airbloc use-cases.
NOTE: The term “Enterprises” refers to companies possessing personal data who are potential data providers or consumers on Airbloc’s consent-based, real-time data exchange platform.

Curious about how we did it? Read on to find out more!


ENABLING enterprise data registration at S-C-A-L-E with Merkle User Dataset Architecture implementation

The process of registering data on the blockchain using Merkle User Dataset

To drastically improve data registration efficiency, we developed Merkle User Dataset (MUD) technology. MUD adopts a Sparse Merkle Tree (SMT) and a Nested Merkle Tree structure, which enables enterprises to register their data by uploading only the Merkle root to the blockchain.

  • Our previous data structure required enterprises to upload the entire list of data onto the blockchain, which would have exerted tremendous stress on the chain. MUD instead allows data providers to bundle potentially millions of user data records into 1 SINGLE byte-sized Merkle root.
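To see why one root is enough, here is a minimal sketch of the core idea: hashing a list of records pairwise until a single 32-byte commitment remains. This is an illustration only, not Airbloc’s actual MUD implementation (which uses Sparse and Nested Merkle Trees rather than the plain binary tree shown here).

```python
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(records: list[bytes]) -> bytes:
    """Reduce a list of data records to a single 32-byte Merkle root."""
    if not records:
        return _h(b"")
    level = [_h(r) for r in records]          # leaf hashes
    while len(level) > 1:
        if len(level) % 2:                    # duplicate last node on odd levels
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1])  # hash siblings pairwise
                 for i in range(0, len(level), 2)]
    return level[0]

# A large batch of user records collapses into one 32-byte commitment;
# only this root needs to go on-chain.
records = [f"user-{i}:attributes".encode() for i in range(1_000)]
root = merkle_root(records)
print(len(root))  # 32
```

Whatever the dataset size, the on-chain footprint stays constant, and individual records can later be proven against the root with a short Merkle proof.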

EXPANDING the types of data available for exchange through Data Architecture Refactoring

  • We are building an entirely new data architecture — one that is user-profile-driven rather than randomized and uncategorized. The new architecture allows Airbloc to group incoming data under specific user profiles in 3 broad categories: Profile Data, User Events Data, and Identity Data.
  • The new architecture also allows individuals to control their data usage at a more granular level, and lets companies manage and exchange a wider variety of datasets.

SIGNIFICANTLY improved data registration performance and speed by 40x

  • Airbloc’s data collection pipeline now relies on Parallel Data Processing rather than Single-Threaded Processing. In Q1 we privately tested the new data collection pipeline with our enterprise partners, and found that data registration speed improved drastically, by 4,000% (40x).
  • You can find more information on our Github.
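The shift from single-threaded to parallel processing can be sketched as follows. This is a generic illustration using Python’s standard thread pool, not Airbloc’s pipeline code; `register_record` stands in for the per-record I/O-bound registration work, which is where parallelism pays off.

```python
from concurrent.futures import ThreadPoolExecutor

def register_record(record: str) -> str:
    # Stand-in for per-record validation + registration (I/O-bound in practice).
    return record.upper()

def register_serial(records: list[str]) -> list[str]:
    """Old model: process records one at a time."""
    return [register_record(r) for r in records]

def register_parallel(records: list[str], workers: int = 8) -> list[str]:
    """New model: fan records out across a worker pool, preserving order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(register_record, records))
```

Both paths produce identical results; the parallel one simply overlaps the waiting time of many registration calls.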

ENABLING real-time dynamic data exchange for enterprises with Data Exchange API implementation

  • We have introduced API data exchange as an additional method of data exchange. Previously, Airbloc only permitted the exchange of static user attributes or datasets; the new feature allows dynamic, real-time, up-to-date data to be exchanged. We chose this approach because enterprises more commonly exchange data via APIs, which makes it easier for Airbloc to interoperate with existing enterprise data infrastructure.
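The difference between the two exchange models can be sketched in a few lines. The function names and record fields below are our own assumptions for illustration, not Airbloc’s actual Data Exchange API.

```python
def static_exchange(dataset: list[dict]) -> list[dict]:
    """Old model: ship a fixed snapshot of the whole dataset."""
    return list(dataset)

def api_exchange(dataset: list[dict], since_ts: int) -> list[dict]:
    """New model: the consumer calls an API with a cursor and receives
    only records newer than it, keeping its copy up to date in real time."""
    return [r for r in dataset if r["ts"] > since_ts]

events = [{"ts": 1, "type": "install"}, {"ts": 5, "type": "purchase"}]
fresh = api_exchange(events, since_ts=3)   # only the ts=5 record
```

A snapshot goes stale the moment it is delivered; a cursor-based API call always reflects the provider’s latest data.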

ALLOWING Enterprises to Customize Their Data Purchase Request with Data Discovery API Implementation

  • We have integrated a Data Discovery API in Airbloc’s exchange platform to allow enterprises to search, build, and customize the data they wish to purchase, and receive a list of curated data available for purchase from a particular user segment.

ENSURING Consent-Based Data Exchange through Data Controller Node Implementation

  • We have introduced a new Data Controller Node to monitor data exchange and ensure it can only occur when individual users have given express consent. The node also enforces mechanisms that penalize actors who register or exchange data without proper user consent.
Data controller node conducting a verification to ensure that data registered on Airbloc has been expressly consented to by users.
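The consent gate the node enforces boils down to a simple check: no matching consent record, no exchange. The sketch below is a simplified illustration of that rule (an in-memory consent set standing in for the node’s on-chain verification), not the actual node implementation.

```python
# A consent record: this user approved this consumer for this data type.
Consent = tuple[str, str, str]  # (user_id, consumer_id, data_type)

def verify_consent(consents: set[Consent], user_id: str,
                   consumer_id: str, data_type: str) -> bool:
    """True only if the user has expressly approved this exchange."""
    return (user_id, consumer_id, data_type) in consents

def exchange(consents: set[Consent], user_id: str, consumer_id: str,
             data_type: str, payload: dict) -> dict:
    """Release data only past the consent gate; reject everything else."""
    if not verify_consent(consents, user_id, consumer_id, data_type):
        raise PermissionError("no user consent on record; exchange rejected")
    return payload

consents: set[Consent] = {("u1", "advertiserA", "user-events")}
```

In the real system the rejection path also feeds the penalty mechanisms for actors that attempt non-consented registration or exchange.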