Raven Protocol Partners with Ocean Protocol to Bring Federated Learning to Ocean Market via Compute-to-Data

Raven Protocol · May 31, 2021

Raven Protocol is expanding its use cases, and we are honored to announce that we are partnering with Ocean Protocol! We will become a Compute Provider in Ocean Compute-to-Data and will publish a range of algorithms on Ocean Market, from machine learning to federated analytics and, of course, the holy grail: Federated Learning.

Why Compute-to-Data?

One important question that our community often asks is, “How do you handle privacy and security of the training data?” Although we are working on various encryption techniques, the topic is so important that we’ve even had developers volunteer to contribute to the privacy/security of the training protocol.

But what if a data provider already handled the privacy of the training data? At Raven, we have been following Ocean Compute-to-Data closely since it launched last year. Compute-to-Data resolves the tradeoff between the benefits of using private data and the risks of exposing it. It lets the data stay on-premise while allowing third parties to run specific compute jobs on it and receive useful results, such as an average or a trained AI model.

A community member in our Raven Developer Community Discord has great instincts :)

Ocean Compute-to-Data works as follows.

  • First, Data Providers approve AI algorithms to run on their data. These algorithms could be published by Raven Protocol or other third parties.
  • Then, Compute-to-Data orchestrates remote computation and execution on data to train AI models. The remote computation could be handled by the Raven Network or other Compute Providers approved by the Data Provider.
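The two steps above can be sketched in code. This is not Ocean's actual API; it is a minimal Python sketch with hypothetical names (`DataProvider`, `approve`, `run_job`) that illustrates the core invariant: raw data never leaves the provider, and only pre-approved algorithms run by pre-approved compute providers may produce results from it.

```python
from dataclasses import dataclass, field

@dataclass
class DataProvider:
    """Keeps a dataset on-premise and whitelists algorithms and compute providers."""
    dataset: list
    approved_algorithms: set = field(default_factory=set)
    approved_compute: set = field(default_factory=set)

    def approve(self, algorithm_id: str, compute_id: str) -> None:
        # Step 1: the Data Provider approves an algorithm and a compute provider.
        self.approved_algorithms.add(algorithm_id)
        self.approved_compute.add(compute_id)

    def run_job(self, algorithm_id: str, compute_id: str, algorithm) -> float:
        # Step 2: remote computation runs next to the data; only the
        # result leaves the premises, never the dataset itself.
        if algorithm_id not in self.approved_algorithms:
            raise PermissionError(f"algorithm {algorithm_id!r} not approved")
        if compute_id not in self.approved_compute:
            raise PermissionError(f"compute provider {compute_id!r} not approved")
        return algorithm(self.dataset)

provider = DataProvider(dataset=[2.0, 4.0, 6.0])
provider.approve("mean-v1", "raven-network")
result = provider.run_job("mean-v1", "raven-network", lambda d: sum(d) / len(d))
print(result)  # 4.0 -- an aggregate result, not the data
```

An unapproved algorithm or compute provider is simply rejected, which is what lets the data owner keep control while still monetizing compute on the data.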

At Raven, we believe that distributed computing is the future of computing. Our framework, the Raven Distribution Framework (RDF), is a distributed and decentralized compute engine consisting of libraries such as RavOp, RavML, and RavDL. As we inch toward our goal, our partnership with Ocean will be a powerful collaboration providing value to our contributors, Ocean's data providers, and clients. We are super excited to be working with Ocean's team.

— Kailash Ahirwar, Co-founder of Raven Protocol and author of Generative Adversarial Networks Projects: Build next-generation generative models using TensorFlow and Keras

Raven Protocol Enhances Ocean Compute-to-Data

In Ocean, a Compute-to-Data infrastructure is set up as a Kubernetes (K8s) cluster, e.g. on AWS or Azure, running in the background. This cluster runs the actual compute jobs, out of sight of marketplace clients and end users. While this is an impressive feat in itself, users and Data Providers may want alternatives among the Compute Providers they choose to approve. The spirit of decentralization may be a philosophical choice for some, but a strict requirement for others. Raven Protocol provides the decentralized option when choosing a Compute Provider.

On top of that, Raven provides an additional layer of privacy for Ocean Compute-to-Data. We mentioned that we will be publishing Federated Learning algorithms. In Federated Learning, a neural network is randomly initialized; weight updates are computed next to the data itself, inside a data silo, and only those updates are sent back to the shared model. This is repeated across data silo #1, data silo #2, data silo #3, and so on, so the network gets trained across many silos without data ever leaving the premises of each one. The Raven Distribution Framework enables this in Compute-to-Data.
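The training loop described above can be illustrated with a toy federated-averaging round. This is a hedged sketch, not RDF's actual implementation: it uses a single-parameter model `y = w * x` and hypothetical helper names (`local_update`, `federated_round`), but it shows the key property that each silo's raw data stays local and only updated weights are shared.

```python
def local_update(w, silo_data, lr=0.1):
    """One gradient-descent pass for the toy model y = w * x on a silo's data.
    The data never leaves this function; only the updated weight is returned."""
    for x, y in silo_data:
        grad = 2 * (w * x - y) * x  # derivative of squared error (w*x - y)**2
        w -= lr * grad
    return w

def federated_round(global_w, silos):
    """Each silo computes an update next to its own data; the coordinator
    averages the returned weights (federated-averaging style)."""
    updates = [local_update(global_w, silo) for silo in silos]
    return sum(updates) / len(updates)

# Three data silos, each holding private points drawn from y = 3x.
# The datasets are never pooled in one place.
silos = [[(1.0, 3.0)], [(2.0, 6.0)], [(0.5, 1.5)]]

w = 0.0  # the shared model would be randomly initialized in practice
for _ in range(50):
    w = federated_round(w, silos)
print(round(w, 2))  # converges to 3.0
```

Each round only moves weights between the coordinator and the silos, which is exactly the privacy property that makes Federated Learning a fit for Compute-to-Data.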

AI is at the heart of what Ocean Protocol does, so we’re very excited by this partnership. Ocean’s mission is to democratize access to data sets and AI capabilities; this aligns perfectly with Raven’s mission to provide cost-efficient and faster training of deep neural networks using a decentralized and distributed network of compute nodes. Together we are one step closer to unlocking the Open Data Economy.

— Razvan Olteanu, Chief Operating Officer at Ocean Protocol

About Ocean Protocol

Ocean Protocol’s mission is to kickstart a Web3 Data Economy that reaches the world, giving power back to data owners and enabling people to capture value from data to better our world.

Data is a new asset class; Ocean Protocol unlocks its value. Data owners and consumers use the Ocean Market app to publish, discover, and consume data assets in a secure, privacy-preserving fashion.

Ocean datatokens turn data into data assets. This enables data wallets, data exchanges, and data co-ops by leveraging crypto wallets, exchanges, and other DeFi tools. Projects use Ocean libraries and OCEAN in their own apps to help drive the Web3 Data Economy.

The OCEAN token is used to stake on data, to govern Ocean Protocol's community funding, and to buy and sell data. Its supply is disbursed over time to drive near-term growth and long-term sustainability, and demand for OCEAN is designed to increase as usage volume grows.

Visit oceanprotocol.com to find out more.

Twitter | LinkedIn | Blockfolio | Blog | YouTube | Reddit | Telegram | Discord

About Raven Protocol

Raven Protocol is developing a decentralized and distributed network of compute nodes for Artificial Intelligence and Machine Learning. Its goal is to provide cost-efficient and faster training of deep neural networks.

Raven’s self-sustaining, dynamic ecosystem serves customers who want to train their AI models and contributors who share their compute resources, whether those are computers, smartphones, or even a server rack.

Raven Tokens (RAVEN) serve as the common medium for secure transactions inside the ecosystem: customers rent compute power with RAVEN, and contributors of compute power are rewarded in RAVEN.

Visit ravenprotocol.com to find out more.

Twitter | LinkedIn | Blog | Substack | Reddit | Telegram | Discord | Github
