Building Blocks of the Raven Distribution Framework on Github

We are excited to announce the public beta release! The Raven Distribution Framework is open-sourced on Github. All developers are welcome to run the code, fork it, watch it, star it, and best of all, contribute to the project!

What we have built and open-sourced so far is the foundation for any Machine Learning or Deep Learning framework. Simply put, it is a decentralized calculator, comparable to a decentralized version of the IBM machines that were used to launch the Apollo astronauts. Beyond building ML/DL frameworks, much more can be done on it, such as maximizing yield on your favorite DeFi protocols like Compound!

The distribution framework itself is what addresses our core use case: the speed of AI training. We are excited to see what others will create on top of our building blocks. It could be bigger than AI/ML algorithms :)

We worked closely with our private beta customers on customized solutions. Along the way we made a key discovery: users wanted to start developing and using their own algorithms. With this open-source public beta release, we welcome contributors to continue developing algorithms on the framework.

Getting Started

Pull down the latest code in our Github:
https://github.com/ravenprotocol/raven-distribution-framework

Create a Docker image, create a container, and start the server using setup.sh in ./raven-distribution-framework.
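The setup steps above can be sketched as a short shell session. The repository URL is from this post; what setup.sh does internally is described in the text, and the comments below summarize that rather than its exact commands:

```shell
# Clone the repository
git clone https://github.com/ravenprotocol/raven-distribution-framework
cd raven-distribution-framework

# setup.sh builds the Docker image, creates a container,
# and starts the Raven Server inside it
./setup.sh
```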

Start up a Raven Client (node), which performs units of computation, using index.html.

Your development environment now has a Raven Server and a Raven Client up and running.

Operators supported are:

1. Matrix Multiplication
2. Addition
3. Subtraction
4. Element-wise-multiplication
5. Division
6. Linear
7. Negation
8. Exponential
9. Transpose
10. Natural Log
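To build intuition for what a Raven Client computes, here are plain-Python reference versions of a few of these operators. These are illustrative sketches, not the framework's actual implementations:

```python
import math

def add(a, b):
    # 2. Addition of two scalars
    return a + b

def elemwise_mul(x, y):
    # 4. Element-wise multiplication of two matrices
    return [[a * b for a, b in zip(r1, r2)] for r1, r2 in zip(x, y)]

def matmul(x, y):
    # 1. Matrix multiplication: dot product of each row of x
    # with each column of y
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*y)]
            for row in x]

def transpose(x):
    # 9. Transpose: swap rows and columns
    return [list(col) for col in zip(*x)]

def natural_log(v):
    # 10. Natural log of a scalar
    return math.log(v)
```

Each operator takes plain values in and returns plain values out, which is exactly the shape of work that can be shipped to a node and returned to the server.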

Testing out the Computations

View our demo app, run_app.py, which runs simple backend algorithms. It connects to the Raven Server, which distributes the computations to Raven Clients.

Let’s start with a simple addition operation.

The values 3 and 5 have been sent to the Raven Client (node) for the addition computation.
A peek into the developer console shows that the Raven Client is connected to the Raven Server. It has received the values 3 and 5 and runs the addition computation on this node. The result of the computation is sent back to the server.

Now we can verify the results of the computation.

Demo app receives the result of the computation from the Raven client (node).

Let’s try a more complex operation, Matrix Multiplication, to show off the framework's power.

The matrices [[2,4,5], [5,6,7]] and [[7,3,4,6],[4,5,6,2],[5,4,6,7]] have been sent to the Raven Client (node) for the matrix multiplication compute.

You can see here that the Raven Client has received [[2,4,5], [5,6,7]] and [[7,3,4,6],[4,5,6,2],[5,4,6,7]]. The client then starts to perform the Matrix Multiplication operation.

Javascript on the client side successfully computes the matrix multiplication and sends it back to the server.

Now you can verify that the results of the matrix multiplication are correct.
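You can check the client's answer yourself with a few lines of plain Python, using the same matrices from the demo. This is a local verification sketch, independent of the framework:

```python
a = [[2, 4, 5], [5, 6, 7]]
b = [[7, 3, 4, 6], [4, 5, 6, 2], [5, 4, 6, 7]]

# Standard row-by-column matrix multiplication
result = [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
          for row in a]

print(result)  # should match what the Raven Client returned
```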

Boom, matrix multiplication forms the basis of neural networks!

Imagining the Future

Most operations while training a neural network require some form of matrix multiplication. The Raven Server you got up and running just distributed a matrix multiplication operation to the Raven Client (node) along with values to compute. That independent node successfully did matrix multiplication and sent the result of the computation back to the server/demo app. Ladies and gentlemen, welcome to decentralized and distributed deep learning training!
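To see why this matters, here is a minimal sketch of a single dense neural-network layer: a forward pass is just the matrix multiplication shown above, plus a bias and an activation. This is a toy illustration, not Raven's actual training code:

```python
import math

def matmul(x, w):
    # Row-by-column matrix multiplication
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*w)]
            for row in x]

def layer_forward(inputs, weights, biases):
    # One dense layer: sigmoid(inputs @ weights + biases)
    z = matmul(inputs, weights)
    return [[1 / (1 + math.exp(-(v + b))) for v, b in zip(row, biases)]
            for row in z]

# One sample with two features, fed through a layer with three units
out = layer_forward([[0.5, -1.0]],
                    [[0.1, 0.2, 0.3],
                     [0.4, 0.5, 0.6]],
                    [0.0, 0.1, 0.2])
```

The matmul inside layer_forward is exactly the kind of unit of work a Raven Server could distribute to a client node.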

We are at the early stages of building a network of compute nodes for AI/ML training, where speed is key. It has never been done before. With today’s public beta open-sourced, imagine all the possibilities this can unlock.

Raven Protocol: Q2 2019 Tech and Community Update:
https://medium.com/ravenprotocol/q2-2019-tech-and-community-update-4f836a9a1e97

Raven Protocol: Q3 2019 Tech project development Update:
https://medium.com/ravenprotocol/tldr-raven-stayed-heads-down-building-in-q3-2019-ae5f242dc15d

Raven Protocol: Q4 2019 Tech project development Update:
https://medium.com/ravenprotocol/happy-2020-an-important-update-from-the-raven-team-67b5e88e1b1d

Raven Protocol: Q1 2020 Tech project development Update:
https://medium.com/ravenprotocol/q1-2020-success-during-hard-times-3dcdbb0faba8

Raven Protocol Project Review:
https://cryptocalibur.com/portfolio-item/raven-protocol-review

Raven Protocol White Paper:
https://drive.google.com/file/d/1FAaVKkg_CjxMj-n1yHZc6ufcVDtOU1Ct/view?usp=sharing

OFFICIAL CHANNELS:
Official Email Address: founders@ravenprotocol.com
Official Website Link: http://www.RavenProtocol.com
Official Announcement Channel: https://t.me/raven_announcements
Official Telegram Group: https://t.me/ravenprotocol
Official Twitter: https://twitter.com/raven_protocol
Official Medium: https://medium.com/ravenprotocol
Official LinkedIn: https://linkedin.com/company/ravenprotocol
Official Github: https://www.github.com/ravenprotocol

Raven Protocol

www.RavenProtocol.com is a decentralized and distributed deep-learning training protocol, providing cost-efficient and faster training of deep neural networks.