Building Blocks of the Raven Distribution Framework on Github
We are excited to announce the public beta release! The Raven Distribution Framework is open-sourced on Github. All developers are welcome to run the code, fork it, watch it, star it, and best of all contribute to the project!
What we have built and open-sourced so far is the foundation for any Machine Learning or Deep Learning framework. Simply put, it is a decentralized calculator, comparable to a decentralized version of the IBM machines that were used to launch the Apollo astronauts. Beyond building ML/DL frameworks, much more can be done with it, such as maximizing yield on your favorite DeFi protocols like Compound!
The distribution framework itself is what allows us to solve our core use case: the speed of AI training. We are excited to see what others will create on top of our building blocks. It could be bigger than AI/ML algorithms :)
We worked closely with our private beta customers on customized solutions. Along the way we made a key discovery: users wanted to start developing and using their own algorithms. With this open-source public beta release, we welcome contributors to continue developing algorithms on the framework.
Getting Started
Clone the open source repository
Pull down the latest code in our Github:
https://github.com/ravenprotocol/raven-distribution-framework
~/dev> git clone git@github.com:ravenprotocol/raven-distribution-framework.git
Starting the Raven Server
Build a Docker image, create a container, and start the server using setup.sh in ./raven-distribution-framework.
~/dev/raven-distribution-framework> bash setup.sh
Starting the Raven Client
Start up a Raven Client (node), which performs units of computation, using index.html.
~/dev/raven-distribution-framework> open index.html
Your development environment now has a Raven Server and a Raven Client up and running.
Operators supported are:
1. Matrix Multiplication
2. Addition
3. Subtraction
4. Element-wise Multiplication
5. Division
6. Linear
7. Negation
8. Exponential
9. Transpose
10. Natural Log
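To give a feel for what these operators do, here is a minimal sketch of how a client could map each one onto a NumPy function. The dispatch-table structure and all names below are hypothetical illustrations, not the actual Raven Client API:

```python
import numpy as np

# Hypothetical dispatch table: operator name -> NumPy implementation.
# The real Raven framework's operator names and interfaces may differ.
OPERATORS = {
    "matmul": np.matmul,                             # 1. Matrix Multiplication
    "addition": np.add,                              # 2. Addition
    "subtraction": np.subtract,                      # 3. Subtraction
    "element_wise_multiplication": np.multiply,      # 4. Element-wise Multiplication
    "division": np.divide,                           # 5. Division
    "linear": lambda X, W, b: np.matmul(X, W) + b,   # 6. Linear
    "negation": np.negative,                         # 7. Negation
    "exponential": np.exp,                           # 8. Exponential
    "transpose": np.transpose,                       # 9. Transpose
    "natural_log": np.log,                           # 10. Natural Log
}

def compute(op_name, *operands):
    """Convert operands to arrays and run the named operator."""
    arrays = [np.asarray(x) for x in operands]
    return OPERATORS[op_name](*arrays)
```

For example, `compute("addition", 3, 5)` returns 8, and `compute("matmul", A, B)` multiplies two matrices, exactly the two operations walked through below.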
Testing out the Computations
Try our demo app, run_app.py, which ships with simple backend algorithms. It connects to the Raven Server, which distributes the computations to Raven Clients.
~/dev/raven-distribution-framework> python3 run_app.py
Let’s start with a simple addition operation.
Select 2. Addition Operator, enter two values to add, and watch the computation travel through the whole framework.
The Raven Client receives the addition operation and the values to compute.
The Raven Server receives the result of the computation and propagates it back to the demo app. The result of 3 + 5 is… 8!
Now we can verify the results of the computation.
Result: 8
Let’s try a more complex operation, Matrix Multiplication, to show off the framework’s power.
Select Operator #1 Matrix Multiplication
~/dev/raven-distribution-framework> python3 run_app.py
...
Enter operator number here: 1
Enter the first value(Scalar, Array or Matrix): [[2,4,5], [5,6,7]]
Enter the second value(Scalar, Array or Matrix): [[7,3,4,6],[4,5,6,2],[5,4,6,7]]
Computing...
Confirm that the Raven Client has received the Matrix Multiplication operation with the correct matrices to compute.
You can see here that the Raven Client has received [[2,4,5], [5,6,7]] and [[7,3,4,6],[4,5,6,2],[5,4,6,7]]. The client then starts to perform the Matrix Multiplication operation.
The Raven Server receives the result of the computation and propagates it back to the demo app.
Now you can verify that the results of the matrix multiplication are correct.
Result: [[55, 46, 62, 55], [94, 73, 98, 91]]
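You can double-check the framework’s answer locally with NumPy:

```python
import numpy as np

# The same two matrices entered into the demo app.
A = np.array([[2, 4, 5], [5, 6, 7]])                       # 2x3
B = np.array([[7, 3, 4, 6], [4, 5, 6, 2], [5, 4, 6, 7]])   # 3x4

# Plain local matrix multiplication, for comparison with the
# result returned through the Raven framework.
result = A @ B
print(result.tolist())  # [[55, 46, 62, 55], [94, 73, 98, 91]]
```

The local result matches the one the Raven Client computed and the Raven Server propagated back.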
Imagining the Future
Most operations in training a neural network reduce to some form of matrix multiplication. The Raven Server you got up and running just distributed a matrix multiplication operation to the Raven Client (node) along with values to compute. That independent node successfully performed the matrix multiplication and sent the result of the computation back to the server/demo app. Ladies and gentlemen, welcome to decentralized and distributed deep learning training!
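To see why matrix multiplication is so central, here is a framework-independent sketch of a single neural-network layer’s forward pass. The shapes and names are chosen purely for illustration; the matrix multiplication on the `hidden` line is exactly the kind of operation a Raven Client just computed:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((32, 128))   # a batch of 32 inputs, 128 features each
W = rng.standard_normal((128, 64))   # layer weights
b = np.zeros(64)                     # layer bias

# The dominant cost of this layer is the matrix multiplication X @ W,
# the same operator that gets distributed to compute nodes.
hidden = np.maximum(X @ W + b, 0)    # linear transform + ReLU activation

print(hidden.shape)  # (32, 64)
```

Stack many such layers, repeat the forward and backward passes over millions of batches, and the matrix multiplications dominate training time, which is why distributing them matters.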
We are in the early stages of building a network of compute nodes for AI/ML training, where speed is key. It has never been done before. With today’s public beta open-sourced, imagine all the possibilities this can unlock.
Raven Protocol: Q2 2019 Tech and Community Update:
https://medium.com/ravenprotocol/q2-2019-tech-and-community-update-4f836a9a1e97
Raven Protocol: Q3 2019 Tech Project Development Update:
https://medium.com/ravenprotocol/tldr-raven-stayed-heads-down-building-in-q3-2019-ae5f242dc15d
Raven Protocol: Q4 2019 Tech Project Development Update:
https://medium.com/ravenprotocol/happy-2020-an-important-update-from-the-raven-team-67b5e88e1b1d
Raven Protocol: Q1 2020 Tech Project Development Update:
https://medium.com/ravenprotocol/q1-2020-success-during-hard-times-3dcdbb0faba8
Raven Protocol Project Review:
https://cryptocalibur.com/portfolio-item/raven-protocol-review
Raven Protocol White Paper:
https://drive.google.com/file/d/1FAaVKkg_CjxMj-n1yHZc6ufcVDtOU1Ct/view?usp=sharing
OFFICIAL CHANNELS:
Official Email Address: founders@ravenprotocol.com
Official Website Link: http://www.RavenProtocol.com
Official Announcement Channel: https://t.me/raven_announcements
Official Telegram Group: https://t.me/ravenprotocol
Official Twitter: https://twitter.com/raven_protocol
Official Medium: https://medium.com/ravenprotocol
Official LinkedIn: https://linkedin.com/company/ravenprotocol
Official Github: https://www.github.com/ravenprotocol