Q1 2020: Success During Hard Times

Raven Protocol · Apr 13, 2020

There is no doubt the world is going through interesting times. While many projects and startups have shut down, parted ways with critical employees, or failed to find product-market fit, we are lucky to be here fighting with plenty of ammo. In fact, we are hiring AI/ML engineers and business development managers as we prepare for 2020 and beyond.

Let’s recap what we accomplished in Q1 and look at where we are today. Then we will let you imagine how that will play out in the future!

Blockchain Implementation on Binance Chain

We silently launched a “Coming Soon” page for our Proof of Steak program. We did not announce anything publicly; our community figured everything out on their own. What exactly did they figure out? Well, first of all, no rewards are being distributed at the moment 🤣

Thank you so much to the THORChain team for helping us kick off the staking program. They graciously offered to let us fork their frontend code! They are a true testament to Binance Chain companies helping Binance Chain companies. The way this team works will carry them very far. In fact, they are already very far: over 90% of their supply is staked 🤫

Just kidding. You may remember the part in our whitepaper about website partners. Users of those websites can potentially do a small unit of calculation (AI/ML training) and earn RAVEN. The Proof of Steak program is an implementation of just that. It is still in testing, so no rewards are being distributed yet. When it is publicly available, RAVEN holders will be able to come to the website, stake their tokens, leave the browser tab open, and earn whenever computations are sent to their device.

Architecture diagram of the transaction framework inside the Raven Ecosystem.
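To illustrate the flow described above, here is a conceptual sketch of the stake-and-compute loop. Every name in it is hypothetical; the real worker is the in-browser JavaScript engine described later in this post, and this Python version only shows the sequence of steps: stake, wait for a unit of work, compute it, get rewarded.

```python
import time

def staking_node(wallet, min_stake=1.0):
    """Hypothetical sketch of a staked browser node earning RAVEN for compute."""
    if wallet.staked_raven() < min_stake:
        raise RuntimeError("stake RAVEN before joining the compute pool")

    while True:
        task = wallet.poll_for_computation()   # a small unit of AI/ML training
        if task is None:
            time.sleep(5)                      # keep the tab open and wait
            continue
        result = task.run()                    # e.g. one matrix multiplication step
        reward = wallet.submit_result(result)  # RAVEN credited for verified work
        print(f"earned {reward} RAVEN")
```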

Some people have called this a SETI@home for AI/ML training, and others have called it a replacement for AWS/Google Cloud.

We’re flattered, but we have a lot of work to do. There are 52 stakers who found the Binance Chain test implementation, and they account for 67.524% of the entire RAVEN supply staked. Looking further ahead at how this could play out, we feel very lucky to have 18,000+ RAVEN holders. That’s a good start, but ideally the number of nodes receiving computations should be in the millions. A massive distribution of stakers is needed to scale.

Thank you for the kind words, Kai Ansaari. We love THORChain too ;)

Thanks so much to all the testers so far. It is great to have real-world tests and to catch edge cases. We fixed several bugs related to WalletConnect/Trust Wallet nonce issues, hanging ledgers with the Binance Chain, basic formatting issues, and analytics. We’ve already been communicating with many of you directly, but if you want to get even more involved, please don’t hesitate to reach out on Telegram.

Distribution Framework Testing (Private Beta)

We are working with three pilots and training in our internal private network. This is very important because we need to test the different components and make sure the calculations are being distributed correctly. As you can imagine, working with the AI/ML engineers and researchers at the organizations where our pilots are running requires a lot of coordination. Here’s a peek at what that looks like.

Architecture Diagram with different Raven Components

Raven Components

One key thing we figured out during the private beta is how our customers want to interface with our framework. Had we not actually gone and run these pilots, we would not have realized that the most efficient way to achieve adoption in these organizations was through Keras.

Keras Frontend — The Keras library is used to create deep learning and machine learning models. We use the word Frontend here not because it’s a website, but because it’s an abstraction layer. We use Keras’ modular and functional APIs to create models, which makes creating layers for neural networks and building complex architectures easier. You won’t need to know whether you’re using a Raven Backend, TensorFlow Backend, or Theano Backend. Obviously, if you want the speed, you should be using the Raven Backend.
Language: Python

Raven Backend — The Raven Backend is a Keras backend that will be responsible for training the models and scheduling the calculations. Again, the beauty of Keras is that it handles the backend problem in a modular way. You do not need to pick one single tensor library and tie that library to the implementation of Keras. Several different backend engines can be plugged seamlessly into Keras (see the sketch after this list). You can probably see why this is important for the adoption of Raven. We want to plug right into an AI/ML engineer’s workflow without the difficulty of learning a new framework.
Language: Python

RavOpJS Engine — This is a JavaScript engine responsible for performing mathematical operations such as matrix multiplication, partial differentiation, and others.
Language: JavaScript

Raven Server — This is a socket server that handles the communication between the Raven Backend and the RavOpJS Engine.
Language: Python
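To make the frontend/backend split concrete, here is a minimal sketch of what this could look like from an AI/ML engineer’s point of view. Multi-backend Keras chooses its engine from the KERAS_BACKEND setting (or keras.json), so the "raven" value below is our assumption for illustration; the model-building code itself is plain Keras and would be identical on the TensorFlow or Theano backends.

```python
import os

# Hypothetical backend name; swap in "tensorflow" or "theano" to run this today.
os.environ["KERAS_BACKEND"] = "raven"

from keras.models import Sequential
from keras.layers import Dense

# Nothing in the model definition needs to know which engine executes the tensor math.
model = Sequential([
    Dense(64, activation="relu", input_shape=(784,)),
    Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# model.fit(x_train, y_train, epochs=5)  # training is dispatched to the configured backend
```

The point is that adopting Raven should feel like flipping a configuration switch, not learning a new framework.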

Kickstarted Marketing Efforts on LinkedIn

We mentioned that we are hiring Business Development Managers. That is because we kickstarted marketing efforts on our LinkedIn page. As we planned our go-to-market strategy, we realized LinkedIn would give us a direct channel to prospects at various organizations. We decided to test this hypothesis, and the results are promising: the LinkedIn page grew from 137 to over 600 followers, and we are now in direct contact with AI/ML engineers.

We quickly grew from 137 to 611 followers during our marketing test.

Don’t forget to give us a follow!

Exchange Listings

We are always exploring ways to increase the liquidity of RAVEN. Thus, we successfully listed on our first centralized exchange, Bidesk.com.

We listed on our first centralized exchange, Bidesk.com, and it was a major success.

Community

The Raven Community is still strong during these hard times! Here’s some love from them in Q1 🙏🙏🙏

CryptoDiffer AMA
People who understand AI technology really love what we’re working on.
Great to have conversations about how gaming companies are using AI.
Another big player in the AI space, Ocean Protocol, is excited about the progress of Raven!
Generation Crypto named Raven one of the Top 10 Promising Projects
One of the rising stars of CT (Crypto Twitter), Kyle MacLeanX, has $RAVEN in his bags.
Another CT rising star giving us love over a glass of wine.
RAVEN is on Binance Chain, but the quality of developers on Ethereum is staggeringly high. Maybe we need an ERC-20 bridge ;)
Thanks to CryptoDiffer for the analysis.
Kumar understands well that we have a long way to go!
Thank you Christian. The crypto community loves your blockchain analysis!
Crypto YouTuber, Rey Santos, shows you how to buy RAVEN on Trust Wallet.
Receiving advice from the Matic team in Bangalore, India!

Raven Protocol: Q2 2019 Tech and Community Update:
https://medium.com/ravenprotocol/q2-2019-tech-and-community-update-4f836a9a1e97

Raven Protocol: Q3 2019 Tech project development Update:
https://medium.com/ravenprotocol/tldr-raven-stayed-heads-down-building-in-q3-2019-ae5f242dc15d

Raven Protocol: Q4 2019 Tech project development Update:
https://medium.com/ravenprotocol/happy-2020-an-important-update-from-the-raven-team-67b5e88e1b1d

Raven Protocol Project Review:
https://cryptocalibur.com/portfolio-item/raven-protocol-review

Raven Protocol White Paper:
https://drive.google.com/file/d/1FAaVKkg_CjxMj-n1yHZc6ufcVDtOU1Ct/view?usp=sharing

OFFICIAL CHANNELS:
Official Email Address: founders@ravenprotocol.com
Official Website Link: http://www.RavenProtocol.com
Official Announcement Channel: https://t.me/raven_announcements
Official Telegram Group: https://t.me/ravenprotocol
Official Twitter: https://twitter.com/raven_protocol
Official Medium: https://medium.com/ravenprotocol
Official LinkedIn: https://linkedin.com/company/ravenprotocol
