Concurrent Training of a Requester Graph Across Multiple Participating Providers

Raven Protocol · Jun 13, 2023

🚀 Release Notes 🚀

We are excited to announce the next version of Ravenverse which supports concurrent training of a Requester Graph across multiple participating Providers.

Requester Side

1. The number of participants required to compute a graph is now determined and set automatically by our backend, based on the complexity of the deep learning graph, its operations, weight sizes and several other parameters. Requesters can no longer set this number manually in the ravop.execute() function (see the first sketch after this list).

2. Example scripts based on fine-tuning the GPT-2 model have been added to the Ravenverse GitHub Repository. Requesters can first generate a ".pt" (PyTorch) model file and then load it up for distributed training in the Ravenverse (see the second sketch after this list). As an example, we have added a Poem Generator GPT which can be trained to write poems based on different emotions such as fear, anticipation, mystery and horror. Additionally, there's a Number-Sorting application using GPT that takes an input vector and simply sorts it.

3. A novel optimisation technique has been implemented for subgraph formation, with a significant speedup observed in graph compilation time.
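To illustrate the change in point 1, here is a minimal sketch of the Requester flow under the new behaviour, loosely based on the ravop README. Apart from ravop.execute(), which the notes above reference, the function names and parameters below are assumptions rather than a definitive API reference:

import ravop as R

# Authenticate the Requester session with a Ravenverse access token
# (function and parameter names assumed; see the ravop README).
R.initialize(ravenverse_token="<YOUR_ACCESS_TOKEN>")

# Define the graph to be computed across participating Providers.
R.Graph(name="poem_generator", approach="distributed")

# ... declare model layers / operations here ...

# Activate (compile) the graph, then execute it. Note that no participant
# count is passed: as of this release, the backend determines the number
# of Providers automatically from the graph's complexity.
R.activate()
R.execute()
R.track_progress()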
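The ".pt" checkpoint mentioned in point 2 can be produced with standard PyTorch tooling. A minimal sketch, assuming the Hugging Face transformers GPT-2 weights as the starting point (the actual example scripts in our repository may construct the model differently):

import torch
from transformers import GPT2LMHeadModel

# Download pretrained GPT-2 weights to use as the starting point
# for fine-tuning.
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Save a ".pt" checkpoint that can then be loaded for distributed
# training in the Ravenverse.
torch.save(model.state_dict(), "gpt2_poem_generator.pt")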

Provider Side

4. Improved ping-pong refresh rate for more rapid assignment of subgraphs to providers.

5. New subgraph computation mechanism with reduced payload size for more efficient transfer of results; a massive speedup has been observed.

6. Backup provider support for maintaining graph progress when a provider disconnects.

7. GPU benchmarking metrics revised based on graph complexity.
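No workflow changes are needed on the Provider side to benefit from the improvements above; they apply automatically once a Provider is connected. As a reminder, joining looks roughly like the sketch below. The import path and parameter name are assumptions based on the ravpy README, not a definitive reference:

from ravpy.distributed.participate import participate

# Connect this machine as a Provider. Subgraphs are assigned
# automatically, using the revised GPU benchmarking metrics.
participate(token="<YOUR_ACCESS_TOKEN>")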

Our Libraries:

Ravpy: https://pypi.org/project/ravpy/
Ravop: https://pypi.org/project/ravop/
RavDL: https://pypi.org/project/ravdl/

You can find the documentation for each library in its repository's README file. Please try them out and let us know if you run into any problems.

Raven Protocol GitHub: https://github.com/ravenprotocol

Enjoy the new release! ❤️

— The Raven Protocol Team

OFFICIAL CHANNELS:
Official Email Address: founders@ravenprotocol.com
Official Website Link: http://www.RavenProtocol.com
Official Announcement Channel: https://t.me/raven_announcements
Official Telegram Group: https://t.me/ravenprotocol
Official Twitter: https://twitter.com/raven_protocol
Official Medium: https://medium.com/ravenprotocol
Official LinkedIn: https://linkedin.com/company/ravenprotocol
Official Github: https://www.github.com/ravenprotocol
Official Substack: https://ravenprotocol.substack.com
Official Discord: https://discord.gg/Njq8QxYUKR
