Changing How Decentralised Artificial Intelligence Has Worked So Far

Mate Labs · Published in RavenProtocol · 5 min read · Jun 6, 2018

This is a guest post written by Mate Labs, who wanted to share with the Raven community just how badly this solution is needed.

A few years into the expansion of the global AI ecosystem, enthusiasm grew over its potential in diverse fields like business, health, finance and other services, which encouraged quite a few ventures to step into the market. Widespread adoption of AI, fueled by various open-source frameworks and architectures, produced multiple use-cases. However, a vast majority of ventures have found themselves stumbling over roadblocks that prevent them from putting those developments into practice.

The major stumbling block in implementing AI for most developers in startups is how mundanely time-consuming the process is.

ML models take a tedious amount of time to train: the calculations require several GPUs to execute and still take weeks or months to complete. Cost is yet another factor: huge prices are paid to acquire GPUs and cloud storage to run ML training. These roadblocks make many companies apprehensive about automating their model updates, as the associated cost builds up exponentially.

Anyone who looks at the numbers can see how much startups are spending on AI implementation at present. The need for innovative alternatives led people to the idea of garnering compute power from idle sources, such as unused GPUs, and more specifically from those who are willing to trade it for crypto tokens on the blockchain.

Using blockchain to establish secure, worldwide connections between people across all fronts of life and business has evidently created a ripple effect in the economy. This development has become a game-changer as innovators have built technology that can trade resources among specific communities that want to connect with each other.

Sadly, however, none of them has achieved true decentralisation in training deep learning models, in the strict sense of the term.

What Does True Decentralisation Entail?

True decentralisation does not happen until the training and implementation of AI models are affordable and efficient for everyone.

The biggest struggle for developers and engineers in building efficient automated systems is the expense and time required to successfully train deep learning models. For instance, training a model on ImageNet, a dataset of about 1 million images, each of size 256×256, can take months to complete on a single GPU server. The time required drops to 30–40 hours when the dataset is split into mini-batches and spread across tens of powerful GPUs instead of one.
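To make the arithmetic behind that speed-up concrete, here is a minimal sketch in plain TypeScript, assuming a simple round-robin split (the worker count and the shardDataset helper are illustrative, not Raven's actual scheduling), of how a 1-million-image dataset can be sharded so each worker only processes its own slice of the mini-batches:

```typescript
// Minimal sketch of data-parallel sharding: split the example indices
// across N workers so each worker trains on only its own mini-batches.
// shardDataset is a hypothetical helper, not part of any Raven API.

type ExampleId = number;

function shardDataset(exampleIds: ExampleId[], numWorkers: number): ExampleId[][] {
  const shards: ExampleId[][] = Array.from({ length: numWorkers }, () => []);
  exampleIds.forEach((id, i) => shards[i % numWorkers].push(id)); // round-robin split
  return shards;
}

// With ~1,000,000 images and tens of workers, each worker sees only a
// fraction of the data per pass, so wall-clock time shrinks roughly in
// proportion to the number of workers (ignoring synchronisation overhead).
const allImages = Array.from({ length: 1_000_000 }, (_, i) => i);
const shards = shardDataset(allImages, 32);
console.log(shards.map((s) => s.length)); // ~31,250 examples per worker
```

Ignoring synchronisation overhead, this proportional split is what takes the ImageNet example from months on one GPU down to tens of hours across many.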

More than half of organisations worldwide have confirmed that they use, or plan to use, AI at the core of their strategies over the coming 5 years. These organisations want to personalise their user experience or optimise their operations to accelerate their profits. In the effort to maximise the efficiency of the resulting models, more and more data is fed in, which has in turn increased compute power requirements around the world.

And that is exactly what Raven has found a solution for. Successful trials in using blockchain to facilitate the sharing of resources have prompted the creation of a more down-to-earth platform, where even the compute power of an ordinary laptop or smartphone can contribute to training a model anywhere in the world.

Distribution of Deep Learning Model Training and Sharing of Latent Compute Resources through the Blockchain

What was once thought to be an immutable process of strenuous uploading and processing of algorithms, dependent on considerable resources, is now set to change with the blockchain-integrated platform at Raven Protocol. The protocol allows anyone, anywhere to contribute compute power, even as little as that of a smartphone. What makes the system sustainable is the distribution of the monetary value generated in the process, not only to the companies but also to their consumers.

Raven's platform is structured around giving entrepreneurs, startups, and companies struggling to compete in the current economy the ability to train their ML models inside the protocol by sharing their algorithms, datasets and compute power. People who are not themselves part of a training programme can also share their resources inside the protocol, which multiplies the availability of the necessary compute power.
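As an illustration only (the post does not specify the protocol's actual interfaces, so the types and matching function below are hypothetical assumptions), the two roles can be pictured like this:

```typescript
// Hypothetical shapes; the real Raven Protocol interfaces are not
// published in this post, so these types are illustrative assumptions.

interface TrainingJob {
  modelSpec: string;      // the algorithm/architecture the requester shares
  datasetUri: string;     // where the training data lives
  rewardPerBatch: number; // tokens paid out per mini-batch processed
}

interface Contributor {
  id: string;
  deviceClass: "smartphone" | "laptop" | "gpu-server";
  available: boolean;     // contributors can join even if they are not
                          // part of the training programme themselves
}

// A requester posts a job; any available contributor can pick up its batches.
function matchContributors(job: TrainingJob, pool: Contributor[]): Contributor[] {
  console.log(`Matching contributors for ${job.datasetUri}`);
  return pool.filter((c) => c.available);
}
```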

Taking A Nuanced Approach to Decentralisation

Inclusivity has been the goal of every other institution that has tried to bring about a decentralised approach to Machine Learning. Blockchain and AI are a natural combination, and multiple efforts have been made to enhance the experience of adopting AI across all spheres of practical implementation.

Some of those applications are still limited, however, and providing solutions to those limitations can create a more sustainable ecosystem. Raven is therefore expanding the scope of the ecosystem to further the cause of decentralising AI.

Browser Based Application

A browser-based application makes it convenient for anyone to hop on and off the network whenever they feel like contributing.

No Additional Dependency at the Contribution End

Requiring contributors to install additional software can delay their support to the community, which is why any such dependencies have been eliminated.

Speedy Training

The two features above, combined with an increasing number of nodes, together deliver fast training of deep neural networks.

Dynamic Computational Graph

Dynamic allocation of nodes lets contributor nodes hop on and off ad hoc. In short, the network is robust enough to handle the resulting churn.
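What that robustness might look like on the coordinator side can be sketched as follows; this is a hypothetical illustration (the NodeState shape and reassignOnDrop function are assumptions, not Raven's published scheduling logic) of handing a dropped contributor's pending mini-batches to the nodes still online:

```typescript
// Hypothetical coordinator-side churn handling; Raven's real scheduling
// logic is not described in this post, so this is an illustrative sketch.

interface NodeState {
  id: string;
  online: boolean;
  pendingBatches: number[]; // mini-batch indices assigned to this node
}

// When a contributor hops off, hand its unfinished batches to online nodes
// in round-robin order so training keeps progressing.
function reassignOnDrop(nodes: NodeState[], droppedId: string): void {
  const dropped = nodes.find((n) => n.id === droppedId);
  const online = nodes.filter((n) => n.online && n.id !== droppedId);
  if (!dropped || online.length === 0) return;

  dropped.online = false;
  dropped.pendingBatches.forEach((batch, i) => {
    online[i % online.length].pendingBatches.push(batch);
  });
  dropped.pendingBatches = [];
}
```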

Javascript Based DL Framework

All of this is made possible by a new deep learning framework built on JavaScript.
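The post does not describe that framework's API, so purely as an illustration of what "training in the browser" means, here is a minimal sketch in plain TypeScript (no external library) of a single stochastic-gradient-descent step for a linear model; the sgdStep function is a hypothetical stand-in, not Raven's framework:

```typescript
// Minimal sketch of in-browser training in plain TypeScript: one
// stochastic-gradient-descent step for a linear model y = w*x + b.
// Illustrative only; not the API of Raven's JavaScript framework.

interface LinearModel {
  w: number;
  b: number;
}

function sgdStep(model: LinearModel, x: number, y: number, lr: number): LinearModel {
  const pred = model.w * x + model.b;
  const err = pred - y;          // derivative of 0.5*(pred - y)^2 w.r.t. pred
  return {
    w: model.w - lr * err * x,   // gradient of the loss w.r.t. w
    b: model.b - lr * err,       // gradient of the loss w.r.t. b
  };
}

// A browser tab contributing compute would run many such steps on the
// mini-batches assigned to it, then report updated weights (or gradients)
// back to the network.
let model: LinearModel = { w: 0, b: 0 };
model = sgdStep(model, 2, 5, 0.01);
```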

Interestingly, the system also creates a new revenue opportunity for websites that currently survive solely on advertisements built from their users' data. No matter how many of those institutions are called out or put on trial, similar organisations will always depend on user data. Privacy concerns are alarmingly high at the moment, and pure consumer websites face a dilemma, left with no alternative but to use their users' data to serve them ads. How Raven can tackle this will be interesting to watch.


Mate Labs · RavenProtocol
We're trying to enable Machine Learning and Deep Learning for one and all, irrespective of whether a user knows how to code.