Decentralized supervised neural network on the Blockchain: Giving mining a good purpose

The value of gold is backed by the value of the resources needed to extract and refine it, plus what the market is willing to pay. Today Bitcoin and Ethereum are based on the PoW (Proof of Work) scheme, a system created to emulate the difficulty of extracting something physically valuable, like gold. But this system is not environmentally friendly, because we waste a lot of energy solving mathematical puzzles that serve no useful purpose.

There is another task that GPUs are very good at, one that also takes a lot of time and energy but is far more useful to us than computing random hashes as in Ethereum or Bitcoin: optimizing a neural network.

A schematic explanation of the interaction between nodes and the blockchain using the use case of the Celeba50K dataset.

The use case in the blockchain

I am going to speak about a use case where the blockchain technology can add a lot of value to the Artificial Intelligence field.

Different datasets and different neural networks can be trained over the blockchain. To explain the use case I am going to use the Celeba50K dataset, which contains 50,000 different faces of celebrities, together with a GAN (Generative Adversarial Network).

Let’s suppose that the nodes are training a neural network and each runs an IPFS node that stores two things: the dataset with thousands of images of celebrities, and a checkpoint file containing the weights, the values the neural network has learned during its training.

The network can also decide to improve a current dataset or to request a new one, so we end up with a decentralized network of knowledge built on multiple datasets.


Let’s look at both scenarios: a use case training on an existing dataset, and another requesting new datasets from the nodes:

Use case 1: Improving the current dataset

This is the passive way of mining. In the case of Celeba50K we have a GAN, so each node tries different values for the generator and the discriminator to compute the weights. In a GAN, the generator tries to produce, from noise, outputs similar to the real inputs, and the discriminator tries to detect whether a generated output is fake or not.
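The adversarial setup can be sketched in a few lines. This is only a toy 1-D illustration with numpy (the real Celeba50K case uses convolutional networks on images): the "data" are scalars, the generator is an affine map of noise, and the discriminator is a logistic regressor. All the names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy 1-D stand-in for the real data: samples from N(4, 1).
real = rng.normal(4.0, 1.0, size=(64, 1))

# Generator: G(z) = w_g * z + b_g, mapping noise to a sample.
w_g, b_g = 1.0, 0.0
# Discriminator: D(x) = sigmoid(w_d * x + b_d), "real or fake?".
w_d, b_d = 0.1, 0.0
lr = 0.05

for step in range(200):
    z = rng.normal(size=(64, 1))
    fake = w_g * z + b_g

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    d_real = sigmoid(w_d * real + b_d)
    d_fake = sigmoid(w_d * fake + b_d)
    w_d -= lr * (np.mean((d_real - 1) * real) + np.mean(d_fake * fake))
    b_d -= lr * (np.mean(d_real - 1) + np.mean(d_fake))

    # Generator step: push D(fake) toward 1 (fool the discriminator).
    d_fake = sigmoid(w_d * fake + b_d)
    grad_fake = (d_fake - 1) * w_d   # gradient of -log D(fake) w.r.t. fake
    w_g -= lr * np.mean(grad_fake * z)
    b_g -= lr * np.mean(grad_fake)
```

After a few hundred steps the generator's output distribution drifts toward the real one, which is the same dynamic each node exploits when it trains on the celebrity images.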

Some new celebrity faces generated by the GAN. I used 40 filters for the discriminator and 160 for the generator over the set of 50K images.

In this scenario, each node receives the images from the Celeba50K dataset and starts training on its GPU. When training finishes, the node shares its weights with the network, and each node checks how good that training was compared with the others, for example by measuring how close the generated outputs are to the real inputs. The better the training, the bigger the reward.
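One way the comparison could work is to score each node's submission by how closely samples generated from its weights match the real data. The sketch below is an assumption about the mechanism, using a crude summary-statistics distance as a stand-in for proper GAN evaluation metrics such as FID; the node names and sample distributions are invented for illustration.

```python
import numpy as np

def quality_score(generated, real):
    """Lower is better: distance between simple summary statistics of the
    generated and real samples (a crude stand-in for metrics like FID)."""
    return (abs(generated.mean() - real.mean())
            + abs(generated.std() - real.std()))

rng = np.random.default_rng(1)
real = rng.normal(4.0, 1.0, size=1000)

# Hypothetical submissions: samples drawn from the weights each node
# shared after its training run.
submissions = {
    "node_a": rng.normal(3.9, 1.1, size=1000),  # well trained
    "node_b": rng.normal(0.0, 1.0, size=1000),  # barely trained
    "node_c": rng.normal(4.5, 2.0, size=1000),  # partly trained
}

scores = {node: quality_score(s, real) for node, s in submissions.items()}
best = min(scores, key=scores.get)   # the best training earns the reward
```

Since every node holds the dataset, every node can recompute these scores, so the reward assignment itself can be verified by the network.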

Use case 2: Creating new datasets for the network

This is the active way of mining. Imagine we want to create a new dataset and request the data (images, texts, sounds…) from the network. The nodes vote for new datasets and the network seeks consensus on which ones are accepted. The accepted request is then broadcast to the network, the nodes can submit new items for the dataset, and accepted submissions are paid by the network.
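A minimal sketch of the voting round, assuming a 2/3 supermajority threshold (the article only says "consensus", so the exact rule is an assumption, as are the node and dataset names):

```python
from collections import Counter

def accepted_datasets(votes, n_nodes, threshold=2 / 3):
    """Return the proposed datasets that reached the vote threshold."""
    counts = Counter()
    for node, proposals in votes.items():
        counts.update(set(proposals))   # one vote per node per proposal
    return {p for p, c in counts.items() if c / n_nodes >= threshold}

votes = {
    "node_a": ["bridges", "birds"],
    "node_b": ["bridges"],
    "node_c": ["bridges", "cars"],
}
result = accepted_datasets(votes, n_nodes=3)   # only "bridges" passes
```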

The network will always have a decent base of knowledge, so it can check the submissions with some accuracy. For example, if we are requesting "images of bridges", we can easily discriminate fake objects, ban illegitimate nodes and promote trustworthy ones.
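Banning and promoting nodes could be driven by a simple reputation tracker fed by whether each submission passes the network's check. This is a sketch under assumed rules (the acceptance-rate cutoff, the minimum sample size, and the node names are all illustrative):

```python
class Reputation:
    """Track per-node submission outcomes; ban nodes whose acceptance
    rate falls below a cutoff once enough evidence has accumulated."""

    def __init__(self, min_rate=0.5, min_submissions=5):
        self.stats = {}            # node -> (accepted, total)
        self.min_rate = min_rate
        self.min_submissions = min_submissions

    def record(self, node, accepted):
        acc, total = self.stats.get(node, (0, 0))
        self.stats[node] = (acc + int(accepted), total + 1)

    def is_banned(self, node):
        acc, total = self.stats.get(node, (0, 0))
        if total < self.min_submissions:
            return False           # not enough evidence yet
        return acc / total < self.min_rate

rep = Reputation()
for _ in range(6):
    rep.record("honest", accepted=True)    # valid bridge images
    rep.record("spammer", accepted=False)  # fake objects, rejected
```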

Mining with Neural Networks, a PoW + PoC schema

So if you want to be a successful node you will need a lot of GPU computing power (Proof of Work) and a lot of free disk space to store the datasets (Proof of Capacity).

The datasets used to train neural networks are usually VERY large (MIT Places205 contains 2.5 million images and weighs 1.6 TB). Adding some hashing operations to verify that the data is really stored on your machine is a natural way to build a Proof of Capacity (PoC).
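One way such a check could work (this is an assumption about the protocol, not something the article specifies): a verifier sends a random nonce and a byte range, and the node must return the hash of exactly those stored bytes, which it can only do if the data is actually on disk.

```python
import hashlib

def poc_response(dataset: bytes, nonce: bytes, start: int, length: int) -> str:
    """Prove possession of dataset[start:start+length] for this challenge."""
    chunk = dataset[start:start + length]
    return hashlib.sha256(nonce + chunk).hexdigest()

# Stand-in for a stored dataset and an illustrative challenge.
dataset = bytes(range(256)) * 1000
nonce = b"challenge-42"
proof = poc_response(dataset, nonce, start=512, length=128)

# Any verifier that also holds the data recomputes and compares.
expected = hashlib.sha256(nonce + dataset[512:640]).hexdigest()
```

Because the nonce changes every challenge, a node cannot precompute answers and delete the data.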

Combining PoW and PoC in this way is a new concept of combined mining.

We are on the path for a decentralized superintelligence

I can see a relatively near future with a shared network of AI knowledge that everyone can connect to, either to feed it with new data or to use it as a service from any of our home, work or personal gadgets.

Is this the path to the decentralized superintelligence? Should we implement a stop button for a decentralized Skynet? :-)

Let’s see. For now, humans are more excited than scared about AI, and it should always stay that way.