Nature vs Von Neumann vs Neural networks

Neural Quine: Is Self-Replicating AI Real?

A story on reproduction in biology, machines, and AI

Alexandr Honchar
Nov 25, 2019 · 12 min read
https://en.wikipedia.org/wiki/Self-replicating_machine

Self-replication in nature

DNA replication http://jonlieffmd.com/blog/dna-proofreading-correcting-mutations-during-replication-cellullar-self-directed-engineering

Self-replicating machines

Rather scary mechanical self-reproduction from Cornell University
Von Neumann’s Theory of Self-Reproducing Automata: A Useful Framework for Biosemiotics?, Dennis P. Waters

Why do we need self-reproducing AI?

Quines

s = 's = %r\nprint(s%%s)'
print(s%s)
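Why does this print itself? `%r` substitutes `repr(s)` — the string complete with its quotes and escape sequences — while `%%` collapses to a single literal `%`, so `s % s` reconstructs the whole two-line program. A small self-check (my own verification harness, not from the article) confirms the output is a fixed point:

```python
import io
import contextlib

# The quine's template: %r re-inserts s with quotes and escapes intact,
# and %% collapses to a literal %, reconstructing the original source.
s = 's = %r\nprint(s%%s)'
source = s % s  # exactly the two-line program above

# Executing the generated text prints the same text again: a fixed point.
buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    exec(source)
assert buf.getvalue().rstrip("\n") == source
```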

Neural Quines

The architecture of the vanilla neural quine

The architecture of the auxiliary task neural quine

Weights reproduction loss for the regular neural quine

Total loss for the auxiliary neural quine
AuxiliaryQuine(
(embedding): Embedding(4096, 64)
(linear): Linear(in_features=64, out_features=64, bias=False)
(output): Linear(in_features=64, out_features=1, bias=False)
(class_output): Linear(in_features=64, out_features=10, bias=False)
)
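Putting the two losses described above into code — a hedged sketch in PyTorch (matching the module printouts); the `lam_*` coefficient names are mine, standing in for the lambda weights swept in the experiments below:

```python
import torch
import torch.nn.functional as F

def weights_reproduction_loss(predicted_w, actual_w):
    # Self-replication term: squared L2 distance between the weights the
    # network predicts for itself and its actual current weights.
    return ((predicted_w - actual_w) ** 2).sum()

def auxiliary_total_loss(predicted_w, actual_w, logits, labels,
                         lam_quine=1.0, lam_task=1.0):
    # Total loss for the auxiliary quine: a weighted sum of the
    # self-replication term and the MNIST classification cross-entropy.
    return (lam_quine * weights_reproduction_loss(predicted_w, actual_w)
            + lam_task * F.cross_entropy(logits, labels))
```

Setting `lam_task` to zero recovers the vanilla quine objective.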

Experiments

Vanilla neural quine

VanillaQuine(
(embedding): Embedding(10000, 100)
(linear): Linear(in_features=100, out_features=100, bias=False)
(output): Linear(in_features=100, out_features=1, bias=False)
)
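The printed module can be reconstructed as a runnable sketch. The layer sizes follow the printout — note the 100×100 hidden matrix has exactly the 10,000 entries the embedding indexes — but the coordinate encoding and the absence of a nonlinearity here are my assumptions, not taken from the article:

```python
import torch
import torch.nn as nn

class VanillaQuine(nn.Module):
    """Each of the 10,000 entries of the 100x100 hidden weight matrix gets
    an index; the network maps that index to a predicted value for the
    corresponding weight."""
    def __init__(self, n_weights=100 * 100, hidden=100):
        super().__init__()
        self.embedding = nn.Embedding(n_weights, hidden)
        self.linear = nn.Linear(hidden, hidden, bias=False)
        self.output = nn.Linear(hidden, 1, bias=False)

    def forward(self, idx):
        return self.output(self.linear(self.embedding(idx))).squeeze(-1)

# One gradient-based training step: predict the current hidden weights
# and minimize the squared error against a detached snapshot of them.
model = VanillaQuine()
opt = torch.optim.SGD(model.parameters(), lr=1e-3)
idx = torch.arange(100 * 100)
target = model.linear.weight.detach().reshape(-1)  # true weights, frozen
loss = ((model(idx) - target) ** 2).sum()
opt.zero_grad()
loss.backward()
opt.step()
```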
Gradient-based training. First image: distributions of the real and predicted network weights; second and third images: the real and predicted weight matrices, respectively

Regeneration training. First image: distributions of the real and predicted network weights; second and third images: the real and predicted weight matrices, respectively
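"Regeneration" here — my reading of the caption, so treat this as an assumption — means overwriting the network's hidden weights with its own predictions rather than taking gradient steps, iterating toward a self-consistent fixed point. A tiny self-contained sketch:

```python
import torch
import torch.nn as nn

h = 8                                    # tiny hidden size for illustration
emb = nn.Embedding(h * h, h)             # coordinate embedding (kept fixed)
lin = nn.Linear(h, h, bias=False)        # the weights being regenerated
out = nn.Linear(h, 1, bias=False)

idx = torch.arange(h * h)
for _ in range(3):                       # a few regeneration passes
    with torch.no_grad():
        pred = out(lin(emb(idx))).reshape(h, h)
        lin.weight.copy_(pred)           # replace weights with predictions
```

Whether regeneration alone converges depends on the initialization; the plots above compare it against gradient-based training.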

Auxiliary quine

Typical convergence plot for training the auxiliary neural quine: the classification loss spikes upward several times before yielding to the self-replication loss

Weights prediction lambda 1.0, MNIST classification lambda 1.0. First image: distributions of the real and predicted network weights; second and third images: the real and predicted weight matrices, respectively

Weights prediction lambda 1.0, MNIST classification lambda 1000.0. First image: distributions of the real and predicted network weights; second and third images: the real and predicted weight matrices, respectively

Weights prediction lambda 1.0, MNIST classification lambda 0.0001. First image: distributions of the real and predicted network weights; second and third images: the real and predicted weight matrices, respectively

Solo classification

The first picture shows the regular convergence of the MLP classifier; the second shows a histogram of weight magnitudes; the last shows the weight matrix itself

Conclusions

Cantor’s Paradise

Medium’s #1 Math Publication!

Alexandr Honchar

Written by

Co-founder of consulting firm Neurons Lab and advisor to AI products builders. On Medium, I write about proven strategies for achieving ML technology leadership
