My new conjecture in P vs. NP

calhoun137
5 min read · Mar 25, 2020

This is the third article in a series based on my original work on the theory of self-reproducing machines. In the first article I laid out the theory of bio-programs, and the previous article dealt with the subject of bio-computers. In this article I will explain the relationship between bio-computers and Turing machines, how it led me to a new conjecture about the P vs. NP problem, what this conjecture is, and what it suggests for the future of mathematics.

The Conjecture

Roughly speaking, my conjecture can be stated in the following way: it is not possible to solve NP-complete problems in polynomial time using Turing machines, but it is possible using self-reproducing machines. In other words, the theory of self-reproducing machines is a more general theory of computation than that of Turing machines, and contains Turing machines as a special case.

Statement of the P vs. NP problem

Let’s briefly review the statement of the P vs. NP problem. Roughly speaking, there exists a class of problems called “NP-complete” which share a certain property: if it were possible to solve any one problem in this class with a method that completes in a number of steps bounded by a polynomial function of the input size (“completes in P time”), then that method could be used to solve every other NP-complete problem in P time as well.

The P vs. NP problem is to determine whether there exist problems whose solutions can be checked in P time but which cannot be solved in P time without knowing the solution beforehand. This last point about “knowing the solution beforehand” is the key insight that led to my conjecture.
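To make the “checked in P time” half concrete, here is a minimal sketch of my own (not part of the formal statement of the problem) of a polynomial-time verifier for Boolean satisfiability, a standard NP-complete problem. Given a proposed assignment, checking it takes time linear in the size of the formula, even though no polynomial-time solver is known.

```python
# Verifying a proposed solution to SAT (an NP-complete problem) takes
# time linear in the size of the formula -- this is "checkable in P time".

def verify_sat(clauses, assignment):
    """clauses: list of clauses, each a list of ints (positive literal i
    means variable i is true, negative means false).
    assignment: dict mapping variable number -> bool."""
    for clause in clauses:
        if not any(assignment[abs(lit)] == (lit > 0) for lit in clause):
            return False  # this clause is unsatisfied
    return True

# (x1 or not x2) and (x2 or x3)
clauses = [[1, -2], [2, 3]]
print(verify_sat(clauses, {1: True, 2: False, 3: True}))  # True
```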

Considering that neural networks “learn” what the answer to a problem is, that they are “trained” to create what is in effect a very complicated lookup table, and that experimentally (and heuristically, at least) they appear able to produce answers to instances of NP problems quickly, this suggests a plan of attack.

There is a natural relationship between Turing machines and a more general theory of computation, analogous to the computational power of the human nervous system, which requires the mathematics of self-reproducing machines first developed by John von Neumann. In this theory the atomic elements are self-reproducing machines, but it possesses the same Turing-complete properties that are so desirable about Turing machines.
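For concreteness, here is a minimal Turing machine simulator, the standard textbook object that any more general theory of computation would have to contain as a special case. The machine and encoding below are illustrative choices of mine, not anything specific to self-reproducing machines.

```python
# A minimal Turing machine: a transition table mapping (state, symbol)
# to (new state, symbol to write, head move). Example machine: flip
# every bit, moving right, until a blank is read.

def run_tm(tape, transitions, state="start", blank="_", max_steps=1000):
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_tm("0110", flip))  # prints 1001_
```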

Neural Networks as a Power Series Expansion

Roughly speaking, a neural network is equivalent to a matrix power series expansion in a ridiculously high-dimensional space, where the coefficients in this expansion correspond to the weights of the nodes. This guiding principle first came to my attention when I read the paper by G. Cybenko on approximation by superpositions of a sigmoidal (threshold) function.
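Here is a minimal numpy sketch of the single-hidden-layer form studied by Cybenko, f(x) = Σⱼ aⱼ σ(wⱼ·x + bⱼ). The shapes, names, and random weights are illustrative assumptions of mine, but they show how the learned weights play the role of coefficients in the expansion.

```python
import numpy as np

# Cybenko's superposition form: f(x) = sum_j a_j * sigma(w_j . x + b_j).
# A one-hidden-layer network is exactly this kind of expansion, with the
# learned weights acting as the coefficients.

def sigma(z):
    return 1.0 / (1.0 + np.exp(-z))  # sigmoidal "threshold" function

def network(x, W, b, a):
    """x: input (d,); W: (k, d) hidden weights; b: (k,) biases;
    a: (k,) output coefficients."""
    return a @ sigma(W @ x + b)

rng = np.random.default_rng(0)
d, k = 3, 8  # input dimension, hidden width
W, b, a = rng.normal(size=(k, d)), rng.normal(size=k), rng.normal(size=k)
print(network(rng.normal(size=d), W, b, a))
```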

The general linear group of dimension N is defined as the set of all N x N matrices with non-zero determinant. I have some other conjectures about these types of mathematical systems, where the matrices in the expansion are not arbitrary but belong to a specific subset of the general linear group in dimension N. These conjectures relate to the way the neural network is trained, and consist of a series of statements about the relationship between an abstract subset of the general linear group and what the neural net is doing under the hood to, roughly speaking, “create a lookup table in the space of monster group power series”. I have been having a hard time giving meaningful definitions to the intuitive ideas which suggest there might be something here.
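For reference, here is a quick numerical membership test for the general linear group. In floating point, “non-zero determinant” has to be read as “not numerically singular”, and the tolerance below is an arbitrary choice of mine.

```python
import numpy as np

# GL(N): the N x N matrices with non-zero determinant, i.e. the
# invertible matrices.

def in_general_linear_group(M, tol=1e-12):
    M = np.asarray(M, dtype=float)
    rows, cols = M.shape
    return rows == cols and abs(np.linalg.det(M)) > tol

print(in_general_linear_group([[1, 2], [3, 4]]))  # True, det = -2
print(in_general_linear_group([[1, 2], [2, 4]]))  # False, det = 0
```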

In any case, these conjectures are not relevant to the present discussion. The main purpose of discussing this aspect of neural network theory is to reveal the relationship between the mathematics of neural networks and the more classical parts of mathematics related to the theory of computation and Turing machines, and to suggest possible paths for attempting to prove or disprove the conjecture.

The mathematics of self-reproducing machines

The book “Theory of Self-Reproducing Automata” by John von Neumann costs over $700. I have had the opportunity to read it and, to be honest, found it disappointing: I was completely unable to grok the theory, and it went way over my head. He uses obscure notation, and the book consists of a series of papers, lecture notes, and things of that nature that are not well organized. I need to find another copy and read it again now that I have a much firmer grasp of the theory. I believe it is highly likely this book contains results which are relevant to an attempt to prove my conjecture.

This conjecture is the result of many years of thinking about the P vs. NP problem, but the subject is so technically difficult that I hesitate to provide more details about the progress I have made on this question, because it is not much and possibly all wrong.

My intuition says that if it were possible to show that Turing machines are a special case of a more general theory of computation, namely the theory of self-reproducing machines, then that might be a foothold for beginning to make progress on this difficult problem.

Implications for Science

The modern computer and the internet have revolutionized modern society in ways that were previously thought impossible. In many ways we owe the invention of the modern computer to von Neumann, after whom the architecture of the modern computer is named.

In the 20th century, science was divided into many fields, and a brand new field was born: computer science. In my next article I will describe how the theory of self-reproducing machines has the potential to re-unify all branches of modern science in ways that were previously unimaginable.

tl;dr

Since neural nets learn and, in effect, build a lookup table, they form the basis for a generalization of the theory of computation based on Turing machines, which suggests certain relationships between abstract concepts in this theory that lead to a new conjecture.
