Peter van den Engel
2 min read · Nov 29, 2017


Interesting subject, although it strikes me as lacking a lot of definition.

First of all, there is no clear definition of what intelligence is. I would roughly say it is knowledge superior to instinct, as in automated response systems. That still does not define the field, because you or a machine can be very exact in an equation, but how relevant is the equation itself? Do you believe being able to play chess improves your living conditions? I do not.

I suppose intelligence tests are calibrated in a certain way, to quickly see references and answer questions, but how relevant is that when all the examples are abstract? I believe the rating system itself is too questionable to produce definite conclusions.

A guide for self-performed reactions is a requirement. Biological beings have to find their own energy. This is not a requirement for a machine, so how on earth would you call it more intelligent when the requirement does not apply? It would be irrelevant and would not know what to do; or in other words, what should it have to do, with no requirements?

We set our own requirements, and a machine has none other than what we ask it to do or solve. I am sure it can get smart at that, which is already more than we could wish for. But when our future problems are unknowns, how would a machine know about them? If we cannot even solve the problems we know now, because we apparently do not understand them and cannot even define them, what good would a machine do?

Of course it is very useful when a machine or a robot can perform simple tasks, which can also be broadened, but that's it. Obviously, since our environment consists of time and space, it would be irrelevant to concentrate on coincidence without those requirements. It makes no sense.
