Meet me at the Bottom

How deep is the ocean of our thinking? As our knowledge of science and reality grows, there are two phenomena that seemingly contradict each other. First, the structures to be studied become deeper and more complex: at each step new causes of deeper observed consequences have to be described, and new linguistic structures have to be invented to describe the deeper structural properties. For example, after the hypothesis that everything in nature is built from indivisible atoms was dropped, we have seen electrons, neutrons, protons, fermions, bosons, quarks, gluons, neutrinos and so on invade the picture.

The extended linguistic complexity is also necessary to expound the increasingly complex mathematical models.

Second, when comparing the theoretical models to reality, the structures needed had to become more and more general, almost chaotic: from Euclidean geometry to Riemannian geometry, to manifolds and beyond, with fewer and fewer restrictions on the notion of space or matter. The two phenomena do not contradict each other; in fact they are aspects of the same property of the process of the deepening of scientific research.

In mathematics it is usual to have a vertical structure in the development: each new theory is built upon the older one, not replacing it but adding to it.

That is why students who forget what they learned a few months ago can never understand mathematics past some elementary level.

The linguistic development always has to follow the deepening, but there it is more a broadening of the language (compare it to digging a deep hole on the beach or in your garden: in order to go deeper it is necessary to make the hole broader!). To know a language rather well you need at least 30,000 words, but some people get along with just 10,000, so linguistic capacity is personal, and it is essential for the level of ideas you are able to express with enough fine structure. It is only a guess, but at the current speed of the growth of science it may be necessary to add a thousand new words to the description of any scientific domain every five years; that would be 50,000 words in 250 years, a very substantial development in the linguistic ability of people, also taking into account that social development and the changes in society, administration and applied technology also require an extension of the vocabulary and the understanding of more complex situations.

There is probably a limit to the capacity of the brain: however you try to shorten the necessary tower of definitions by replacing a part of the tower with one new definition, it remains necessary to understand all the steps in the construction in order to understand the final construction.

This phenomenon is already met in computer programming: software engineers are unable to understand the whole of the existing software, so they invented retro-engineering (reverse engineering) of it. The result is a program they do not really know, but which, experimentally, seems to do everything they want.
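
A minimal sketch of what that looks like in practice (the "legacy" routine below is of course just an invented stand-in): instead of understanding the code, one pins down its observed behaviour with so-called characterization tests and trusts it experimentally.

```python
# An invented "legacy" routine: imagine it was inherited without documentation
# and nobody on the team fully understands why it is written this way.
def legacy_normalize(values):
    out = []
    for v in values:
        v = float(v)
        if v < 0:
            v = 0.0                    # silently clips negatives (undocumented!)
        out.append(round(v / 100.0, 2))
    return out


# "Retro-engineering" in practice: we do not derive the behaviour from first
# principles, we record what the black box currently does and rely on that.
def test_characterize_legacy_normalize():
    assert legacy_normalize([50, 200]) == [0.5, 2.0]
    assert legacy_normalize([-10]) == [0.0]   # behaviour discovered by experiment
    assert legacy_normalize([]) == []


if __name__ == "__main__":
    test_characterize_legacy_normalize()
    print("The black box still behaves as observed - but do we understand it?")
```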

This type of retro-engineering is not acceptable in logic or mathematics, hence also not in physics, so we inevitably end up with smaller and smaller groups of specialists who really understand the newest developments. There are also more and more scientists who act like retro-engineers of their domain: they use existing theory as given fact but do not necessarily understand it in full detail (and never check all the details). I once posted about the limits of knowledge and of the possibilities of linguistic expression. That limit is drawing closer!

It is not an accident that it was recently discovered that a large percentage of scientific papers in the experimental sciences are wrong, even fraudulent, and report irreproducible results. Except for the fraud cases, prompted by publication pressure, most of the wrong results seem to stem from a lack of understanding of the theory of statistics, leading to wrong applications that disregard the theoretical conditions necessary for applying a test correctly and drawing correct conclusions from it. This is reinforced by online statistics apps that are used in every situation, irrespective of whether the experiment actually fits the framework of the available automatic calculations or not.
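
To make the point concrete, here is a small sketch in Python (using numpy and scipy; the scenario and numbers are invented purely for illustration): a textbook two-sample t-test applied while ignoring its equal-variance condition declares "significant" differences out of pure noise far more often than the advertised 5%, while the variant whose conditions are actually met does not.

```python
import numpy as np
from scipy import stats

# Both groups have the SAME mean, so every "significant" result is a false positive.
# The classical Student t-test additionally assumes equal variances; here that
# assumption is deliberately violated (small group, large spread).
rng = np.random.default_rng(0)
n_sim, alpha = 5000, 0.05
false_pos_pooled = false_pos_welch = 0

for _ in range(n_sim):
    a = rng.normal(loc=0.0, scale=1.0, size=50)   # large sample, small variance
    b = rng.normal(loc=0.0, scale=3.0, size=5)    # small sample, large variance
    # Blindly applying the textbook test, ignoring its equal-variance condition:
    if stats.ttest_ind(a, b, equal_var=True).pvalue < alpha:
        false_pos_pooled += 1
    # The variant (Welch) that does not require equal variances:
    if stats.ttest_ind(a, b, equal_var=False).pvalue < alpha:
        false_pos_welch += 1

print(f"false positive rate, assumptions ignored  : {false_pos_pooled / n_sim:.3f}")
print(f"false positive rate, assumptions respected: {false_pos_welch / n_sim:.3f}")
# The first rate comes out far above the nominal 0.05, the second close to it.
```

An automatic app happily computes both numbers; only an understanding of the underlying theory tells you which one may be trusted.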

Once the group of people who understand every ingredient of their own field of specialization becomes too small, science will be at a crossroads and in danger of becoming pseudo-science.

Thinking about this worries me. Not that computers will replace us, because they cannot be creative at a deep level (for the moment not at any level), but when all science becomes ill-founded, or when a correct phrasing of problems and solutions becomes impossible because of sheer linguistic complexity, what can we do?

It is possible to start new areas, but these may not have any bearing on our knowledge of reality; moreover, in certain disciplines, say medicine, that may be completely useless. At this level a broadening at the base no longer leads to a deepening of the ocean of knowledge; when the limit of our capacity is reached, we hit rock bottom.

Or is this unavoidable saturation not so bad after all?

Will science then retreat from the deep areas it dealt with before? It may restart some new areas at a more elementary level, but in this broadening too there is a limit of capacity, and probably long before that limit is reached there is another limit of interest and usefulness. Perhaps other domains of creativity will flourish; we can already see new art forms rising from the use of 3D printers and holograms. The extended linguistic ability may also yield a strong development in literature and poetry; new domains like science-poetry may arise! So in the end we may, if we exist long enough, never know everything, but we can know exactly as much as we are able to, and live happily more creative lives.

So…meet me at the bottom!