Physics and Worldviews

Physics is the scientific discipline that studies the physical universe. Its subject matter is necessarily defined in such a broad way because it is the most fundamental of the natural sciences. Another way of stating what physics is about is that it is the study of matter and energy. These two concepts are basic to all of physics; essentially, neither reduces to anything else. There is, however, the notion that matter is ultimately a form of energy as well, a relationship Einstein captured in the equation E = mc². When one gets down to the level of the quantum world, which is the subject matter of quantum mechanics specifically, the foundations of matter and energy appear very strange and complex. It is said that no one understands quantum mechanics, not even those who devote a career and a lifetime to studying it. Of course, this is somewhat of an exaggerated claim.
There is a vast body of knowledge in quantum mechanics. Yet the point remains that the quantum world is full of seeming contradictions and paradoxes that do not follow classical logic. One famous aspect of the quantum is that light can be considered either a wave or a particle, depending on the context, the observer, the purpose of the experiment, and so on. And each view is equally valid according to these varying criteria. It is not usually the case, at least not in science, that something can be considered to be two totally different things at once. One can give many, indeed infinitely many, different descriptions of any given object or process, but the quantum goes beyond that, for we genuinely cannot say that only the wave view or only the particle view is correct. They should be opposing and exclusive views, yet both are correct.
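The two pictures are, in fact, quantitatively tied together. A standard way to express the link is the Planck–Einstein relation, in which the energy $E$ and momentum $p$ of a light particle (a photon) are fixed by the frequency $\nu$ and wavelength $\lambda$ of the corresponding wave:

$$E = h\nu, \qquad p = \frac{h}{\lambda},$$

where $h$ is Planck's constant. The particle-like quantities on the left are determined entirely by the wave-like quantities on the right, so neither description can be discarded.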
In science and many other areas of human experience, there are usually competing theories about things. This is as it should be; the world is a complex place filled with dizzying variety. Yet in science there exists a process called competitive hypothesis testing. Different hypotheses or theories should make different predictions about what will happen in a given circumstance. One then conducts an experiment, or some other type of research (a quasi-experiment, a naturalistic study, etc.), to gather empirical evidence about whether the predicted outcome actually occurs. If it does, one gains confidence in the theory. If not, one loses confidence.
No single experiment or study can prove a whole theory wrong outright. If something does not turn out as predicted, there may be multiple causes besides the theory being wrong. The theory could of course be wrong, in whole or in part, but the study itself may also have been flawed in any number of ways. Experimental and research design is (or should be) a dynamic, creative, and yet highly meticulous and cautious enterprise, constantly aimed at avoiding the host of potential biases, confounds, and other mistakes that could pollute a study and prevent it from properly manipulating (or observing) the true variables of interest. Furthermore, two competing theories often make the same predictions about something, in which case testing those predictions cannot differentiate between them. One must find a place where the two theories make differing predictions; this is where one can gain evidence that supports one theory over the other. Yet theory building is a constant and ongoing process. Even if one piece of evidence supports one theory while going against the predictions of another, the proponent of the latter can always modify the theory to explain what happened. This is, of course, a retrospective process, but the modification should be beneficial if the theory now properly covers empirical evidence that it previously failed to predict.
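The logic of gaining and losing confidence can be made concrete with a small numerical sketch. The following Python snippet applies Bayes' rule to two hypothetical competing theories; the likelihood numbers are invented purely for illustration:

```python
def update(prior_a, likelihood_a, likelihood_b):
    """Return the posterior probability of theory A after one observation.

    The likelihoods are the probabilities each theory assigned to the
    outcome that was actually observed (illustrative values only).
    """
    prior_b = 1.0 - prior_a
    evidence = prior_a * likelihood_a + prior_b * likelihood_b
    return prior_a * likelihood_a / evidence

# Start undecided between theory A and theory B.
p_a = 0.5

# Suppose the theories disagree: A says the predicted outcome occurs
# 90% of the time, B says only 30%. Observe that outcome three times.
for _ in range(3):
    p_a = update(p_a, likelihood_a=0.9, likelihood_b=0.3)
    print(f"confidence in A: {p_a:.3f}")  # rises toward 1.0
```

Note that if the two theories assigned the same likelihood to the outcome, the posterior would never move, which is precisely why one must test predictions on which the theories differ.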
People usually think that a good theory is one that can explain everything. But oftentimes this is not true. Actually, theories that can explain everything, or at least a very wide variety of different data, are weak for that very reason. They may be so broad and vague that any finding can be interpreted as fitting the theory. This makes the theory unfalsifiable, an idea made famous by the philosopher of science Karl Popper. Popper proposed that falsifiability is the main criterion of a good scientific theory. There are, of course, many competing philosophies of science besides Popper's, but many have felt his idea to be an important one, and it has been immensely influential. A good theory should be one that could, in principle, be proven wrong. To allow this, the theory should make specific predictions about what should happen. Again, we see the importance of a theory's ability to make predictions: if it makes no testable predictions about what will or does happen, how could one ever prove it wrong? Stating this in terms of "proving a theory wrong," again, does not necessarily mean that a whole theory will be brought down by one study or experiment.
Usually theories are whole bodies of concepts and hypotheses that make predictions about different aspects of things. Yet if a theory is very specific and narrow, making a very clear prediction about what should happen in a very particular instance, then it can essentially be "proven wrong" for all practical purposes; any revision of it would basically amount to making a new theory. Yet in the case of quantum mechanics, where the wave and particle theories of light are distinct and different predictions arise from each, both sets of predictions turn out to be true. This creates a situation that is a scientific and theoretical anomaly.
Yet the weirdness of the quantum world fades as one rises from the level of the micro-world to the level of macro-phenomena. These are our everyday objects, from our bodies to animals and plants to all of the ordinary human artifacts that populate the world. The physics of this world is a Newtonian one. Everyone knows the story of Isaac Newton and the falling apple. It symbolizes Newton's discovery of a whole set of physical laws that explained everything from an apple falling to why the Moon orbits the Earth in the way that it does. Newton's laws of motion were basic and fundamental, and they synthesized and explained a wide range of phenomena. Newton is considered one of the greatest physicists, if not the greatest, who has ever lived. His findings have led to a host of scientific understanding and technologies that have transformed our modern world. Many others built upon Newtonian physics, such as Maxwell, who worked out the principles of electromagnetism in the 19th century. Their work pushed forward the industrial revolution and led to the domestication of electricity, which has lit up our homes and the world and brought a bounty of technological innovation that no one could have dreamed of just one or two hundred years ago.
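The synthesis the apple story symbolizes can be written down compactly: the same inverse-square law of gravitation governs both the falling apple and the orbiting Moon,

$$F = G\,\frac{m_1 m_2}{r^2},$$

where $F$ is the attractive force between two masses $m_1$ and $m_2$ separated by a distance $r$, and $G$ is the universal gravitational constant. Near the Earth's surface the same law reduces to the familiar constant acceleration of falling objects.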
The principles of these early physicists are taught worldwide today, and every scientist, engineer, and student in general will be exposed to them. They are foundational to the understanding of our physical universe and everything in it. Yet the 20th century brought with it some surprising changes in the world of physics. Two streams of intellectual discovery led to new understandings that were just as important as those of the early pioneers. The first was in the world of the micro, as mentioned above. Scientists like Heisenberg and Planck worked out the understanding of the subatomic, quantum world that led to all of the weird realizations discussed earlier. The idea of the atom had been postulated much earlier, but the investigation of the real constituents and dynamics of atomic physics and quantum mechanics brought both great progress in understanding and great confusion, both intellectual and sociopolitical. The latter confusion derived from the fact that much of this new scientific knowledge formed the basis of atomic and nuclear weaponry, which mixed with the chaotic world of 20th-century global politics to form a volatile situation that led to much fear, death, and destruction.
This is a world that, although more peaceful and less precarious since the end of the Cold War, we still live with today in the form of worries about nuclear weapons in the hands of unstable or hostile political regimes.

The second stream was the one made famous by Einstein. His theories of special and general relativity were a re-evaluation of the macro level that was almost as weird as the new findings on the micro-scale. Indeed, when confronted with the randomness of quantum mechanics and the strange new view of the universe that came with it, Einstein made his famous statement that "God does not play dice." Einstein proposed that space and time, previously two separate variables in classical Newtonian physics, are actually aspects of a single object or process, known as the space-time continuum. Einstein's realizations were attributed to his sheer genius, and because of them he became not only one of the most famous scientists, but one of the most famous people, in history.
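This unification can be stated precisely. In special relativity, observers moving relative to one another disagree about spatial distances and time intervals taken separately, but all agree on a single combined quantity, the spacetime interval (written here in one common sign convention):

$$s^2 = -c^2\,\Delta t^2 + \Delta x^2 + \Delta y^2 + \Delta z^2,$$

where $c$ is the speed of light. It is this invariant mixture of $\Delta t$ with the spatial separations that justifies speaking of a single space-time continuum rather than of space and time independently.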
Interestingly, research conducted on Einstein's brain after his death has found some areas that are actually smaller than usual. This is a rather counterintuitive finding, considering one might guess that bigger is better, and usually it is. But in this case, some smaller areas of the brain seemed to open up possibilities for Einstein that are not easily accessible to the rest of us, though this remains quite speculative, and a whole other topic at that. Nevertheless, Einstein's ideas brought with them many implications for physics, and especially for cosmology and astronomy, that is, the large-scale study of the universe, or cosmos. This latter branch of physics is one that we have not yet discussed, but it has arguably been wondered about by ordinary people throughout history more than any other area. Nearly every person who has ever lived has gazed at the night sky and experienced some sense of awe and wonder, however small.
All of the great civilizations of human history have had some system of knowledge about the astronomical world. Especially in a pre-industrial, pre-electronic world, the night sky was a source of knowledge that was ever-changing yet consistent, and it therefore lent itself to much data collecting and theorizing. As far as the Western world is concerned, the reigning paradigm of astronomical knowledge for most of the ancient and medieval periods was based on the work of the Greek astronomer Ptolemy. The Ptolemaic model posited that the Earth was at the center of the universe, with the Sun, Moon, and planets of the solar system orbiting around it. It was therefore known as a geocentric model, that is, one with the Earth at the center. The Polish astronomer and Catholic cleric Nicolaus Copernicus came along in the early modern period and, drawing from others' work, reached the shocking, and blasphemous, conclusion that the Earth was not at the center of the universe. Copernicus was careful about stating his views publicly, delaying full publication until the very end of his life, but the word was out, and further developments proved him right. The Ptolemaic, geocentric model was official dogma of the Catholic Church, and much was made of this new theory. But in the end the evidence held sway, and the heliocentric model, with the Sun at the center, came to be accepted.
Famous in the early modern era of science and physics is, of course, Galileo, who was placed under house arrest by the Catholic Church for his discoveries and claims concerning the new view of the universe that was arising. The Church officially apologized a few hundred years later and admitted that it was wrong. Too little, too late, perhaps. So, in the end, the Ptolemaic, geocentric model was replaced with the Copernican, heliocentric model. Later, advances in telescope technology allowed astronomers to view parts of the universe never before imagined. With developments in physics coming from figures like Newton and Einstein, and those working in their wake, cosmologists and astronomers used this knowledge to advance their understanding of the structure and history of the universe.
In the early 20th century, the Belgian Catholic priest and physicist Georges Lemaître made the claim that the universe began with a so-called Big Bang, a view that has come to be accepted and is now essentially a household term. It is fundamental to the scientific view of the universe. The Big Bang theory caused quite a stir. Einstein realized that his own work implied that the universe had a beginning, but he did not like the idea; he preferred the older view of an infinite universe with no beginning. Part of his discomfort, besides the theological implications, was that a universe with a beginning was likely to be a universe that will have an end.
Several theories today posit that the universe will end in some sort of catastrophic scenario. Either there will be a big crunch, in which everything pulls back together and the universe essentially implodes on itself, or there will be a big freeze, in which everything separates further and further apart and there is less and less thermodynamic activity, hence ever less movement and ever less heat. In this latter scenario everything approaches absolute zero temperature and the universe, in effect, freezes. None of these scenarios sounds attractive, even though they are posited to happen so far in the future as to be of little practical worry at the moment. Others believe the universe will reach some sort of equilibrium, or at least fluctuate within more moderate bounds than these scenarios describe. In any case, the new view of the history of the universe is quite different from many of the older views. The Big Bang theory has actually excited many theists, striking them as direct scientific evidence that the universe was created by God at a quantifiable time in the past. Others disagree with this inference. The ancient debates rage on.