Tackling the Cosmological Constant Problem with a New Approach
The cosmological constant remains a thorn in the side of physicists. With its predicted value described as the worst theoretical prediction in the history of physics, new approaches are clearly needed to settle this cosmic headache.
Albert Einstein wasn’t known for his mistakes, but it was for good reason that history’s most famous and familiar scientist referred to the cosmological constant as his ‘greatest blunder.’ And it represents a problem that is yet to be solved.
In the present day, the cosmological constant represents the simplest explanation for ‘dark energy’ — the 70% of the energy content of the Universe driving its accelerating expansion. Thus, it functions as a counterpoint to gravity, driving matter apart rather than drawing it together.
A dispute rages among scientists over the value of the cosmological constant — represented in equations by the Greek character Lambda — and researchers are approaching the equations of Einstein’s general relativity from various angles to crack this cosmic conundrum.
The cosmological constant problem also succinctly represents the divide in physics between the theoretical and the experimental, as theoretical predictions of its value are vastly greater than the values obtained through observation. This seems to be a result of the contribution from ‘empty space’: vacuum energy arising from particles and antiparticles popping in and out of existence in the ‘void’ of space.
Enter University of Geneva assistant professor Lucas Lombriser, who has a novel approach to this problem.
Lombriser suggests that the issue with the cosmological constant may lie in the fact that another constant in physics — Newton’s gravitational constant or ‘big G’ — isn’t a constant at all. It may actually be a variable, capable of taking different values, but only across different Universes. In our Universe, it always takes the value
6.673 × 10⁻¹¹ N m² kg⁻².
“Attempts to explain the cosmological constant have failed and there seems to be something fundamental that we are missing from our understanding of the cosmos,” says Lombriser, explaining the new model detailed in his paper published in the journal Physics Letters B.
“I became increasingly unsatisfied with some of the main candidates that have been proposed to solve the problem.”
The possibility exists that the unification of the standard model of particle physics with general relativity, yielding a quantum theory of gravity, could resolve the cosmological constant issue, but Lombriser believes that the key to prescribing a precise value to vacuum energy lies elsewhere.
“I feel we may have been concentrating too much on this picture,” he tells me. “A different approach is needed to solve this problem.”
This led the researcher to target ‘big G’ as a possible solution to the problem with the cosmological constant — an unusual choice, given its prolific use in physics. The new mechanism Lombriser puts forward makes the same predictions as Einstein’s general relativity and the standard theory of gravity, but does so whilst providing a prediction for the cosmological constant that matches observations.
The narrowing of this gap results from altering ‘big G’ to a variable, thus eliminating the contribution that vacuum energy makes to the cosmological constant — an approach that hasn’t been tried before in the long history of attempts to resolve the problem.
The cosmological constant — a history of controversy
The problem of the cosmological constant can be divided into two distinct eras, Lombriser tells me — old aspects and new.
Einstein initially introduced the constant into his field equations of general relativity to offset the fact that those equations predicted an expanding Universe. When he did so in 1917, the scientific consensus was that the Universe was static — a position that Edwin Hubble would make obsolete over the next decade or so.
When Hubble showed Einstein solid evidence from his observations that the Universe was in fact expanding, the physicist realised his mistake. His initial field equations had been correct in their prediction, and it was his ‘fudge factor’ — the cosmological constant — that was erroneous.
In 1931, he removed the cosmological constant from his equations, famously referring to it as his ‘greatest blunder’. But science was not through with Lambda.
The cosmological constant entered a new era in 1998, when it was discovered that the Universe was not just expanding, but doing so at an accelerating rate. This time, cosmologists called upon Lambda to account for that acceleration.
But that was not the end of the stress the cosmological constant would cause scientists. As mentioned above, the theoretical value predicted by quantum field theory is enormous in comparison to the experimental value obtained from observations of supernovae and the cosmic microwave background radiation — the leftover radiation from an event called ‘the last scattering’ in the early Universe.
“The discrepancy between theory and experiment amounts to a staggering factor of 10¹²¹ [a 1 followed by 121 zeroes],” Lombriser points out.
“It has been referred to as ‘the worst theoretical prediction in the history of physics’.”
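The scale of that mismatch can be illustrated with a back-of-the-envelope calculation. The sketch below is not taken from Lombriser’s paper: it compares a naive quantum-field-theory estimate of the vacuum energy density, assuming a Planck-scale cutoff, with the observed dark-energy density, assuming a Hubble constant and dark-energy fraction close to recent CMB measurements. Depending on the cutoff convention, the exponent lands in the low 120s, in the same ballpark as the 10¹²¹ quoted above.

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34  # reduced Planck constant, J s
G = 6.674e-11           # Newton's gravitational constant, N m^2 kg^-2
c = 2.99792458e8        # speed of light, m/s

# Naive QFT estimate: vacuum energy density with a Planck-scale cutoff
rho_planck = c**7 / (hbar * G**2)  # J/m^3, of order 1e113

# Observed dark-energy density: roughly 69% of the critical density
# (H0 ~ 67.4 km/s/Mpc and the 0.69 fraction are assumed values)
H0 = 67.4e3 / 3.0857e22                          # Hubble constant, 1/s
rho_crit = 3 * H0**2 * c**2 / (8 * math.pi * G)  # critical energy density, J/m^3
rho_lambda = 0.69 * rho_crit                     # of order 1e-10 J/m^3

# Ratio of theoretical to observed vacuum energy density
discrepancy = rho_planck / rho_lambda
print(f"theory/observation ~ 10^{math.log10(discrepancy):.0f}")
```

With these inputs, the ratio comes out near 10¹²³; lower cutoff conventions bring it closer to the 10¹²¹ figure. Either way, the gap dwarfs any other mismatch between prediction and measurement in physics.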
The physicist adds that such a large contribution from vacuum energy probably would not have allowed the Universe to form in the first place. This, of course, means something must be missing — a fact which excites and inspires researchers across many different fields of science, including Lombriser.
Could Newton’s Gravitational constant — ‘Big G’ — be the solution?
Lombriser’s novel approach to solving the issue of the cosmological constant involves setting his sights on another constant in physics, one which has a longer history and has caused less controversy — ‘Big G’.
His method is related to models put forward by other researchers that have focused on other elements of Einstein’s field equations, changing them from constants to variables, and to a ‘sequestering’ mechanism put forward by researchers Nemanja Kaloper and Antonio Padilla half a decade ago.
“My paper doesn’t really change the equations of general relativity,” Lombriser says, “but adds an additional equation on top of Einstein’s field equation.”
Lombriser explains that the beauty of the mechanism he advances is that, unlike the models that have preceded it, it forms an extension that yields a prediction for the cosmological constant close to observed and experimental values.
“The value of the cosmological constant is given by the formation of structure like galaxy structures in the matter distribution of the Cosmos,” Lombriser explains of his model. “This can be thought of as a backreaction effect of ‘matter clumps’ forming on smaller length scales upon the dynamics of the Universe at very large scales.”
This backreaction mechanism would prevent the vacuum from gravitating and thus eliminate its contribution to the theoretical value of the cosmological constant, Lombriser says.
“Vacuum energy isn’t directly contributing to the dynamics of gravity,” he adds. “Rather, the cosmological constant that influences dynamics is forced to correspond to an average of the matter content in our Universe.”
Part of the beauty of Lombriser’s take is its simplicity. “The ingredients are remarkably simple,” he says.
“There are no new ingredients except for a global variation of Newton’s gravitational constant in the standard equations of gravity.”
When evaluating the new equation, Lombriser says, he is led to a prediction for the value of the cosmological constant that turns out to be in good agreement with what we observe. Rendering the quantum contribution of vacuum energy gravitationally inert fixes the issues with the cosmological constant, but is this reason enough to tinker with the equations of general relativity?
If it works, why fix it? Because there’s little left to fix.
The decision to focus on Newton’s gravitational constant was anything but arbitrary for the researcher. Lombriser had first approached other avenues to assess the expansion of the Universe before tinkering with G. “I worked on modifications of gravity as the cause of the cosmic expansion,” he explains.
“I became slowly unsatisfied with this approach as I felt it didn’t address the core problem of cosmic acceleration.”
The measurement of the speed of gravitational waves also ruled out many of these alternative gravity models which had been proposed as the cause of accelerating cosmic expansion, Lombriser tells me.
Lombriser adds that despite finding Kaloper and Padilla’s approach appealing, it doesn’t explain why the experimental value of the cosmological constant is so small. In the original version of their sequestering model, he adds, the Universe should have ground to a halt billions of years ago for the value of the cosmological constant to come out right.
“What I needed to make a prediction compatible with observations is a model of different cosmic evolution,” he continues. “One that would end in the future, but also provide the correct value of the cosmological constant.”
Lombriser could have done this by introducing new fields permeating the Universe, or by modifying gravity. Justifying why he rejected these approaches, he says: “The results I got weren’t very promising.”
“Also, the future evolution of the Universe looked arbitrary, there was not a natural way that the feasible models would provide the required end of the cosmos.”
What Lombriser realised was that the way structure forms in the matter distribution of the Universe, in particular with regard to the formation of galactic clusters, would provide his model with the collapse event it needed to arrive at a natural prediction for the cosmological constant.
“While I managed to get this idea to work with the sequestering mechanism,” Lombriser explains, “it would still require the formation of structure to stop at a specific time in the future to obtain the observed values of the cosmological constant.”
Investigating just how the sequestering mechanism did its work led Lombriser to the conclusion presented in this paper — both a generalisation and a simplification of the elements that allow the functioning of the mechanism.
So far so good, but the new model isn’t without problems. Namely, can cosmologists verify it experimentally?
Future prospects: A new era of cosmology or another dead end?
I ask Lombriser what the prospect of verifying the variability of Newton’s gravitational constant looks like. “Unfortunately, for the type of variation I propose, this will be a difficult task,” he answers, “as G would not be variable in the observable Universe — it is constant throughout it.”
But that doesn’t mean the model he puts forward is completely untestable. “The correct question to ask in order to verify my model is to look for the effects these fundamental theories may have that can be tested,” he adds. “Falsification by experiment seems difficult with what can currently be said about this model.”
One of the benefits of the new model is that it reproduces the observational success of standard cosmology, but with no cosmological constant problem and one less free parameter — meaning, Lombriser suggests, that it is statistically favoured.
The model may also benefit from the fact that the course of science rarely runs smooth: other cosmologists may find appealing concepts in Lombriser’s work on which to build their own models.
“Researchers working on new fundamental theories may use this mechanism as a motivation to place a focus on those that can give rise to this mechanism,” he agrees.
“Ideally their theories would also make verifiable and falsifiable predictions.”
Lombriser points out that different theories that produce the mechanism he puts forward need to be explored. One by one these theories can be eliminated, and as these candidates fall away, a solution to the cosmological constant problem could present itself — or, Lombriser admits, his mechanism may be ruled out altogether.
Robert Brandenberger, a theoretical cosmologist and a professor of physics at McGill University in Montreal, Quebec, Canada, believes that Lombriser’s new model certainly has merit.
“I consider the paper to be an important and interesting new idea to solve a crucial problem in fundamental physics,” says Brandenberger — who, as the co-founder of the theory of string gas cosmology, is no stranger to offering alternative theories in cosmology.
“I’d like to see the work extended to include a study of how the new equations will affect the formation of structure in the Universe,” Brandenberger cautions. “One would need to study the effects on small inhomogeneities which develop into microwave anisotropies [in the CMB] and the large-scale structure in the distribution of galaxies.”
Only time will tell if Lombriser’s theory will bear fruit — if, indeed, he has hit on the correct mechanism that will finally untangle the problems with the cosmological constant. Even if this is the case, it may plant seeds for more questions. For instance: if other Universes have different values of ‘big G’, what would those Universes look like?
Or perhaps, should Lombriser’s work be wrong or incomplete, there is a young researcher somewhere, or at some distant time, who will discover his paper and plant a seed of her own.
Theoretical physics, it seems, is a tangled and dense forest indeed.
Special thanks to Lucas Lombriser, Robert Brandenberger and Luc Bourhis.
Original research: https://www.sciencedirect.com/science/article/pii/S0370269319305088