At the conference “Zooming in: The Cosmos at High Resolution”, Prof. Matthias Bartelmann from Heidelberg University gave a didactically beautiful presentation on the standard cosmological model for the general public.
Here is his argument why it is believed that this model is an excellent description of the universe:
In 1915 Einstein published his Theory of General Relativity. In 1922 Friedmann simplified the field equations by assuming the universe is homogeneous and isotropic. That is, he assumed the universe looks the same wherever you are (homogeneity) and in whichever direction you look (isotropy), as long as one does not look at the details. This is quite consistent with our observations today. The resulting Friedmann equations are the very basis of the modern standard cosmological model. Most solutions to the Friedmann equations predict an expanding or a contracting universe, depending on a set of initial numbers, such as the total amount of matter in the universe.
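For reference, the Friedmann equations can be sketched in standard textbook notation (a(t) is the scale factor of the universe, ρ the matter-energy density, p the pressure, k the curvature parameter, and Λ the cosmological constant discussed further below):

```latex
\left(\frac{\dot a}{a}\right)^2 = \frac{8\pi G}{3}\,\rho - \frac{k c^2}{a^2} + \frac{\Lambda c^2}{3},
\qquad
\frac{\ddot a}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right) + \frac{\Lambda c^2}{3}.
```

Whether the universe expands or contracts is encoded in the sign and evolution of ȧ, which depends on the densities entering the right-hand sides.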
In 1929 Edwin Hubble found that distant galaxies recede from us ever faster with increasing distance. This implied that only the expanding solutions of Friedmann’s equations were allowed as descriptions of our universe.
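Hubble’s linear distance–velocity relation, v = H0 · d, can be illustrated with a minimal sketch (the value H0 = 70 km/s/Mpc is an assumed round number for illustration, not an input from this article):

```python
# Hubble's law: recession velocity grows linearly with distance.
H0 = 70.0  # Hubble constant in km/s per Mpc (assumed round value)

def recession_velocity(distance_mpc):
    """Recession velocity in km/s for a galaxy at the given distance in Mpc."""
    return H0 * distance_mpc

# A galaxy ten times further away recedes ten times faster:
for d in (10, 100, 1000):
    print(f"d = {d:5d} Mpc  ->  v ~ {recession_velocity(d):8.0f} km/s")
```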
If the universe is expanding, then it must have had a very dense, hot past. This is similar to a gas that does not exchange heat with its surroundings: when this gas is compressed it heats up, and it cools down again when it is allowed to expand. This then led to the realisation that under the very hot and dense early conditions Helium and Lithium must have been produced from the abundant Hydrogen nuclei by very much the same processes that take place in stars, where such conditions are still found today. In fact, about 25 per cent of the matter ought to end up being Helium. This Helium cannot come from stars, as most stars keep their synthesised Helium. The expectation from Big Bang nucleosynthesis that there ought to be 25 per cent Helium has been confirmed (although not perfectly). The hot dense phase also implies that there was a special time, about 400 000 years after the Big Bang, when light stopped interacting with the matter. This happened when the matter cooled down sufficiently to allow the protons to capture electrons, such that neutral atoms formed through which the photons could easily pass. This event should then lead to a background radiation, which we should see today in the form of these same photons at a very low energy because of the expansion of the universe (the light waves get “stretched” with the expansion). And indeed, this background radiation has been found and is called the cosmic microwave background radiation (CMBR).
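The often-quoted 25 per cent can be recovered from a textbook back-of-the-envelope argument (the neutron-to-proton ratio of roughly 1/7 at the time of nucleosynthesis is an assumed standard value, not a result derived here):

```python
# Back-of-the-envelope Big Bang nucleosynthesis estimate.
# At the time of nucleosynthesis the neutron-to-proton ratio is
# roughly n/p ~ 1/7 (assumed textbook value, after some neutron decay).
# Nearly all neutrons end up bound in He-4, which has 2 protons
# and 2 neutrons, so the He-4 mass fraction is Y_p = 2(n/p)/(1 + n/p).

def helium_mass_fraction(n_over_p):
    """Primordial He-4 mass fraction for a given neutron-to-proton ratio."""
    return 2 * n_over_p / (1 + n_over_p)

Y_p = helium_mass_fraction(1 / 7)
print(f"Y_p ~ {Y_p:.2f}")  # about 0.25, i.e. ~25 per cent helium by mass
```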
It also became clear from the observation of the CMBR that at about 400 000 years after the Big Bang the universe was already extremely homogeneous. One place looked just like the other. But light could not have travelled more than a distance of 400 000 light years, which was much smaller than the extent of the universe at that time. Therefore, one part of the universe could not “know” what temperature another part had. But, given they are so similar, they must have “known” of each other. It is in fact impossible to explain why all parts of the universe have very similar properties if there was no interaction between them. And so Inflation was invented as a hypothetical “trick”, according to which the universe was very much smaller just after the Big Bang, such that each part was able to exchange information with every other part, but then explosively increased in size before the time when the photons decoupled from the matter, i.e. when the CMBR became observable.
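A rough sketch of the numbers behind this horizon problem, using assumed round textbook values for the comoving distances involved (both numbers are illustrative assumptions, not measurements quoted in this article):

```python
from math import degrees

# Horizon-problem estimate with assumed round comoving distances:
# the size of a causally connected patch at recombination versus
# the distance to the surface where the CMBR was emitted.
horizon_at_recombination = 280.0  # Mpc, comoving (assumed textbook value)
distance_to_cmb = 14000.0         # Mpc, comoving (assumed textbook value)

# Small-angle approximation: angle = size / distance (in radians).
theta = degrees(horizon_at_recombination / distance_to_cmb)
print(f"a causal patch subtends ~ {theta:.1f} degrees on the sky")
# Patches of the CMBR separated by more than a degree or two should
# never have been in causal contact -- yet they share the same
# temperature to high precision.
```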
Later measurements of this CMBR showed that it is indeed amazingly equal in all directions, but that it shows minuscule fluctuations. These fluctuations can be interpreted as the early seeds from which galaxy clusters and galaxies formed at later times. With this interpretation, these fluctuations in the CMBR can only be mapped to the present-day distribution of matter if there is Dark Matter. Without the existence of a new form of matter, which only interacts gravitationally, the normal matter would not be able to fall together to form the galaxies we observe today, given the minuscule fluctuations in the observed CMBR. The Dark Matter increases the gravitational force within the over-dense regions and thereby speeds up structure growth just right. If there were no Dark Matter, then the fluctuations in the CMBR would have to have been many orders of magnitude larger than is actually observed for them to be able to grow into the galaxy clusters and galaxies of the present-day universe.
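A one-line version of this growth argument, under the standard simplifying assumption that density fluctuations δ grow roughly in proportion to the scale factor during the matter-dominated era:

```latex
\delta_{\mathrm{today}} \;\sim\; \delta_{\mathrm{rec}} \times (1 + z_{\mathrm{rec}})
\;\approx\; 10^{-5} \times 1100 \;\approx\; 10^{-2} \;\ll\; 1 ,
```

so fluctuations in the normal matter alone, starting from the observed 10⁻⁵ level at recombination (redshift z ≈ 1100), would still be tiny today, far from the δ ≫ 1 needed for galaxies. Dark Matter fluctuations, which can begin growing before recombination, close this gap.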
From Einstein’s field equations it can be derived that light travelling past galaxy clusters is bent. The observed gravitational lensing effect indeed confirms that Dark Matter is needed in addition to the normal matter, in order to explain the observed bending of light using Einstein’s field equations. Also, the amount of Dark Matter needed from lensing is similar to that needed to make the structures grow just enough from the CMBR to account for the observed structures in the form of galaxy clusters and galaxies.
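The basic formula behind this test is the general-relativistic deflection angle of light passing a mass M at impact parameter b (the point-mass approximation; real cluster lenses require detailed mass models):

```latex
\hat{\alpha} = \frac{4 G M}{c^{2} b} ,
```

twice the value a naive Newtonian calculation would give. Measuring the deflection thus weighs the lens, Dark Matter included.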
The observations of distant supernovae of “type Ia” (most probably exploding white dwarfs), however, unexpectedly showed in 1998–1999 that in fact the universe was larger, at a given time, than thought. That is, the supernovae are further away. This is only reconcilable with the expanding Friedmann model if the universe is forced to expand ever faster, i.e., if it is now just at the start of a new inflationary period. This new inflation is taken to be due to Dark Energy, which is a form of energy of the vacuum such that space is forced to expand ever faster. This assumption is mathematically the same as introducing the cosmological constant Lambda (L).
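In the Friedmann picture this corresponds to a Λ-dominated late phase: once the constant term dominates the acceleration equation, the scale factor grows exponentially (a schematic limit, ignoring the matter terms):

```latex
\frac{\ddot a}{a} \;\longrightarrow\; \frac{\Lambda c^{2}}{3} > 0
\quad\Longrightarrow\quad
a(t) \;\propto\; \exp\!\left(\sqrt{\tfrac{\Lambda c^{2}}{3}}\, t\right) ,
```

which is the mathematical sense in which the universe is “just at the start of a new inflationary period”.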
The above is thus a brief description of the contemporary standard cosmological model, also called the concordance cosmological model, or the LambdaCDM (LCDM) model. CDM stands here for “Cold Dark Matter”, which means that the dark matter postulated to exist (see the arguments above) is composed of elementary particles with a large mass, such that they have small velocities (they are “cold”) and can aggregate into galaxy-sized dark matter “halos” or clumps. The LCDM model can thus be seen to be successful in accounting for many observed phenomena, even if not perfectly. Still, cosmologists state that we live in the era of high-precision cosmology because the many parameters that define the LCDM model can be measured precisely with modern methods.
But, it has at least three important problems:
I) It comes at a very heavy price! By insisting that Einstein’s field equations are valid and making some additional assumptions, cosmologists have to postulate the existence of (1) Inflation, (2) Dark Matter and (3) Dark Energy. None of these are understood. Together, Dark Matter and Dark Energy amount to about 95 per cent of the energy content of the universe. That is, the mathematical description of the universe, in terms of Einstein’s equations, implies that this mathematical model is based almost exclusively on stuff we have absolutely no clue about. Imagine an astrophysicist trying to publish a research paper on a star whose properties depend to 95 per cent on pieces of physics which are completely unknown. Furthermore, Dark Energy is vacuum energy, but with a special type of equation of state. This equation of state “tells” the vacuum how to react when it is put under pressure, for example. In the LCDM model it means that the energy density is always the same, in each volume element of space. Since the universe is expanding, this implies that ever more energy is created with the creation of space, essentially out of nothing. But perhaps this is due to energy slipping into our universe from a higher-dimensional space-time? Perhaps. But then energy conservation, which has been a fundamental principle in physics so far, becomes an issue in general.
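The point about energy creation can be made concrete with a toy sketch (arbitrary units; the constant vacuum energy density is the defining assumption of a cosmological-constant-like Dark Energy):

```python
# Toy illustration: if the vacuum energy density rho_vac is constant,
# the vacuum energy inside a comoving region grows as the cube of the
# scale factor a, i.e. new energy appears as new space is created.
rho_vac = 1.0  # energy per unit volume, constant by assumption (Lambda-like)

def vacuum_energy(a, comoving_volume=1.0):
    """Total vacuum energy in a comoving volume at scale factor a."""
    physical_volume = comoving_volume * a**3  # volume grows with expansion
    return rho_vac * physical_volume

# Doubling the scale factor multiplies the vacuum energy by eight:
for a in (1.0, 2.0, 4.0):
    print(f"a = {a}: E_vac = {vacuum_energy(a)}")
```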
Also, astrophysicists actually do not understand how supernovae of type Ia (SNIa) work. There are variations among SNIa which are corrected for by using an observed correlation between the total energy and the time dependence of the luminosity of SNIa, but this correlation is not understood theoretically. This makes the usefulness of SNIa as standard candles for distance measurements questionable. This was also highlighted by Prof. Wolfgang Hillebrandt in his plenary talk at the same conference, where he explained that many SNIa seem to happen before the Chandrasekhar mass is reached, that there are several different ways to make SNIa, and that this results in a possibly huge scatter in the luminosities. Reducing this scatter artificially by using the supernovae as standard candles will thus have unknown consequences for cosmology.
II) There are unsolved problems at the fundamental level. An excerpt from the abstract of a paper by Prof. Tom Shanks published in 2005:
We note that the standard LCDM cosmological model continues to face fundamental problems. First, the model continues to depend wholly on two pieces of undiscovered physics, namely dark energy and cold dark matter. Then, the implied dark energy density is so small that it is unstable to quantum correction and its size is fine-tuned to the almost impossible level of one part in ~10^102; it is also difficult to explain the coincidence between the dark energy, dark matter and baryon densities at the present day. Moreover, any model with a positive cosmological constant also creates fundamental difficulties for superstring theories of quantum gravity. …
III) The model makes sufficiently good predictions to be testable as to how galaxies ought to look, how the dark matter ought to be distributed in galaxies, around galaxies, and also in groups and clusters of galaxies. Unfortunately, this is where the model breaks down completely. Galaxies are simply not describable as objects that are dominated by dark matter. One could postulate the existence of a Dark Force which makes sure, by some principle we do not understand at all, that the Dark Matter always adjusts itself to do what the normal matter wants it to do. In this case we would have Einstein’s field equations with
- Dark Matter,
- Dark Energy and the
- Dark Force.
Summarising the above problems:
Assuming Einstein’s field equations to be valid with some additional assumptions (which we will deal with in the future) leads to the following situation concerning the mathematical modelling of our Universe, given the observational data:
- The early expansion history seems to be wrong, unless inflation (which is not understood) is added.
- Given this, structure formation is not understood (it happens too slowly) requiring the addition of Dark Matter (which is not understood).
- With these additions, it turns out that the universe is expanding at the wrong rate today, unless Dark Energy (which is not understood) is added.
- Given all of the above, it turns out that on scales smaller than a few Mpc the observed structures do not correspond to the mathematical expectation, requiring a Dark Force to couple the baryons to the Dark Matter in some unknown way.
At this point it might be useful to reconsider gravitational physics to see if a simpler solution might not be available after all.
One aspect of this failure of the standard model is embodied in the Fritz Zwicky Paradox (see FZ Paradox I and FZ Paradox II). The usual argument invoked to avoid this failure is to say that the physics of normal matter is too badly understood to be able to test the model on the scales of galaxy groups and below. It is usually stated that the processes governing star formation and gas physics are far too complex. However, these are false statements, because the physical processes of normal matter are well enough understood to make this test of the standard cosmological model stringent on galaxy scales. Sometimes one also hears the statement that the standard “model” is only a fit to the data and that it cannot be used to make predictions. But this would then mean that the standard model of cosmology would not be physics.
The standard model of cosmology thus fails because galaxies do not work (this is the title of a future contribution in this series), and, ignoring this, it is nevertheless unsatisfying because of problems I and II above.
by Anton Ippendorf, Pavel Kroupa and Marcel Pawlowski (19.09.2010): “The standard model of cosmology” in “The Dark Matter Crisis – the rise and fall of a cosmological hypothesis” on SciLogs. We thank Joerg Dabringhausen for useful comments on the first version of this text. See the overview of topics in The Dark Matter Crisis.