Horizon problem - why do we need inflation?

In summary, the horizon problem is a puzzle in cosmology: regions of space that are separated by vast distances, receding from one another, and that have never been in causal contact should not be expected to share nearly identical properties. It was initially thought that some departures from homogeneity would show up at very large scales, but observations of the Cosmic Microwave Background (CMB) reveal a temperature that is uniform to roughly one part in 100,000 across the whole sky. Combining quantum mechanics (QM) with general relativity (GR) suggests that small deviations should have been present around the Planck time or earlier, and gravity would since have amplified them greatly, so the observed smoothness of the CMB is difficult to explain in the standard big bang picture. Inflation theory, which proposes a brief epoch of accelerated expansion in the very early universe, resolves the problem by allowing the entire observable universe to grow out of a single, causally connected patch.
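As a rough back-of-the-envelope illustration, using standard round numbers rather than figures from this thread: in a universe containing only matter and radiation, the comoving particle horizon at recombination is about 280 Mpc, while the comoving distance to the last-scattering surface is about 14,000 Mpc, so causal contact at recombination extends over an angle of only

$$\theta_{\rm hor} \;\sim\; \frac{280\ \text{Mpc}}{14{,}000\ \text{Mpc}} \;\approx\; 0.02\ \text{rad} \;\approx\; 1^\circ .$$

The CMB sky therefore contains of order $4\pi/\theta_{\rm hor}^{2} \sim 10^{4}$ patches that were never in causal contact, yet they all share the same temperature to about one part in $10^{5}$.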
  • #36
Dmitry67 said:
For an observer moving through that 'randomly' distributed matter, the Universe looks contracted in the direction she moves, due to Lorentz contraction.
Oops, sorry, that was a miswording on my part. I meant to ask for which observer random initial conditions would be isotropic, not anisotropic.


Dmitry67 said:
Occam.
Why are you introducing a preferred frame from the very beginning?
Why is that particular frame chosen?
Well, as I've said, I see no reason to introduce initial conditions at all. I suspect that for big bang events like our own, the preferred frame just happens to be set by the particular event that precipitates the new expanding region, a frame that is different for each such region.

Dmitry67 said:
Expansion itself does not require perturbations.
It does require non-zero vacuum energy, though, which you seem to have set to zero. Otherwise it requires some sort of matter, which means perturbations.
 
  • #37
Chronos said:
What seems highly improbable is that they could be so remarkably similar in all directions.

This is what I am trying to understand. Why?

There are two options: the initial conditions are simple and there are no parameters;
or the initial conditions are complicated and there are many parameters.

Why, between these options, do you choose the state with more information and complexity? To me it is the less likely state.
 
  • #38
Chalnoth said:
Oops, sorry, that was a miswording on my part. I meant to ask for which observer random initial conditions would be isotropic, not anisotropic.

Isotropic, of course, for an observer looking at the Universe from the frame you used to define these initial conditions.

Chalnoth said:
It does require non-zero vacuum energy, though, which you seem to have set to zero. Otherwise it requires some sort of matter, which means perturbations.

At least vacuum energy does not manifest itself directly (non-gravitationally), because its energy is cancelled by the vacuum tension, and it is Lorentz-invariant.

So I did not set it to zero. My requirement is just Lorentz invariance and minimum information (as few free parameters as possible).
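In standard notation (taking metric signature $(-,+,+,+)$), this Lorentz invariance just means the vacuum stress-energy tensor is proportional to the metric,

$$T^{\rm vac}_{\mu\nu} \;=\; -\,\rho_{\rm vac}\, g_{\mu\nu}, \qquad\text{equivalently}\qquad p_{\rm vac} \;=\; -\,\rho_{\rm vac},$$

i.e. the vacuum carries a tension (negative pressure) equal in magnitude to its energy density, which is the cancellation referred to above.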
 
  • #39
Dmitry67 said:
At least vacuum energy does not manifest itself directly (non-gravitationally), because its energy is cancelled by the vacuum tension, and it is Lorentz-invariant.

So I did not set it to zero. My requirement is just Lorentz invariance and minimum information (as few free parameters as possible).
In which case you don't have an expanding universe either, because the expansion breaks the Lorentz invariance.
 
  • #40
Look, there is no way around it. If you want just the standard big bang to explain the horizon problem, you are forced to pick, by hand, extremely special, highly fine-tuned values. That requires knowledge of every single perturbation that may enter the picture, at almost every scale and at every point in space.

It is a fact that any random set of initial conditions will assuredly not produce this result; those that do are of measure zero. Further, the observations cannot be explained by any simple choice of initial conditions, like, say, perfectly homogeneous density profiles.

The analogy would be the following. Imagine that 10 years ago, when extrasolar planets started to be discovered, the observation was that everywhere we found them, they looked identical to the Earth. Same mass, same temperature, same chemical composition. Obviously, there is no reason that should be the case, so people would look for a hidden mechanism, perhaps something they had overlooked about planet formation, and not simply fall back upon initial conditions.
 
  • #41
Haelfix said:
The analogy would be the following. Imagine that 10 years ago, when extrasolar planets started to be discovered, the observation was that everywhere we found them, they looked identical to the Earth. Same mass, same temperature, same chemical composition. Obviously, there is no reason that should be the case, so people would look for a hidden mechanism, perhaps something they had overlooked about planet formation, and not simply fall back upon initial conditions.

I see what your problem is.

I agree, it would be very strange to find out that all these planets are identical! But that is because these planets had a HISTORY, an amount of PROPER TIME, and knowing that QM looks random (from the observer's perspective), it is very unlikely that different systems (even starting from the same conditions) would end up in the same final state.

Note that this common-sense logic is not applicable to the initial conditions, because these systems, these different regions, did not experience any proper time before being compared! Note the part I highlighted in your quote: there WAS NO REASON, because there was no time before it!

Their state is given axiomatically, determined by some function, rather than evolved. So it is logical to expect something simple (density = 0), and it is unlikely to see something complicated, like BM particles positioned in the shape of unicorns.
 
  • #42
One of the worst enemies of clear thinking is the illusion that we know more than we actually know. Without an actual model to compare against, we can't know whether one part in 100,000 is "high" or "low" in our single-example universe. It accomplishes nothing to say that you have a feeling that the observed number is too low.

Looking back in time, the Universe was smoother and smoother. Perhaps the information density (in some non-rigorous sense) approached zero at t = 0. We don't yet know. If this were the case, then GR might predict perfect uniformity now (which is of course wrong).

One analysis that could predict a quotable number for fluctuations (at t > 0) is Gaussian statistics. A sample with 10^120 photons (such as the observable Universe) might have an initial fractional deviation of one part in 10^60. I think this is sufficient (if this deviation were present at, say, the Planck time) to cause either collapse or runaway expansion by now. Smaller samples representing 1/10 or 1/100 of the Universe would have larger fractional deviations.

This 10^60 analysis is probably wrong, because the Gaussian count fluctuation applies to a system whose countable elements are in contact, which they probably were not in this case. However, if it were a valid analysis, it would yield a number that suggests inflation is necessary.
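For reference, the arithmetic behind that estimate, as a minimal sketch assuming nothing more than simple $\sqrt{N}$ (Poisson/Gaussian) counting statistics:

$$\frac{\delta N}{N} \;\sim\; \frac{\sqrt{N}}{N} \;=\; \frac{1}{\sqrt{N}} \;=\; \frac{1}{\sqrt{10^{120}}} \;=\; 10^{-60},$$

while a sub-volume holding only $1/100$ of the photons ($N' = 10^{118}$) would fluctuate at $1/\sqrt{N'} = 10^{-59}$, ten times more, which is why smaller samples show larger fractional deviations.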
 
