How Can We Quantify Homogeneity in Physical Systems?

AI Thread Summary
The discussion centers on the challenge of quantifying homogeneity in physical systems, particularly in relation to particle distributions and the observable universe. Participants highlight the lack of a clear physical measure for homogeneity, suggesting that correlation functions could be useful in assessing how quantities like matter density vary across distances. The Cosmological Principle is referenced, with John Barrow's work emphasizing the small variations in Cosmic Microwave Background (CMB) temperature as evidence for uniformity in the early universe. The importance of grid size in statistical analysis is also noted, as it significantly impacts the perceived homogeneity of a system. Overall, the conversation seeks to identify existing studies or methods that effectively measure homogeneity in various physical contexts.
skippy1729
We all know what it means to be homogeneous in a "hand waving" sort of way. And, of course, there are abstract mathematical definitions for a homogeneous space. I have been unable to find a physical measure of homogeneity which could be applied to an ensemble of particles, a box of rocks, or the observable universe. The measure should depend on the distribution of particles and the scale at which we do the averaging. I find it hard to believe that this has not been studied by someone. Any leads?

“When you can measure what you are speaking about, and express it in numbers, you know something about it; when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the stage of science.” William Thomson, Lord Kelvin

PS to Moderators: I am posting this in the Cosmology forum. Please feel free to move it to a better spot if you feel it is appropriate.
 
I'm not an expert, but I imagine the simplest thing you could do is look at the correlation function. That is, how a quantity measured at a point x1 depends on the value of the same quantity at a point x2. If you have homogeneity, then I think the correlation shouldn't depend on x1 itself, but only on the distance between x1 and x2. The quantity here could be matter density (averaged over some appropriate scale), or CMB temperature. Someone please correct me if I'm wrong.

As I'm not an expert here, I'm not sure what studies have been done to test this.
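To make the correlation-function idea concrete, here's a toy Python sketch (everything here is invented for illustration, not from any real survey): a 1-D "density" field of uncorrelated Poisson counts, for which the correlation of fluctuations at separated points should be near zero and independent of where along the line you look.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D density field: Poisson counts in 200 bins along a line.
n_bins = 200
density = rng.poisson(lam=50, size=n_bins).astype(float)
delta = density / density.mean() - 1.0  # fractional overdensity

def correlation_at_lag(field, lag):
    """Average product of fluctuations separated by `lag` bins.

    For a statistically homogeneous field this depends only on the
    separation, not on position along the line.
    """
    if lag == 0:
        return float(np.mean(field * field))  # this is just the variance
    return float(np.mean(field[:-lag] * field[lag:]))

# For uncorrelated noise the correlation at any lag > 0 should be close
# to zero, while the lag-0 value (the variance) is positive.
xi_0 = correlation_at_lag(delta, 0)
xi_1 = correlation_at_lag(delta, 1)
xi_10 = correlation_at_lag(delta, 10)
```

A clustered (but still homogeneous) field would instead show a nonzero correlation that decays with separation; inhomogeneity would show up as a correlation that depends on x1 itself, not just on the lag.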
 
I'm not an expert either, but as a retired math guy with an interest in it, I follow the cosmology literature. This question comes up from time to time. For example, in 1988 John Barrow had this to say:
http://adsabs.harvard.edu/full/1989QJRAS..30..163B
It is four pages plus references. He talked it over with Martin Rees (the British Astronomer Royal) and came up with some measures of uniformity. This is handwavy too. We don't know in some absolute way why we tend to assume *The Cosmological Principle*; there is no surefire, watertight certainty about it. It's simple, and SEEMS right.

Anyway, John Barrow is a fairly prominent astronomer/cosmologist, and he wrote this article
*What is the principal evidence for the Cosmological Principle?*

And after some discussion he concluded that the most persuasive evidence supporting this convenient assumption of uniformity was THE SMALLNESS OF THE CMB TEMPERATURE VARIATION.

Very simply, you take the average and you discover that the variation from average is only ONE THOUSANDTH OF A PERCENT.
We have accurate measurements and know it is within ##10^{-5}##.

Barrow, back in 1988, had less accurate measurements, and all he knew was that it was within ##10^{-3}##, that is, a TENTH of a percent. But, for him, that convincingly bespoke a uniform early universe. (That's really what the Cosmological Principle is about: the later local cobwebby clotting and coagulating is just random condensation. If it began overall uniform, then it's basically still uniform at large scale.)

So the quantity he picked was simply this: ##\Delta T/T##, the maximum fractional variation from the average temperature T.

When you think of it, it is beautifully uniform: nowhere more than a thousandth of a percent warmer or colder than the average. But there were tiny variations in density and temperature, corresponding to sound waves in that nearly uniform hot gas, and they show up in the temperature map. Those tiny variations are believed to have been the seeds of later structure: the gas in overdense regions would tend to collapse, and underdense regions would tend to get cleared out, as stuff began to fall together.
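Barrow's measure is just arithmetic, so here's a tiny Python sketch of it. The mean temperature below is the approximate present-day CMB value, but the individual fluctuation values are made up purely for illustration:

```python
import numpy as np

# Toy version of Barrow's uniformity measure: the maximum fractional
# deviation of the CMB temperature from its mean, ΔT/T.
T_mean = 2.725  # K, approximate mean CMB temperature
# Invented ΔT values in kelvin, of roughly the observed 1e-5 * T order:
fluctuations = np.array([3e-5, -5e-5, 8e-5, -4e-5])

# Dimensionless; the smaller it is, the more uniform the field.
delta_T_over_T = np.max(np.abs(fluctuations)) / T_mean
```

With these invented numbers the result is of order ##10^{-5}##, matching the "thousandth of a percent" uniformity discussed above.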

The article is free online to read if anybody wants. There probably are less handwavy ones now. From time to time controversy breaks out: somebody thinks he's found a significant deviation from uniformity. And then after a while that doesn't get confirmed and goes away and the controversy quiets down. Overall uniformity has proven a fairly durable assumption.
 
skippy1729 said:
We all know what it means to be homogeneous in a "hand waving" sort of way. And, of course, there are abstract mathematical definitions for a homogeneous space. I have been unable to find a physical measure of homogeneity which could be applied to an ensemble of particles, a box of rocks, or the observable universe. The measure should depend on the distribution of particles and the scale at which we do the averaging. I find it hard to believe that this has not been studied by someone. Any leads?
Typically what is done is something along the lines of drawing a grid on the thing in question.

For example, if we imagine, for the sake of argument, that the box of rocks is one meter on a side, then we might imagine our grid to be made up of cubic boxes 10 cm on a side. There are 1000 such 10 cm cubic grid sections inside the 1 m box, which is quite enough for us to compare them to one another statistically. We could, for example, estimate the standard deviation, across grid sections, of the density. Or we might take the standard deviation of the average rock size in each grid section. The standard deviation is basically a statement of how much the typical grid section deviates from the average, so it directly gives us a measure of how homogeneous the box is.

Note that in this example, the size of the grid that you pick matters quite a lot. If you pick a grid scale that is much smaller than the typical rock within the box, then you'll have some grid sections with nothing but air, and other grid sections with nothing but rock, so there will be huge variation from section to section. The universe is much the same way: if you pick a grid size close to the size of a typical galaxy, you'll find a huge amount of variation from place to place. But if you pick a grid size of 80 Mpc on a side, the variation almost entirely disappears.
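Here's a minimal Python sketch of this grid-based measure (the point counts and grid sizes are illustrative, not physical): scatter points uniformly in a unit cube, grid it up, and compare the relative scatter of the counts for a coarse grid versus a fine one.

```python
import numpy as np

rng = np.random.default_rng(1)

# 100,000 "rocks" scattered uniformly in a unit cube.
N = 100_000
points = rng.random((N, 3))

def relative_scatter(points, n_per_side):
    """Std/mean of point counts over a grid of n_per_side^3 cubic cells.

    This is the grid-section standard deviation described above,
    normalized by the mean so grids of different sizes are comparable.
    """
    idx = np.floor(points * n_per_side).astype(int)  # cell index per axis
    flat = (idx[:, 0] * n_per_side + idx[:, 1]) * n_per_side + idx[:, 2]
    counts = np.bincount(flat, minlength=n_per_side**3)
    return counts.std() / counts.mean()

coarse = relative_scatter(points, 2)   # 8 large cells
fine = relative_scatter(points, 20)    # 8000 small cells
# For a uniform distribution the scatter grows as the cells shrink
# (Poisson counting: std/mean ~ 1/sqrt(mean count per cell)), so the
# same point set looks less homogeneous on a finer grid.
```

This is exactly the grid-size effect described above: the distribution hasn't changed, only the averaging scale has.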
 