Perlmutter & Supernovae: Debunking the Myth of Accelerating Galaxies

  • Thread starter: gork
  • Tags: Supernovae
AI Thread Summary
The discussion centers on the interpretation of Perlmutter's findings on the accelerating expansion of the universe, which suggest that distant galaxies are moving away from us faster than closer ones. Critics argue that this interpretation may be flawed, as it relies on observations of light from the past, which complicates the understanding of current velocities. The Hubble law, established over 60 years ago, indicates that the rate of expansion is proportional to distance, but does not directly imply acceleration. Some participants emphasize the need for long-term observations to confirm changes in redshift over time, rather than relying on snapshots. The conversation highlights the complexities of measuring cosmic expansion and the ongoing debate about the role of dark energy in the universe's acceleration.
gork
I've been an amateur quantum physicist for most of my life, and ever since 1998 I've been wondering about this issue, but I figured someone would address it. Now they've given the Nobel to the guy and I still don't understand something.

Perlmutter says that galaxies are accelerating away from each other. He bases this on the fact that things that are farther away from us are moving faster than things which are closer. The problem is that we see things farther away from us as they were farther in the past. So quasars at the edge of the visible universe were traveling at .9c 13.7 billion years ago. Galaxies half as far away were traveling half that speed 7 billion years ago, or whatever the numbers are. Andromeda is actually moving closer to us and that is still 2.5 million years ago. The evidence seems to me to indicate, not that things are accelerating, but that they are slowing down. We have no idea what speed those distant galaxies are moving at right now. They could be moving closer to us by this point.

Can anyone show me where I'm wrong on this?
 
Perhaps the answer lies in relativity itself. The measurements seem to show that the farthest objects that can be seen in the universe are accelerating away from us. But, since motion is relative, it also means that we too are accelerating in our motion through spacetime. We cannot measure our own motion so easily as we can very distant objects, so we need to use indirect inference.

By the way, it is said that we should regard the expansion rate of the universe as generating true kinematic motion, not just a "stretching" of spacetime. The idea here is that we shall avoid confusion this way. So, it is perfectly proper to speak of the expansion of the universe as if it gave rise to actual motion.
 
Gork, all I can say is that everything supports our observations. Einstein's General Relativity actually PREDICTED this possibility before we ever observed it. I would recommend visiting Wikipedia and reading its articles on the expansion of the universe, Hubble's law, and general relativity if you are actually interested in learning about this.
 
gork said:
...
Perlmutter says that galaxies are accelerating away from each other. He bases this on the fact that things that are farther away from us are moving faster than things which are closer...

Can anyone show me where I'm wrong on this?

Well your first statement is wrong. Perlmutter et al do not base the inferred accel. on that fact.

That fact is contained already in Hubble law which we have had for over 60 years.

Think about the standard statement of Hubble law v = HD. The standard formulation does not involve redshift directly, because there is no simple relation between the current expansion rate and the observed redshift (only an approximate one at small distances).
D is a distance now---what you would measure if you could stop expansion at this moment and use some conventional means like radar.
v is the rate that distance is expanding now.

That law is implicit in the definition of H. It is a'/a, as comes up in the Friedmann equation.
This is all really classical! The Friedmann equation goes back to the 1920s! You've seen v = HD hundreds of times, I expect.

v = HD already says the rate of increase is proportional to length. The longer the distance D is, the faster it increases. That goes for any moment in time on the Friedmann model clock (the universe time the cosmologist's model runs on).
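As a toy illustration of that proportionality (a minimal sketch only; the H value below is an assumed round number, not a fitted result):

```python
# Minimal sketch of the Hubble law v = H*D at one instant of cosmic time.
# H = 71 km/s/Mpc is an assumed illustrative value.

H = 71.0  # km/s per Mpc

distances_mpc = [100.0, 200.0, 400.0]  # proper distances "now", in Mpc

for D in distances_mpc:
    v = H * D  # rate at which that distance is increasing, in km/s
    print(f"D = {D:6.1f} Mpc  ->  v = {v:8.1f} km/s")

# Doubling the distance doubles the rate of increase. That proportionality
# alone says nothing about whether H itself (and hence a'/a) grows or
# shrinks over time -- which is the separate question Perlmutter addressed.
```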

The Perlmutter et al result was that a' is increasing. The time derivative of the scale factor a(t) is increasing.

Your statement is incorrect because they did NOT base this conclusion on the fact (known for over 60 years) that longer distances grow faster.
===========================

You might want to back up and ask what DID they base it on? What slight difference from what they expected? What slight adjustment in the model was required to fit the new observations? Various people might want to lend a hand in explaining.

BTW the law v = HD is exactly true, in that form, in an ideally uniform universe that obeys General Relativity. Uniform in the sense of homogeneous and isotropic. It is not just an empirical law based on observation alone. And the universe so far seems to be remarkably close to uniform at large scales. That is one reason the Hubble law is interesting. In its standard formulation, confirmed by observation, it is evidence for the correctness of General Relativity.

======================

Gork, just an afterthought. In the large-scale pattern of distance expansion which we are witnessing, nobody gets anywhere. So it is significantly different from ordinary motion (in some local frame of reference) that we are familiar with--e.g. motion with a definable destination that is approached.

So the language you use ("farther things moving faster") is a bit misleading. It will tend to confuse you if you use that figure of speech too much. It is better to simply say "longer distances increase faster". That is more neutral and more in keeping with the math picture of the universe cosmologists normally use.
 
Last edited:
Gary Kent said:
Perhaps the answer lies in relativity itself. The measurements seem to show that the farthest objects that can be seen in the universe are accelerating away from us. But, since motion is relative, it also means that we too are accelerating in our motion through spacetime. We cannot measure our own motion so easily as we can very distant objects, so we need to use indirect inference.

By the way, it is said that we should regard the expansion rate of the universe as generating true kinematic motion, not just a "stretching" of spacetime. The idea here is that we shall avoid confusion this way. So, it is perfectly proper to speak of the expansion of the universe as if it gave rise to actual motion.

Gary I'm not sure this is correct. For one I don't think our own motion through spacetime can be said to be accelerating or decelerating.

Unfortunately I don't have time to look up why at the moment as I have delicious tacos to go eat!
 
v=HD is still based on Hubble's redshift observations, which have the same fundamental problem as Perlmutter's: we only know the velocity and distance of galaxies as they were in the past, and galaxies at longer distances were moving faster, longer ago, than closer galaxies were more recently. This is, incidentally, exactly what the universe would look like if it expanded at near-light speed at the time the first galaxies were forming and slowed down over time. You can't say that just because v=HD is old it must be right. You can't measure acceleration, a function of time, by taking a snapshot, particularly one that is 13 billion years old.
 
Are you certain of that gork? If it were that easy then the current model would not have developed as it did.
 
Well, you know that deferring to research isn't a valid argument. I've looked through every article I've found and nothing can explain what I'm talking about, so if anyone on this forum can, then I'd be glad to accept it as someone who knows more than I do explaining the issue, but so far people have just said "that's the way it is" or "Hubble Hubble Hubble".

If I see two cars in front of me, one going 10 mph and 100 ft away and another going 90 mph and 900 ft away, and I take a photograph of them and write their speeds at the time of the photograph, does that photograph prove the farthest car is accelerating faster than the nearer car? See, in my example, velocity is a function of distance, but only by coincidence. The nearer car could easily be accelerating and the farther car braking, but my instantaneous snapshot will not record that data.

You would need to take data from many galaxies over a period of years indicating that the redshift was increasing over time, and that the amount of increase was proportional to their distance from us.

Can anyone tell me how I'm wrong about this without simply invoking Hubble?
 
gork said:
You would need to take data from many galaxies over a period of years indicating that the redshift was increasing over time, and that the amount of increase was proportional to their distance from us.

But that is EXACTLY what has been done and it has nothing to do with Hubble. What you seem to not be taking into account is that the observed objects are NOT just farther away, they also represent different points in time, so we have a large SET of snapshots taken at lots of different times.
 
  • #10
I think gork wants to look at individual objects over a period of time in order to see the redshifts of individual objects increase. We can't do that yet. From

http://arxiv.org/abs/0802.1532:
we find that a 42-m telescope is capable of unambiguously detecting the redshift drift over a period of ~20 yr using 4000 h of observing time. Such an experiment would provide independent evidence for the existence of dark energy without assuming spatial flatness, using any other cosmological constraints or making any other astrophysical assumption.

Also, redshifts of individual objects don't necessarily increase with time. Figure 1 from the above paper plots redshift versus time. The three red curves are for objects in our universe. As we watch (over many years) a distant, high redshift object, A, we will see the object's redshift decrease, reach a minimum, and then increase. If we watch a much closer, lower redshift object, B, we see the object's redshift only increase.

Roughly, when light left A, the universe was in a decelerating matter-dominated phase, and when light left B, the universe was in the accelerating dark energy-dominated phase.
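The sign of that drift can be sketched with the standard redshift-drift relation, dz/dt = (1+z)H0 - H(z), so for flat LambdaCDM the sign just compares (1+z) with H(z)/H0. The density values below are assumed illustrative numbers, not the paper's fitted ones:

```python
import math

# Sketch of the sign of the redshift drift dz/dt = (1+z)*H0 - H(z) for a
# flat LambdaCDM universe. Omega_m = 0.3 and Omega_Lambda = 0.7 are assumed
# illustrative values.

def hubble_ratio(z, om=0.3, ol=0.7):
    """H(z)/H0 for flat LCDM with pressureless matter plus Lambda."""
    return math.sqrt(om * (1 + z) ** 3 + ol)

def drift_sign(z):
    """dz/dt in units of H0: positive means the observed redshift increases."""
    return (1 + z) - hubble_ratio(z)

for z in (0.5, 1.0, 3.0):
    s = drift_sign(z)
    trend = "increases" if s > 0 else "decreases"
    print(f"z = {z}: redshift {trend} with time ({s:+.3f} in units of H0)")

# Low-z objects (light emitted in the Lambda-dominated era) drift upward;
# sufficiently high-z objects (matter-dominated era) drift downward, matching
# the red curves described above.
```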
 
  • #11
If you go to the Supernova Cosmology Project website, you'll find actual data:

http://supernova.lbl.gov/Union/figures/SCPUnion2_mu_vs_z.txt

The data are redshifts and distance moduli for type Ia supernovae. Redshifts are relatively easy to measure, given the assumption that we know what the emitted spectra look like. Distance moduli (called mu) at large redshifts (z) are what we find so valuable. For that you measure how bright an object appears (apparent magnitude, m) and assume you know its absolute magnitude M (how bright it would appear if you were only 10 parsecs away, i.e., the apparent magnitude at 10 pc). The trick is to know M for type Ia supernovae. That's what they figured out how to do (that number is at the top of the data file, ~ -19). Anyway, if you know how bright the object would appear at 10 pc and you know how bright it appears now, then you can calculate how far away it is (although you have to assume a particular cosmology model).

Ok, so you plot m - M (mu) vs z and try to find a cosmology model to fit that data. The best-fit cosmology model for that data is a spatially flat general relativity model filled with pressureless dust plus a cosmological constant (Lambda), where the total energy density is divided 70% Lambda and 30% matter. According to that model, the universe started with a decelerating expansion rate when matter dominated and switched to an accelerating expansion rate once Lambda dominated. Thus, you read that dark energy (Lambda) is driving an accelerated expansion rate for the universe.

Of course, that's not what they actually measured, i.e., they didn't measure Lambda, nor did they measure an accelerating expansion rate; they measured mu vs z. Lambda and accelerated expansion are artifacts of the best-fit cosmology model. There are attempts to fit the data without dark energy or accelerated expansion. One was just published last month in PRD; the reference was on Physics Forums, sorry I don't have it on this computer. Another is inhomogeneous spacetime. Still others are modifications of general relativity (f(R) gravity I believe is one such attempt). Hope this helps.
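The magnitude bookkeeping above can be sketched in a few lines. The m value here is a made-up observation; M = -19 is just the rough figure quoted above:

```python
# Sketch of the distance-modulus bookkeeping: mu = m - M, and by the
# inverse-square law the luminosity distance is d_L = 10^((mu + 5)/5) parsecs.
# m = 21.0 is an invented apparent magnitude; M = -19.0 is the approximate
# type Ia absolute magnitude mentioned above.

M = -19.0   # assumed absolute magnitude (brightness at 10 pc)
m = 21.0    # apparent magnitude as observed

mu = m - M                      # distance modulus
d_L_pc = 10 ** ((mu + 5) / 5)   # luminosity distance in parsecs

print(f"mu = {mu:.1f}  ->  d_L ~ {d_L_pc:.3g} pc")

# Plotting many such (z, mu) pairs and fitting a cosmology model to them is
# the whole game; mu itself never "measures" Lambda directly.
```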
 
  • #12
RUTA said:
If you go to the Supernova Cosmology Project website, you'll find actual data:

http://supernova.lbl.gov/Union/figures/SCPUnion2_mu_vs_z.txt

...

Good idea! It's great to see the actual SNe data they had to go on. Basically you could say that model fitting, at the simplest level, just involves adjusting 3 numbers (today's matter fraction, cosmological constant fraction, and Hubble rate).

And you are showing me hundreds of numbers, against which I have to adjust only 3 parameters to get the best fit. It is a good lesson.

People don't realize how tightly constrained the whole thing is, since the model derives from the GR equation that has been checked in so many ways and passed so many tests with exquisite precision. As long as a GR model with only 3 adjustable parameters can fit data nicely people are not going to be too interested in alternatives.

(Present company excepted I mean, if some guy were to offer a model with only 2 parameters, now, and it could fit the data as well! :biggrin:)
 
  • #13
Yes, if you can beat this model (called LambdaCDM), then you might get some attention (depending on how you did it). There are really only two parameters in LCDM, since the matter and Lambda contributions must sum to the critical density (which makes the model spatially flat). So, you'd have to beat a 2-parameter model. And, there is data besides the supernova data that would have to be accounted for. I don't think there is any other model that can do all that. The alternatives I told you about all have some difficulty with fitting everything.
 
  • #14
That's right. There is a lot of evidence that the universe is SPATIALLY flat, or very nearly so. So if we accept that, then there are just two free parameters: the current Hubble rate, of course, 71, and the matter density .27; the Lambda density .73 is forced by the requirement that the densities add up to one.

http://supernova.lbl.gov/Union/figures/Union2_Hubble_slide.pdf

So their data amounts to relation between two MEASURED quantities. Hundreds of datapoints. Each datapoint is a pair of numbers (z, mu) where z is observed redshift and mu encodes the distance

Like, looking at the plot there, z = .2 corresponds to around mu=39.9 which by convention means about 2.6 billion lightyears ( "now" distance)

And z = .4 looks like it corresponds to about mu=41.7
which by convention is 5.1 billion lightyears.

And the present LambdaCDM model fits that very nicely with the three (or two) parameters just mentioned adjusted as we said: .27, .73, 71.

The nice thing from my very personal perspective is that gauging the luminosity and allowing for the observed wavelength stretch gives you, by the inverse square law, just the distance you want! Namely the freeze-frame proper distance NOW, which you would measure if you could stop the expansion process at this moment and use any conventional means. It does not give some confusing pseudodistance like light travel time. It gives the actual present-day distance. And we are DIRECTLY MEASURING it. I love it. Others may disagree :biggrin:

proper distance now = 32.6*10^(mu/5)/(1+z) lightyears.
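That conversion can be checked against the two datapoints read off the plot (the 32.6 factor is just 3.26 lightyears per parsec times the factor of 10 from the 10 pc reference distance):

```python
# Checking the conversion "proper distance now = 32.6 * 10^(mu/5)/(1+z) ly"
# against the two (z, mu) pairs read off the Union2 plot above.

def proper_distance_now_ly(mu, z):
    """Freeze-frame proper distance now, in lightyears, from (mu, z)."""
    return 32.6 * 10 ** (mu / 5) / (1 + z)

for z, mu in [(0.2, 39.9), (0.4, 41.7)]:
    d = proper_distance_now_ly(mu, z)
    print(f"z = {z}, mu = {mu}  ->  about {d / 1e9:.1f} billion lightyears")

# Reproduces the ~2.6 and ~5.1 billion lightyear figures quoted above.
```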

Great to see all that direct measurement data, RUTA!

This gets nailed to a virtual wall somewhere:
http://supernova.lbl.gov/Union/figures/SCPUnion2_mu_vs_z.txt

They say their presentation is dated 2010, but it looks rather similar to some of the figures and tables in this 2008 preprint, so I offer that for discussion if anyone wants:
http://arxiv.org/abs/0804.4142
 
  • #15
This is all well and good, but I think gork is actually asking a much simpler question. He is asking why the Hubble law is not interpreted as deceleration, period-- that question really doesn't have anything to do with acceleration or Lambda CDM! It is true that the longer ago we see some object, the faster it is moving away from us. Why doesn't that mean the expansion is decelerating?

The answer is called "the cosmological principle." This principle is the key unifying and simplifying factor behind all cosmological models-- it is the thing that allows us to take individual "snapshots" of different eras of our past, at different distances from us, and cobble them together into a global description of the history of our universe. The cosmological principle asserts that the universe is everywhere the same, except that it changes with time-- it ages, everywhere the same.

So when we look at galaxies at different redshifts, two things are happening-- we are seeing different times in the past, and we are seeing different distances away. gork is focusing only on the former issue-- if everything in the universe was moving away from us at the same speed at each age, then he'd be right-- but that wouldn't obey the cosmological principle, because then we'd be in a special place.
 
  • #16
KenG, here is Gork's question in post #1. I replied to it and explained where he was wrong. This was what he was asking for. But he did not get back to me.
gork said:
...
Perlmutter says that galaxies are accelerating away from each other. He bases this on the fact that things that are farther away from us are moving faster than things which are closer...

Can anyone show me where I'm wrong on this?

In post #4 I replied and pointed out that the Perlmutter et al result was NOT based simply on the fact that larger distances increase at (proportionally) larger rates. That by itself does NOT imply acceleration.
We've known that for generations. Gork's first mistake was to make a kind of straw man of Perlmutter et al, suggesting that they claimed acceleration based on what would be incompetent reasoning, which everybody knows does not imply it.

Gork has not yet acknowledged the error in #1. So the discussion does not go anywhere.

My advice to Gork would be to make a fresh start and say: "OK I was wrong. Perlmutter et al result was NOT based on longer distances growing faster. That by itself does NOT imply accel and nobody claimed it did. Now how did Perlmutter et al deduce that the parameter Lambda in the model is positive?"

I suggest he ask a real question like: "How DID they figure that Lambda > 0?"

Note that acceleration is a circumstance that goes along with positive Lambda as one of the ways Lambda manifests itself. The public does not know or care about Lambda. But the public can understand acceleration, so acceleration was the exciting way they publicized the finding. It communicates. And they also got some attention when somebody thought to call Lambda "dark energy". That is mainly in the realm of PR. Lambda is a curvature and if you drag it over to the other side of the equation (not a natural place for it IMHO) it acquires units of energy. Any constant curvature on the LHS of the Einstein equation would do likewise.

What we are really talking about is an important constant of nature, like Newton G or Planck h. It is the other constant in the GR equation---our prevailing well-tested law of gravity. We are at the next stage of refining today's law of gravity.

It does not help anybody to insist on staying down in the level of PR and pop-sci.
 
  • #17
marcus said:
In post #4 I replied and pointed out that the Perlmutter et al result was NOT based simply on the fact that larger distances increase at (proportionally) larger rates. That by itself does NOT imply acceleration.
Yes, and you were quite correct. But it just means that gork did not present his question clearly. His subsequent posts show that his real issue was not with the acceleration, nor the Hubble law, but the fact that deceleration is not inferred from the Hubble law. My point is that if we did not have the cosmological principle, but instead a model where the Earth was at the center and everything in the universe moved at the same time-varying speed away from us, then we would indeed conclude that the universe was decelerating. All we need to not conclude that is the cosmological principle, which immediately creates a different interpretation of the Hubble law (along the lines of what you explained quite correctly but I don't think gork would get the punchline without the cosmological principle).

I suggest he ask a real question like: "How DID they figure that Lambda > 0?"
I agree, but first we must establish the more basic aspect of the question, which has nothing to do with Perlmutter. It's crucial that gork understand why the Hubble law is not interpreted the way he imagines. I'm not saying anything in your answer was wrong, only that I think the question is at that more basic level.
What we are really talking about is an important constant of nature, like Newton G or Planck h. It is the other constant in the GR equation---our prevailing well-tested law of gravity. We are at the next stage of refining today's law of gravity.
That's all true, I just think gork's incredulity about acceleration stems from a more basic issue about what the Hubble law means. There is a situation where his interpretation could be perfectly correct, so we need to look at why that situation is not the one that gets used-- it's because it wouldn't satisfy the cosmological principle, it would put the Earth at a very special place in the universe, and wouldn't obey GR with or without Lambda. But everything you said is certainly both correct and useful for understanding the modern state of affairs, I'm not criticizing.
 
  • #18
George and RUTA, thank you very much for your explanations. I understand now.

Marcus, quit trying to get an apology from me. You never explained anything, just told me I was wrong because someone smarter than me already figured it out. You weren't helpful at all, so stop pretending you were. I am smart enough to understand it once it was explained. You never tried. So I won't say I was wrong, because I wasn't. I was asking a question that you didn't answer multiple times. George and RUTA answered it thoroughly and were very helpful. And without being pedantic. You could learn a lot besides physics here.
 
  • #19
Still, I wonder if you understand now, gork, that none of the things you find in those detailed explanations of how the redshift observations are interpreted would be possible without the cosmological principle, and why it is the cosmological principle, not those details, that answer why a Hubble law does not mean the expansion is decelerating. That's the only issue here that isn't "black boxy", that isn't "this is what the people get when they make detailed models", which isn't all that satisfying by itself. You also have to bear in mind that marcus was probably reacting to all the people who come on here and say "I think cosmologists are wrong because they overlooked some really basic point", which is not what you did (you framed it as "what is wrong with this reasoning", and the answer is, "it doesn't satisfy the cosmological principle"). Also, bear in mind that everything he said was a correct explanation, and that it took him time to lay out that correct explanation, so you should still thank him even if it rubbed you the wrong way that he wanted you to recognize the misconception behind your question. Often, we find it is important to "kill the misconception" in order to achieve real learning, but that can sometimes come out sounding like "kill the questioner", leading to "kill the answerer."
 
  • #20
I should have explained my question more clearly, but I didn't realize that people still thought of Earth as being the center of the Universe. I didn't know the specific jargon, but I already realized that the Earth is moving away from everything else the same way everything else is moving away from us. I like the example of drawing points on the surface of a balloon and then inflating the balloon, but then imagining the surface of the balloon is 3-dimensional. But yes, I recognize that to an observer in a galaxy redshifted from our point of view, the Milky Way would be equally redshifted. That was never the issue. The issue was mostly that I didn't recognize that Perlmutter was taking that snapshot I was talking about and figuring out that the Universe is bigger than it would've been had the expansion not accelerated. That's something that none of the articles I read on the topic actually mentioned.

And I understand that most people who post questions like this on here might not understand that answer were it given to them. I actually started studying astrophysics when I was 12, but I went to college to be an art teacher, and kept physics as a hobby that I take pretty seriously.

Thanks again for the help!
 
  • #21
gork said:
I should have explained my question more clearly, but I didn't realize that people still thought of Earth as being the center of the Universe.

gork, this is a great forum for learning physics and I think you'll find it a worthwhile place to be. However, your pissy attitude and statements like the one above are not going to endear you to the moderators, and Marcus goes out of his way to be helpful to all who post here, so if you plan to stick around, you would do well to rethink your attitude.
 
  • #22
gork said:
I like the example of drawing points on the surface of a balloon and then inflating the balloon, but then imagining the surface of the balloon is 3-dimensional.
The one I like is to take a transparency with galaxies drawn on it, and another that is just the same except it has been put through a copying machine and expanded a bit, maybe 10%. Then you pick a galaxy that is the same on both viewgraphs, color them in blue, and overlap them, and say "here's our galaxy in the past and now, and look how the other galaxies have moved farther away in the meantime." But then here's the kicker-- you pick another galaxy, color it red on both viewgraphs, and ask what the situation will look like from that galaxy. Then you move the viewgraphs so as to superimpose the red galaxies, and sure enough all the rest, including the blue one, look like they have moved away from the red galaxy. It also shows that the farther away ones have moved a larger distance.
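The two-transparency demonstration can be mimicked in a few lines of code (the coordinates are made up, purely illustrative): scale every position by 10% and compare separations as seen from two different "home" galaxies.

```python
# Transparency analogy in code: one sheet of galaxy positions and the same
# sheet enlarged 10%. Coordinates are arbitrary made-up numbers.

galaxies = [(0.0, 0.0), (2.0, 1.0), (-3.0, 4.0), (5.0, -2.0)]
scaled = [(1.1 * x, 1.1 * y) for x, y in galaxies]

def displacements_seen_from(i):
    """Offsets of every galaxy relative to home galaxy i, old sheet vs new."""
    ox, oy = galaxies[i]
    sx, sy = scaled[i]
    return [((x - ox), (y - oy), (u - sx), (v - sy))
            for (x, y), (u, v) in zip(galaxies, scaled)]

# From ANY home galaxy, each neighbor's separation has grown by the same 10%,
# so longer distances grow faster and no galaxy is special.
for i in (0, 2):
    for dx, dy, du, dv in displacements_seen_from(i):
        if (dx, dy) != (0.0, 0.0):
            old = (dx ** 2 + dy ** 2) ** 0.5
            new = (du ** 2 + dv ** 2) ** 0.5
            print(f"from galaxy {i}: {old:.3f} -> {new:.3f} (x{new / old:.2f})")
```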
 
  • #23
Phinds, I'd just like to point out that Marcus, having not answered my question, responded to me re-asking it by insulting me. I didn't get a pissy attitude until I felt he was unjustly sniping at me. I also won't be posting much here, but that's only because I don't have many things to ask. This really was the only nagging question in physics for me. So you won't have to worry about me coming back around. I used to post on forums a lot, and I've met a Marcus on every forum. People like him have kept me from using forums to communicate my ideas for years.
 
  • #24
gork said:
Perlmutter says that galaxies are accelerating away from each other. He bases this on the fact that things that are farther away from us are moving faster than things which are closer.

He bases this on the fact that things that are farther away are moving *much* faster than things that are closer. You can model a universe in which things move away at a constant expansion rate, and it turns out that the observed speeds at large distances are larger than that constant-expansion model predicts.

The problem is that we see things farther away from us as they were farther in the past.

And they've taken this into account.

It turns out that when you look at large distances, there are about six or seven different definitions for distance and speed, and they've taken into account all of this bookkeeping.

http://en.wikipedia.org/wiki/Distance_measures_(cosmology)
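For a flavor of that bookkeeping, here is a sketch of three of those distance definitions at one redshift in flat LambdaCDM (the parameter values are assumed illustrative numbers, and the integral is done with a simple trapezoid rule):

```python
import math

# Three standard cosmological distances at one redshift, flat LambdaCDM.
# Omega_m = 0.3, Omega_Lambda = 0.7, H0 = 71 km/s/Mpc are assumed values.

H0 = 71.0          # km/s/Mpc
C = 299792.458     # speed of light, km/s

def E(z, om=0.3, ol=0.7):
    """H(z)/H0 for flat LCDM."""
    return math.sqrt(om * (1 + z) ** 3 + ol)

def comoving_distance_mpc(z, steps=10000):
    """D_C = (c/H0) * integral of dz'/E(z') from 0 to z, trapezoid rule."""
    h = z / steps
    total = 0.5 * (1 / E(0) + 1 / E(z))
    total += sum(1 / E(i * h) for i in range(1, steps))
    return (C / H0) * h * total

z = 1.0
d_c = comoving_distance_mpc(z)   # comoving distance (= proper distance now)
d_l = d_c * (1 + z)              # luminosity distance (what mu encodes)
d_a = d_c / (1 + z)              # angular diameter distance
print(f"z={z}: D_C={d_c:.0f} Mpc, D_L={d_l:.0f} Mpc, D_A={d_a:.0f} Mpc")

# Same object, same redshift, three different "distances" -- which is why
# the bookkeeping matters when comparing models to data.
```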

So quasars at the edge of the visible universe were traveling at .9c 13.7 billion years ago. Galaxies half as far away were traveling half that speed 7 billion years ago, or whatever the numbers are.

The exact numbers are important. You can calculate what we would expect to see if the universe were expanding at a constant rate, and it turns out that the galaxies are receding faster than that.
 
  • #25
One thing about popular descriptions of science is that they usually leave out the "messy details." It turns out that in order to deal with the cosmological distances, you have to go through about six pages of math. It's not particularly complicated math (algebra and basic calculus), but most popular treatments leave it out.
 
  • #26
Marcus gave you a good answer, gork, and was merely correcting your misunderstanding. I did not perceive anything 'snippy' about his response.
 
  • #27
twofish-quant said:
One thing about popular descriptions of science is that they usually leave out the "messy details." It turns out that in order to deal with the cosmological distances, you have to go through about six pages of math. It's not particularly complicated math (algebra and basic calculus), but most popular treatments leave it out.

Another problem with popular descriptions is what they claim to have been "measured." In this case, accelerated expansion was not measured. Spectra and apparent magnitudes were measured. These were converted to redshifts (z) and distance moduli (mu) by assuming knowledge of the emitted spectra and absolute magnitudes. No one disputes these two assumptions so it is widely accepted that mu versus z is "data." This data can be explained without accelerated expansion and therefore without dark energy. However, even the Nobel committee citation said these guys discovered "the accelerated expansion of the universe." So, even the Nobel prize has become sensationalized for popular consumption :frown:
 
  • #28
RUTA said:
These were converted to redshifts (z) and distance moduli (mu) by assuming knowledge of the emitted spectra and absolute magnitudes. No one disputes these two assumptions so it is widely accepted that mu versus z is "data."

In fact lots of people disputed those assumptions, and if you read the original paper, most of it was to go through point by point every possible alternative explanation and give very good reasons why those alternative explanations have problems.

This data can be explained without accelerated expansion and therefore without dark energy.

Not without assuming something even weirder. If you assert that the universe isn't accelerating, that means that we have something *REALLY* wrong. GR is garbage, photons behave weirdly, the universe outside the orbit of Pluto is an illusion created by space aliens. Arp and Hoyle are right. The universe was created on October 13, 6006 BC.

Doesn't matter. If it turns out that the universe isn't accelerating and what is happening is even weirder, then they deserve the Nobel.
 
  • #29
twofish-quant said:
In fact lots of people disputed those assumptions, and if you read the original paper, most of it was to go through point by point every possible alternative explanation and give very good reasons why those alternative explanations have problems.

"disputed?" past tense? Do you have any references for recent objections to the data?

twofish-quant said:
Not without assuming something even weirder. If you assert that the universe isn't accelerating, that means that we have something *REALLY* wrong. GR is garbage, photon behave weirdly, the universe is outside the planet Pluto is an illusion by space aliens. Arp and Hoyle are right. The universe was created on October 13, 6006 BC.

Doesn't matter. If it turns out that the universe isn't accelerating and what is happening is even weirder, then they deserve the Nobel.

There are other explanations such as dark flows (Tsagas, C.G.: Peculiar motions, accelerated expansion, and the cosmological axis. PHYSICAL REVIEW D 84, Sep 2011) and inhomogeneous spacetime (Clarkson, C., Maartens, R.: Inhomogeneity and the foundations of concordance cosmology. Classical and Quantum Gravity 27, 124008 (2010). arXiv:1005.2165v2). Whether or not these are "weirder" than dark energy is subjective. My point stands -- accelerated expansion was not measured and does not follow necessarily from the data.
 
  • #30
RUTA said:
"disputed?" past tense? Do you have any references for recent objections to the data?

Unpublished astrophysics lunch time conversations.

Right after the initial data came out, I remember a ton of conversations in my department with people trying to find holes in the conclusions. We did a lot of brainstorming about things that could be wrong with it, and "dark flows" and "voids" did come up a lot. The trouble is that after working on it for a few days to weeks, everything anyone came up with would not work, and coming up with an objection to the results that does not work is not publishable.

What you see in the published literature is just the tip of the iceberg. If you examine a measurement and find that your objection to the results does not hold up, it's not going to be published, because it's not interesting.

There are other explanations such as dark flows (Tsagas, C.G.: Peculiar motions, accelerated expansion, and the cosmological axis. PHYSICAL REVIEW D 84, Sep 2011)

I'm 98% sure that the scenario in that paper is excluded by WMAP measurements. I'm not a WMAP expert, but I know someone that is, and the next time I have lunch with him (which might be a while since I'm not at the university), I'll bring up the numbers in the paper, and I'm 98% sure he'll tell me that it's excluded by WMAP. The reason I'm 98% sure about this is that this is a conversation that has been done before.

inhomogeneous spacetime (Clarkson, C., Maartens, R.: Inhomogeneity and the foundations of concordance cosmology. Classical and Quantum Gravity 27, 124008 (2010). arXiv:1005.2165v2).

Wiltshire also came up with his timescape model involving voids. But...

That paper you referenced does not present an alternative model for the accelerating universe. It just talks in general about non-homogeneous models, and doesn't present anything that would call the results into question. And Riess et al. already thought of this. To address the possibility of a local void, they reran the analysis, dropping all of the nearby supernovae, and they came up with the same answer.

Way, way back in 1998, people in my department (and I'm sure every other astronomy department) were talking about the possibility that the results were due to local voids. It turns out that this won't work, because you can limit the size of voids by galaxy motions and gravitational lensing, and even after you put in the maximum void that the data allows, the effect still doesn't go away.

None of this stuff got published because none of it was interesting. If someone had gotten a void or a dark flow model to work, then that would have been publishable, but no one did, and it wasn't for lack of trying.

My point stands -- accelerated expansion was not measured and does not follow necessarily from the data.

Which data? If you take into account all of the other data that exists, no one has come up with a better explanation. Now if you take the original data by itself, sure there are lots of alternative explanations, but they disappear with all of the new data that's come in.

The two papers that you cite don't present viable alternative models. Clarkson/Maartens don't claim that their research challenges acceleration, and Maartens has written several papers on dark energy. I'm pretty certain that the Tsagas scenario is excluded by WMAP. Also you can find out a lot about people by having lunch with them. There's nothing in any of the papers that suggests that the authors themselves think that the universe isn't accelerating.

It looks like a duck. It quacks like a duck. We've established that it's not a penguin, chicken, or blood hound. It could still be a goose, but right now people are trying to figure out what kind of duck it is.

But sometimes a duck is a duck.
 
  • #31
Interesting info, twofish-quant, but you did not refute my assertions concerning the mu vs z data from type Ia supernovae (the only data I have referenced herein).

1. No one disputes the data. Discussed? Questioned? Double-checked? Sure. Disputed after such consideration? No. If anyone had reason to dispute the data, those reasons would certainly be published.

2. The data does not constitute a measurement of accelerated expansion.

3. There are explanations of the data that do not involve accelerated expansion. Whether or not these explanations account for WMAP data is not part of the claim. The reason for bringing this up is to verify point 2.

When I teach cosmology for non-science majors or science majors, I try to stick to the facts; I even share with them the Observational Indeterminacy Principle of Cosmology [Michael Heller, Theoretical Foundations of Cosmology, World Scientific Press, Singapore, 1992, pp. 81-82]. I present what has been measured, what assumptions are required to turn those measurements into data, and what explanations have been rendered to account for that data. I do the same on Physics Forums. If I'm missing facts, please give me the references, I will change my lectures and posts accordingly :smile:
 
  • #32
RUTA said:
Interesting info, twofish-quant, but you did not refute my assertions concerning the mu vs z data from type Ia supernovae (the only data I have referenced herein).

One reason that people don't get Nobel prizes immediately after making the discovery is so that you can let new data in. If you had *only* the original data, then yes, it might be one of several dozen other things, but as new data has come in, it's becoming more clear that they actually were the first people to record the accelerating universe.

There are explanations of the data that do not involve accelerated expansion.

And none of those explanations work when you pull in other data. The two main explanations that you've brought up are "dark flows" and "dark voids."

In order to come up with a model of dark flows that works, you have to assume nearby accelerations that have been ruled out by WMAP. If you assume that the dark flows are far away, then you run into the problem that you don't have angular dependence.

In order to come up with an explanation that involves voids, you can use observations of the sizes of known voids. If you remove the data that could be influenced by a nearby void, you still get acceleration. If you assume that the voids are far away, then you should see a change with direction, and you don't.

What you can do (and it's a great calculation for a first-year graduate student) is to calculate the maximum possible effect due to "dark flows" or "dark voids," and it turns out that when you take those into account, you still get acceleration.

Whether or not these explanations account for WMAP data is not part of the claim.

I don't see the point of excluding data in figuring out what is going on. I also don't see much of a point in getting into semantic arguments. If you want to get nit-picky, my speedometer measures the rotation rate of my transmission, a thermometer measures the expansion rate of mercury. I don't see the point of getting nit-picky here.

When I teach cosmology for non-science majors or science majors, I try to stick to the facts

And there are a ton of facts.

I present what has been measured, what assumptions are required to turn those measurements into data, and what explanations have been rendered to account for that data. I do the same on Physics Forums. If I'm missing facts, please give me the references, I will change my lectures and posts accordingly :smile:

I just did. I should point out that a lot of what I've described are things that I haven't calculated myself, and come out of lunch time conversations. I've mentioned that my alma mater is the University of Texas at Austin, and it wouldn't be too hard to send e-mail to people in the cosmology department there to find out what exactly they think of the supernova data.

I haven't personally done the WMAP calculation, but I know someone that has, and I'm too busy right now to reproduce it, although it would make a great homework problem for an intro cosmology course.

I'm fairly certain by this point that someone has written it up in some review paper, and writing a term paper that describes what the current state of research is on dark voids and dark flows is also a great homework problem.
 
Last edited:
  • #33
Also here is a reference to a blog that talks about the Tsagas paper

http://scienceblogs.com/startswithabang/2011/10/dark_energy_dark_flow_and_can.php

In fact, I know people who did this calculation when the supernova data first came out, and once you figure out that dark flow can't explain the acceleration, there is no point in publishing. Note that Tsagas himself says that his model can't explain all of the acceleration (which is what the people I know also figured out). The maximum deceleration parameter that Tsagas has gotten is -0.3, whereas what is observed is -0.5.
 
Last edited by a moderator:
  • #34
Twofish, you still aren't getting my point, so I'll make it simple. Either the supernovae data in and of itself logically entails accelerated expansion or it doesn't. If it does, show me the proof (your pedigree and lunch room conversations will not suffice). If it doesn't, I will not propagate the myth that this data constitutes a discovery of the accelerated expansion of the universe.

Edit: The equivalent of claiming the supernova data constitutes the discovery of accelerated expansion is to say Brahe discovered that the planets went around the Sun while the Sun and Moon went around Earth. I would not give the Nobel to Brahe for discovering the Tychonic system, I would give it to him for his unprecedented astronomical measurements. Kepler later used those measurements in the context of a completely different cosmology model. Likewise, Perlmutter et al should get the prize for what they actually did, i.e., obtain distance moduli for large redshift supernovae, not for what they inferred from that data, i.e., accelerated expansion. Who knows what cosmology model that data might be used to support in the future.
 
Last edited:
  • #35
RUTA said:
Twofish, you still aren't getting my point, so I'll make it simple. Either the supernovae data in and of itself logically entails accelerated expansion or it doesn't.

That just doesn't make any sense to me. The supernova data is just one piece of the puzzle and has to be understood in the context of other data. In order to do anything with the supernova data, you have to make hundreds of assumptions, and there is no way that you can justify those assumptions without reference to other data.

If you had only the supernova data, then you could come up with a lot of alternative explanations. It turns out that none of those explanations work in light of other data. If Perlmutter had published and then a year later it had turned out that supernovae Ia were not standard candles and there is large-scale evolution of SN Ia, or if it had turned out that dark flows were much stronger than expected, it would have been a "merely interesting" paper but not worth a Nobel.

Now as it happens subsequent experiments have tightened error bars, and WMAP shows consistent CMB evidence.

Also note that Perlmutter went in with a hundred assumptions. He was trying to measure the deceleration parameter, which was expected to be positive.

If it does, show me the proof (your pedigree and lunch room conversations will not suffice).

I'm an astrophysicist, not a mathematician. I don't deal with proof. I can't "prove" that the universe is accelerating any more than I can "prove" that the Earth is round.

No single scientific observation is "proof" of anything. You have to view observations as fitting within a model, and right now the idea that the universe is accelerating is the best model that people have come up with, and the fact that people have tried really hard and failed to come up with alternative models should tell us something.

I can't *prove* that there isn't a simple explanation that explains everything. I can show that people have tried and failed to come up with alternative explanations, and the most obvious model right now is pretty darn interesting.

I can't "prove" something is true. I can "prove" something is false, and Perlmutter kills CDM with Lambda=0, which was the standard cosmological model in 1995. The simplest theoretical patch is to assume that Lambda > 0.

Likewise, Perlmutter et al should get the prize for what they actually did, i.e., obtain distance moduli for large redshift supernovae, not for what they inferred from that data, i.e., accelerated expansion.

Very strongly disagree. If Perlmutter et al. just got distance moduli for large redshift supernovae and got what people expected, that wouldn't be worth a Nobel. They got the Nobel because they did the observations and got results that *no one* expected. It's like Columbus discovering America. If everything had gone according to plan and Columbus had really ended up in Japan, that would be "merely interesting." But he went in looking for Japan and ended up hitting something unexpected. Same with Perlmutter. We were expecting q=1. We got q=-0.6.

Columbus gets credit for "discovering" America even though he went to his grave thinking that he was in Japan. Hubble gets credit for figuring out that galaxies are moving away from each other even though his calibrations turned out to be wrong.

The revolutionary part was that Perlmutter came up with numbers that cannot be explained without really weird stuff happening. Even if it turns out that the universe is not accelerating, the way that we thought the universe worked in 1997 just will not work with his observations.
 
Last edited:
  • #36
twofish-quant said:
That just doesn't make any sense to me. The supernova data is just one piece of the puzzle and has to be understood in the context of other data. In order to do anything with the supernova data, you have to make hundreds of assumptions, and there is no way that you can justify those assumptions without reference to other data. If you had only the supernova data, then you could come up with a lot of alternative explanations.

Thank you for conceding my point.

twofish-quant said:
It turns out that none of those explanations work in light of other data. If Perlmutter had published and then a year later it had turned out that supernovae Ia were not standard candles and there is large-scale evolution of SN Ia, or if it had turned out that dark flows were much stronger than expected, it would have been a "merely interesting" paper but not worth a Nobel.

Now as it happens subsequent experiments have tightened error bars, and WMAP shows consistent CMB evidence.

Also note that Perlmutter went in with a hundred assumptions. He was trying to measure the deceleration parameter, which was expected to be positive.

As long as you confine yourself to unmodified GR, I would agree that the alternative explanations do not suffice in light of WMAP.

twofish-quant said:
I'm an astrophysicist, not a mathematician. I don't deal with proof. I can't "prove" that the universe is accelerating any more than I can "prove" that the Earth is round.

No single scientific observation is "proof" of anything. You have to view observations as fitting within a model, and right now the idea that the universe is accelerating is the best model that people have come up with, and the fact that people have tried really hard and failed to come up with alternative models should tell us something.

I can't *prove* that there isn't a simple explanation that explains everything. I can show that people have tried and failed to come up with alternative explanations, and the most obvious model right now is pretty darn interesting.

I can't "prove" something is true. I can "prove" something is false, and Perlmutter kills CDM with Lambda=0, which was the standard cosmological model in 1995. The simplest theoretical patch is to assume that Lambda > 0.

Well said.

twofish-quant said:
Very strongly disagree. If Perlmutter et al. just got distance moduli for large redshift supernovae and got what people expected, that wouldn't be worth a Nobel. They got the Nobel because they did the observations and got results that *no one* expected.

Of course it's a value judgment, but I think getting that data is extremely valuable even if it had shown what we expected.


twofish-quant said:
The revolutionary part was that Perlmutter came up with numbers that cannot be explained without really weird stuff happening. Even if it turns out that the universe is not accelerating, the way that we thought the universe worked in 1997 just will not work with his observations.

Agreed, so again, give him the Nobel for what he did, not some particular inference.
 
  • #37
You're arguing semantics or something, RUTA. The observations by Perlmutter and others lead directly to the conclusion of an accelerating expansion. Quoted from nobelprize.org: "The Nobel Prize in Physics 2011 was divided, one half awarded to Saul Perlmutter, the other half jointly to Brian P. Schmidt and Adam G. Riess for the discovery of the accelerating expansion of the Universe through observations of distant supernovae".

I'm going to side with the Nobel Prize committee on this one.
 
  • #38
RUTA said:
As long as you confine yourself to unmodified GR, I would agree that the alternative explanations do not suffice in light of WMAP.

I don't know of any modifications of GR that will let you avoid an accelerating universe (Wiltshire's model claims not to modify GR). The modified theories of gravity that I'm aware of, namely the f(R) models, attempt to explain the acceleration without invoking dark energy, but the universe is still accelerating. The problem is that the sign is wrong. Any modified gravity model would be expected to be close to GR at short distances and different at far distances. However, the observations show maximum acceleration at short distances and lower acceleration at long distances. So if you are trying to show that the acceleration isn't real, then you have to modify gravity the most at short distances, which then runs into problems in that we have a lot of data that suggests that GR works at short distances.

If you want to modify GR to explain how the acceleration came about, that's not hard, and there is an entire industry devoted to that, and hundreds of papers on that topic. If you want to modify GR to argue that the acceleration doesn't exist, that's really, really, really hard, and I don't know of anyone that has been able to do that.

The other thing is that the supernova data has changed the definition of GR. In 1995, if you asked people to write down the equations of GR, they would have written it with the cosmological constant = 0. Einstein called the cosmological constant his biggest mistake, and for sixty some years people agreed with that. Today, "unmodified GR" has a non-zero cosmological constant.

Something that is very important about the data is that the signal is huge. If he had found that q=-0.01 or even q=-0.1, then you could come up without much difficulty with reasons why the universe may not be accelerating, and that this whole thing is just some misinterpretation of the data. As it is, q=-0.6, which is way, way bigger than anything that people have been able to come up with.
 
  • #39
twofish-quant said:
I don't know of any modifications of GR that will let you avoid an accelerating universe (Wiltshire's model claims not to modify GR). The modified theories of gravity that I'm aware of, namely the f(R) models, attempt to explain the acceleration without invoking dark energy, but the universe is still accelerating. The problem is that the sign is wrong. Any modified gravity model would be expected to be close to GR at short distances and different at far distances. However, the observations show maximum acceleration at short distances and lower acceleration at long distances. So if you are trying to show that the acceleration isn't real, then you have to modify gravity the most at short distances, which then runs into problems in that we have a lot of data that suggests that GR works at short distances.

If you want to modify GR to explain how the acceleration came about, that's not hard, and there is an entire industry devoted to that, and hundreds of papers on that topic. If you want to modify GR to argue that the acceleration doesn't exist, that's really, really, really hard, and I don't know of anyone that has been able to do that.

The other thing is that the supernova data has changed the definition of GR. In 1995, if you asked people to write down the equations of GR, they would have written it with the cosmological constant = 0. Einstein called the cosmological constant his biggest mistake, and for sixty some years people agreed with that. Today, "unmodified GR" has a non-zero cosmological constant.

Something that is very important about the data is that the signal is huge. If he had found that q=-0.01 or even q=-0.1, then you could come up without much difficulty with reasons why the universe may not be accelerating, and that this whole thing is just some misinterpretation of the data. As it is, q=-0.6, which is way, way bigger than anything that people have been able to come up with.

I agree. My point was simply that if there is any hope of doing away with accelerated expansion, I think it's safe to say at this point it would have to deviate from GR cosmology.
 
  • #40
Drakkith said:
You're arguing semantics or something Ruta. The observations by perlmutter & others leads directly to the conclusion of an accelerating expansion. Quoted from nobelprize.org: "The Nobel Prize in Physics 2011 was divided, one half awarded to Saul Perlmutter, the other half jointly to Brian P. Schmidt and Adam G. Riess for the discovery of the accelerating expansion of the Universe through observations of distant supernovae".

I'm going to side with the Nobel Prize commitee on this one.

Once you have the understanding explicated by twofish, you're good to go, i.e., you know what was actually done and whence the conclusion of accelerated expansion. However, if you're a layperson who hasn't been exposed to a discussion as in this thread, you might well conclude the supernova data constituted a direct measurement of acceleration, e.g., velocity as a function of time as in drag racing. Now you realize that the supernova data in and of itself does not necessitate accelerated expansion, but must be combined with the assumption of GR cosmology and other data. The conclusion of accelerated expansion follows from a robust set of assumptions and data, but the Nobel recipients were not responsible for that entire set, only the supernova data. As evidence of the potential for confusion caused by statements such as found in the Nobel prize citation, just look at how this thread started.
 
  • #41
RUTA said:
I agree. My point was simply that if there is any hope of doing away with accelerated expansion, I think it's safe to say at this point it would have to deviate from GR cosmology.

Much, much more serious than that. The deceleration parameter q doesn't assume anything about the gravity law. The only assumption is that the universe is isotropic and homogeneous at large scales. Any theory of gravity that is isotropic and homogeneous at large scales (GR or no) is not going to make a difference.

In fact, at small scales the universe isn't isotropic and homogeneous, which is why the first calculation people did was to see what impact anisotropy and inhomogeneity would have. Right now the only things that would kill the results are essentially data-related issues (i.e., we really aren't measuring what we think we are, or massive underestimates of the anisotropy and inhomogeneity in the near region).
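To make the kinematic point concrete: q enters the distance-redshift relation through a Taylor expansion that assumes only large-scale homogeneity and isotropy, not any particular gravity law. A rough sketch, with an illustrative H0 value and the expansion truncated at second order in z:

```python
import math

C = 299792.458  # speed of light, km/s
H0 = 70.0       # Hubble constant, km/s/Mpc (illustrative value)

def d_L_kinematic(z, q0):
    """Luminosity distance (Mpc) from the second-order kinematic expansion.

    d_L ~ (c z / H0) * (1 + (1 - q0) z / 2). This follows from large-scale
    homogeneity and isotropy alone; no specific gravity law is assumed.
    """
    return (C * z / H0) * (1 + 0.5 * (1 - q0) * z)

def distance_modulus(d_L_mpc):
    # mu = 5 log10(d_L / 10 pc), with d_L given in Mpc
    return 5 * math.log10(d_L_mpc * 1e6 / 10)

z = 0.5
mu_decel = distance_modulus(d_L_kinematic(z, q0=+0.5))  # decelerating expectation
mu_accel = distance_modulus(d_L_kinematic(z, q0=-0.6))  # roughly the observed value

# A negative q0 puts supernovae farther away at the same z, so they look
# dimmer -- the roughly half-magnitude surprise in the 1998 data.
print(round(mu_accel - mu_decel, 2))
```

Note that H0 cancels in the magnitude difference, so the size of the dimming at a given z depends only on q0 in this sketch.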
 
  • #42
RUTA said:
However, if you're a layperson who hasn't been exposed to a discussion as in this thread, you might well conclude the supernova data constituted a direct measurement of acceleration, e.g., velocity as a function of time as in drag racing.

Well it is.

The measurements of supernova acceleration are no less "direct" than any other scientific measurement. If you try to measure the speed of a speeding car, you are bouncing radar waves off the car or doing some other processing.

Yes, you can misinterpret the results, but this is no worse than any other measurement, and the reason I'm hammering on this issue is that the supernova measurements *AREN'T* any less direct than a traffic cop using a radar device to track speeders.

Now you realize that the supernova data in and of itself does not necessitate accelerated expansion, but must be combined with the assumption of GR cosmology and other data.

Except you *DON'T* have to assume GR cosmology. You *do* have to make some assumptions, but those assumptions are no worse than those you have to make if you try to measure velocity with a radar gun. If you try to measure the speed of a drag racer with a radar gun, you have to assume that the speed of light is a particular value, certain things about Doppler shift, etc., etc.

The *ONLY* reason that people started questioning the assumptions of the measurements to the extent that they did was that the results were so weird. If you clock a drag racer going 0.1c with your radar gun, your first reaction is going to be that your radar gun is broken.

The conclusion of accelerated expansion follows from a robust set of assumptions and data, but the Nobel recipients were not responsible for that entire set, only the supernova data.

So what?

And they weren't responsible for all of the supernova data. In fact, both data sets were gathered by teams of dozens of people, and practically the entire astronomy community was involved in checking and cross-checking the results. The reason I got to listen to the conversations is that my adviser happens to be one of the world's foremost experts on type Ia supernovae, and the other person in the room was an accretion disk expert who dabbles in cosmology, and we were trying to figure out whether or not Ia evolution or inhomogeneity would kill the results.

The Nobel prize does contribute to the misconception of the lone scientific genius, but that's another issue. I suspect one reason I'm getting emotional about this particular issue is that I was involved in figuring out what was going on, and in some sense when they gave the Nobel to the supernova researchers, they were also giving it to me and several thousand other people that were involved at putting together the results.

As evidence of the potential for confusion caused by statements such as found in the Nobel prize citation, just look at how this thread started.

The thread started when you had someone that was simply unaware of what data existed. Also one reason I think that the Nobel prize wording is correct is that sometimes there is too much skepticism. The supernova results are extremely solid and hedging on the wording suggests that there is reasonable room for scholarly debate as to whether or not they are measuring acceleration when in fact there isn't.

This matters less for supernova, but it matters a lot for evolution and global warming. There is no reasonable scholarly debate as to whether or not evolution happens or that global warming is happening because of CO2 input. A lot of debate on the details, and one nasty political tactic is to take debate on the details as people disagreeing with the main premise.

In any case, you aren't going to get "less confusion," because no matter how you teach your students, once they leave your lectures and attend mine, they are going to hear me very strongly contradicting your choice of wording.
 
  • #43
I think you are misinterpreting my position. You asked for *proof* that the universe is accelerating, which I interpret as mathematical proof, and you can't prove physical results mathematically; you should instead be asking for levels of evidence. Let's use legal terminology.

The Perlmutter results and everything known as of 1998 would, in my opinion, establish that the universe is accelerating by the preponderance of the evidence. That means that if you had a civil court case in which a billion dollars would change hands if the universe is accelerating, then I'd vote guilty, but I would not vote to sentence a person to death based on the evidence in 1998, since that requires "proof beyond a reasonable doubt." Today, I think that the evidence does establish legal proof beyond a reasonable doubt, so if I were on a jury in which someone was subject to the death penalty based on evidence of the accelerating universe, I'd vote to convict.

Now, in order to get from raw data to inference, you have to run through several dozen steps, and there is the possibility of human error and misinterpretation at each step. One important thing about the original results was that they were released by two separate teams. The results are so weird that if only one team had released them, the reaction of most people would be that they just messed up somewhere. Maybe there was interference from a television set. Maybe someone didn't take into account the motion of the Earth. Maybe someone cleaned the telescope without the team knowing. (These aren't hypotheticals; they've actually happened.) Once you have two separate teams with different telescopes, different computers, different algorithms, and different people coming up with the same answer, a lot of the alternative explanations disappear. It's not a computer bug.

Now suppose we have a magic oracle tell us that they got everything up to the distance modulus versus redshift right, and let's suppose that we have this magic oracle tell us that we are in fact measuring lookback time versus velocity. At that point, by definition, *something must be accelerating*. If the redshifts are actually due to velocity, and the distance modulus translates to lookback time, we see the velocities decrease over time, and that's the definition of acceleration. The question is "what is accelerating?" It could be the Earth or it could be the galaxies. It could also be the Hubble flow or it could be something else.

Based on the best numbers in 1998, non-Hubble acceleration of the Earth can only account for about a third of the signal. Now maybe those numbers are wrong, but people have checked since then, and the limits haven't changed. There are also some other possibilities. The big one in my mind is that since we do not know exactly what causes SNIa, there is the possibility that SNIa might change radically over time. However, for this to influence the results, you would have to have SNIa evolution that has never been observed. Still, it's a possible hole, and that hole has been filled, as we now have distance measures that have nothing to do with SNIa which show the same results.

Note here that I've not mentioned GR or dark matter or anything "esoteric." That's because none of that influences the results. The point that I'm making is that the SN results are as "direct" a measurement as you can make, and there is no more room for model-related skepticism than there is for mistrusting your GPS.
 
  • #44
Again, I'm not questioning the data, but to say that it constitutes a direct measurement of acceleration in the same sense as measuring objects moving on the street outside my house is wrong. There are assumptions one has to make with cosmology that one does not have to make with observations here on Earth. For example, you say cosmological z produces a velocity, but you can only make that connection in the context of a cosmology model. I don't have nearly that degree of uncertainty in rendering a Doppler z for race cars. In fact, I can use people making spatiotemporally local measurements on the car to get its velocity. A cosmology model is also required to turn mu into distance. I don't have nearly that degree of uncertainty in knowing how far the race car is going down the road; I can measure that distance directly by actually traversing the spatial region personally.

I'm sure you feel the assumptions you are making are reasonable. Brahe and the proponents of geocentrism also thought their assumptions were reasonable. When it comes to announcements concerning cosmology, I think we had better stick as closely as possible to what we actually observe and distinguish clearly between those observations and model-dependent inferences.
 
  • #45
RUTA said:
Again, I'm not questioning the data

You should.

but to say that it constitutes a direct measurement of acceleration in the same sense as in measuring objects moving on the street outside my house is wrong.

One important point here is that the quantity people were trying to measure was q, which is different from measuring the acceleration of an individual galaxy. Even if you had an oracle that gave you the acceleration of each galaxy, you'd still have to strip out the peculiar acceleration of each galaxy to get to the Hubble flow.

What people were trying to find was

q = (mean speed of galaxies at time 1 - mean speed of galaxies at time 2) / (time 1 - time 2), averaged over the galaxies observed at each time. That's different from the rate of change of a specific galaxy's speed, which you can't get. And even if you could get it, you'd still have to do statistics to reach the number you are really interested in.
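The averaged finite-difference estimate described above can be sketched in a few lines. All the numbers here are made up purely for illustration; real analyses average over many supernovae per redshift bin:

```python
# Sketch of the averaged finite-difference estimate described above.
# All numbers are hypothetical, chosen only for illustration.

def mean_accel(bin1, bin2):
    """Average rate of change of recession speed between two galaxy
    samples, each a list of (lookback_time_gyr, speed_km_s) pairs."""
    t1 = sum(t for t, v in bin1) / len(bin1)
    v1 = sum(v for t, v in bin1) / len(bin1)
    t2 = sum(t for t, v in bin2) / len(bin2)
    v2 = sum(v for t, v in bin2) / len(bin2)
    # Larger lookback time means earlier cosmic time, so the cosmic
    # time elapsed from bin2 (far) to bin1 (near) is t2 - t1.
    return (v1 - v2) / (t2 - t1)  # km/s per Gyr

near = [(1.0, 21000.0), (1.2, 25500.0)]   # hypothetical nearby sample
far = [(6.0, 95000.0), (6.5, 104000.0)]   # hypothetical distant sample
print(mean_accel(near, far))
```

The point of binning and averaging is exactly the one made above: no individual galaxy is followed over time; the statistic is built across samples.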

There are assumptions one has to make with cosmology that one does not have to make with observations here on Earth. For example, you say cosmological z produces a velocity, but you can only make that connection in the context of a cosmology model.

This is wrong. It's just Doppler shift. There are some corrections that are model dependent, but these aren't large, and can be ignored if you are getting rough numbers. You *do* have to make some assumptions (i.e. the shift comes from velocity and not from gravity), but those assumptions can be cross checked, and are independent of the cosmological model.

I don't have nearly that degree of uncertainty with rendering a Doppler z for race cars. In fact, I can use people making spatiotemporally local measurements on the car to get velocity.

You really should be more skeptical of local measurements.

All your measurements are still indirect in the sense that you are making assumptions in order to get the numbers. If you are looking at the car in front of you, you are still interacting with it through various forces and fields. And it turns out that you can get things wrong; I've had to deal with broken speedometers.

A cosmology model is also required to turn mu into distance.

A model is necessary to turn mu into distance, but it's not a cosmological model; the assumptions you have to make are not cosmological ones.

I'm sure you feel the assumptions you are making are reasonable.

That's funny, because I'm not. Don't assume. What you do is show that number x comes out of calculations a, b, c, and d. You then go back to each item and see whether a, b, c, and d can be justified with observational data. You also ask yourself: suppose assumption a is wrong, what would the impact be?

The thing about the accelerating universe is that the signal is so large that killing some major assumption doesn't change anything. OK, let's assume that GR is wrong. Does that change the conclusion? No, it doesn't. Let's assume that we add a large amount of dark flow. Does that change the conclusion? No, it doesn't.

Brahe and proponents of geocentrism also thought their assumptions were reasonable.

And it turns out that the Tychonic system works just as well as the Copernican system, since they are mathematically identical.

In any case, I'm jumping up and down, because this is precisely what people are *NOT* doing. You go through the results and see how the conclusions are impacted by the assumptions. You'll find that a lot of the assumptions don't make a difference. If it turns out that the big bang never happened and we are in a steady state universe, then it doesn't impact the results.

You then have a list of things that might impact the results, and then you go back and double check those results.

When it comes to announcements concerning cosmology, I think we better stick as closely as possible to what we actually observe and distinguish clearly between those observations and model-dependent inferences.

You are observing raw CCD measurements. The big question marks are in deriving Z and distance.

The point I'm making is that the results are nowhere near as model dependent as you seem to think they are. You don't have to assume GR. You don't have to assume any particular cosmological model. You *do* have to make some assumptions, and the original papers did a good job of listing all of them. The thing about Perlmutter is that the observation is so "in your face" that when you ask what happens if you make different assumptions, the signal just does not disappear.

Also, as a matter of observation, you *can't* distinguish clearly between observations and inferences. You aren't reading a z-meter; you have CCD readings. Getting from those to z requires two or three dozen steps, each of which contains assumptions. Some of those assumptions are "obvious" (i.e., you have to subtract out the motion of the Earth), but it turns out that you have to keep a list, because they could give you spurious results if you do them wrong.

Look, the problem I have is when skepticism, which is healthy, becomes "stick your head in the sand and ignore reality," which is very unhealthy. For supernova data this doesn't matter that much, but you see the same thing with climate models and evolution, where it matters a lot. The point I'm making is that the supernova results *DO NOT DEPEND ON THE SPECIFIC COSMOLOGICAL MODEL*. The signal is just too strong.
 
Last edited:
  • #46
The reason I'm hitting on this issue is that people tend to think of cosmology as a form of philosophy which is bad because cosmology is an *observational* science. We have a ton of data that is coming in from all sorts of instruments, and once you have a ton of data, then the room for "random speculation" goes down.

Once you get to the point where you are measuring distance modulus and redshifts, and once you have convinced yourself that the redshift is in fact a velocity (which has nothing to do with your cosmological model), then the only assumption you need to make to get to the accelerating universe is homogeneity and isotropy. The way you deal with this is the same way you deal with the round Earth. You can do a rough calculation assuming the Earth is perfectly spherical, and then you figure out how much the known deviations from a perfectly spherical shape change your result.

You do the same for cosmology. You do a calculation assuming that the universe is perfectly homogeneous and isotropic, and you get a number. You then put in the known deviations from perfect smoothness; the result you get is that those deviations can shift q by 0.3 at most, which still gets you an accelerating universe. You calculate how large a deviation from smoothness would kill the result, and it turns out that deviations that large are excluded.

Note that all of this is based on *data*. The reason that I'm hammering on this issue is that you don't want to give the public the idea that things are more uncertain than they are. The types of uncertainties that you get from cosmological measurements are the same types of uncertainties you get measuring anything else, and they are no more "indirect" than measuring speed with a Doppler radar. You verify correctness the same way we do with any Earth based measurement which is to have multiple independent methods and see if they give you the same answer, and we have the data to do this.
 
  • #47
To fit mu vs z (data) using GR cosmology, you find the proper distance Dp as a function of redshift z using your choice of a particular model, which gives you the scale factor as a function of time, a(t), and 1/(1+z) = a(te), where te is the time of emission (this assumes the current a(to) = 1, which you're free to choose). This form of redshift is independent of your choice of GR cosmology (though not independent of your choice of cosmology model in general), and it is a redshift (not a blueshift) if the scale of the universe is larger now than at emission, regardless of the relative velocities of emitter and receiver at emission or reception, i.e., this is NOT a Doppler redshift. Next you have to convert Dp to luminosity distance DL, which is a model-dependent relationship; then DL gives you mu. As you can see, this fit is highly dependent on your choice of cosmology model. If it weren't, how could the data discriminate between the models?
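As a concrete illustration of that pipeline, here is a minimal sketch under a flat FLRW model. The H0 and density values are illustrative placeholders, not the actual fitted parameters:

```python
# Minimal sketch of the mu(z) pipeline under a flat FLRW model with
# matter fraction om and cosmological-constant fraction ol (om + ol = 1).
# H0 and the density values are illustrative, not fitted parameters.
from math import log10, sqrt

C = 299792.458  # speed of light, km/s
H0 = 70.0       # Hubble constant, km/s/Mpc (assumed)

def dl_mpc(z, om, ol, n=2000):
    """Luminosity distance in Mpc for a flat universe:
    D_L = (1 + z) * D_C, with D_C = c * integral_0^z dz' / H(z')."""
    dz = z / n
    integral = sum(dz / sqrt(om * (1 + (i + 0.5) * dz) ** 3 + ol)
                   for i in range(n))  # midpoint rule
    return (1 + z) * (C / H0) * integral

def mu(z, om, ol):
    """Distance modulus: mu = 5 * log10(D_L / 10 pc)."""
    return 5 * log10(dl_mpc(z, om, ol) * 1e6 / 10)

# An accelerating model (om=0.3, ol=0.7) predicts fainter (larger mu)
# high-z supernovae than a decelerating matter-only model (om=1, ol=0):
print(mu(0.5, 0.3, 0.7) - mu(0.5, 1.0, 0.0))
```

Note how the model enters exactly where RUTA says it does: in H(z) inside the integral and in the flat-space relation D_L = (1+z) D_C.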

And, think about it: the model that best fits this data tells us the universe was first decelerating and then changed to acceleration. If you believe that's true and you also (erroneously) believe the data gives you the time rate of change of velocity directly and independently of model, then you would see acceleration in nearby (small-z) galaxies and deceleration in the most distant (large-z) galaxies. But if that were true, we would've known about the accelerated expansion all along, and the large z would only be needed to find the turning point between acceleration and deceleration. That's not what happened: decelerating models fit the small z fine, and no one suspected accelerated expansion. It's the large z that keeps us from fitting a decelerating model.

I'll stop here and let you respond. I'm on the road and posting is non-trivial so I apologize for the terse nature of this response.
 
  • #48
RUTA said:
Next you have to convert Dp to luminosity distance (DL), which is a model-dep relationship, then DL gives you mu. As you can see, this fit is highly dependent on your choice of cosmology model.

But my point is that the model dependence does not affect the conclusion. You can assume that GR is wrong and then create an alternative theory of gravity by changing the parameters in the equations. GR may be wrong but we have enough observations so that we know that GR is a good enough approximation to within some error.

Once you do that, you quickly figure out that the signal is strong enough that if you put in any plausible non-GR model (i.e., one that isn't excluded by other observations), you still end up with acceleration. The only way out is to assume that there is some effect which is not taken into account by your parameterization. If you just use a single scale factor, then you aren't taking into account dark flows and voids. Once you take those into account, you still can't get rid of the signal without some very tricky arguments. At that point you try to think of anything else that you may have missed, and after ten years of doing that, you start to think that you didn't miss anything.

What you can do is make a list of all the assumptions that go into the conclusion. We don't know if GR is correct, but we know that the "real model of gravity" looks like GR within certain limits. You then vary the gravity model within the known observational limits, and it turns out that it doesn't make that much difference. There are other parts of the problem that make a lot more difference than the gravity model (like the assumption that SN Ia are standard candles).

If your point is that the supernova measurements cannot be interpreted without reference to other data, sure. But that's true with *any* measurement. I have a GPS device. That device gives me my position, but it turns out that there are as many if not more assumptions in the GPS result than in the supernova measurement. If I assume GR is wrong and Newtonian gravity is correct, it turns out that this doesn't change my conclusions w.r.t. supernova. However it does change my GPS results.

if the scale of the universe is larger now than at emission regardless of the relative velocites of emitter and receiver at emission or reception, i.e., this is NOT a Doppler redshift.

Then we get into another semantic argument as to "what is a Doppler redshift?" and then "what is an acceleration?" You can do a lot of cosmology with Newtonian gravity. It's wrong, but it's more intuitive. There is a correspondence between the term "acceleration" in the Newtonian picture and in the GR picture. Similarly, there is a correspondence between the concept of "Doppler shift" in Newtonian cosmology and in GR.

The reason these correspondences are important is that they let you take observations that are reported in "Newtonian" language and figure out what they mean in GR language. Sometimes semantics are important. If you do precision measurements, then you have to define very clearly what you mean by "distance," "brightness," and "acceleration."

But in this situation it doesn't matter. You use any theory of gravity that isn't excluded by observation and any definition of acceleration that you want, and you still end up with a positive result.

And, think about it, the model that best fits this data tells us the universe was first decelerating and then changed to acceleration. If you believe that's true

That doesn't make sense to me. In order to get scientific measurements, you have to make assumptions, and it's important to know what assumptions you are making, and to minimize those assumptions.

In order to do the conversion from brightness distance to some other distance, you have to make assumptions about the theory of gravity from distance 0 to the location that you are looking at. You *don't* have to make any assumptions about anything more distant.

Now it turns out that if you assume you have a cosmological constant, you get a nice fit, but it's a really, really important point that this assumption was not used to get to the conclusion that the universe is accelerating. This matters because, in order to interpret the supernova results, you have observations that limit what you can do to gravity. Now, if you go into the early universe, you can (and people do) make up all sorts of weird gravity models.

It's important to keep things straight here, to make sure that you aren't doing any sort of circular reasoning. If it turns out that assuming GR is correct was critical to getting the conclusions we are getting, then that is a problem because things get circular.

You also (erroneously) believe the data gives you time rate of change of velocity directly and independent of model

The observers were measuring q. You get q by performing mathematical operations on the data. Now, what q means is something else. Within the limits of the models of the universe that are not excluded by other observations, the observed q = -0.6 means that you have an accelerating universe.
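For reference, the standard textbook relation for today's deceleration parameter in a flat FLRW universe, q0 = Omega_m/2 - Omega_Lambda, shows how concordance-style density values land near the quoted figure. The numbers below are illustrative, not the actual 1998 fit:

```python
# Standard textbook relation for today's deceleration parameter in a
# flat FLRW universe: q0 = om/2 - ol. Density values are illustrative
# concordance-style numbers, not the actual 1998 fit.
def q0(om, ol):
    return om / 2 - ol

print(q0(0.3, 0.7))  # about -0.55: acceleration, near the quoted q
print(q0(1.0, 0.0))  # 0.5: the decelerating matter-only value
```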

What I'm asserting is that for these particular results, gravity-model dependence doesn't introduce enough uncertainty to invalidate the conclusions. The model *does* influence the numbers, but whether or not that matters is another issue. For the supernova situation, the model dependencies aren't enough to allow for non-acceleration.

What I'm asserting is that if you plot the results on a graph and then include all possible values for the acceleration/deceleration of the universe, then anything with non-acceleration is excluded.

But if that was true, we would've known about the accelerated expansion all along, the large z would only be used to find the turning point between acceleration and deceleration.

No we wouldn't because for anything out past a certain distance we don't have any good independent means of measuring distance other than redshift. For anything over z=1, all we have is z, and there isn't any independent way of turning that into a distance. We can make some guesses based on things that have nothing to do with direct measurements, but unlike the supernova measurements those are just guesses that could very easily be wrong.

Also this matters because the assertion that the universe is decelerating at early times and that this deceleration turned into an acceleration *is* heavily model dependent. If we've gotten our gravity models wrong, then most of the evidence that indicates that the universe is decelerating at early times just evaporates. Now people are extending the supernova data to regions where we should see the universe decelerating (interesting things with GRB's).

I suppose that's one more reason to make the distinction between what we "know" and what we are guessing. Up to z=1, we know. We might be wrong but we know. For z=7, we are guessing.

This is also where the gravity models come in. For z=1, you look at the list of gravity models that are not excluded by observation, and the impact of the gravity model turns out not to be important. There is an impact, but it doesn't kill your conclusions. At z=5, then it does make a huge difference.

But that's not what happened, decelerating models fit the small z fine and no one suspected accelerated expansion. It's the large z that keeps us from fitting a decelerating model.

Decelerating models fit small z (z<0.1) fine. Accelerating models also fit small z (z<0.1) fine. The problem is that before we had the supernovae, we had no way of converting between z and "distance." We do now.

I'll stop here and let you respond. I'm on the road and posting is non-trivial so I apologize for the terse nature of this response.

The problem is the dividing line between what we "know" and what we are guessing. If we were talking about using the WMAP results to infer the early expansion of the universe, then we would be guessing. In order to go from WMAP to an expansion rate, we have to make a lot of assumptions, and those assumptions are not constrained by data. We get nice fits if we assume GR, but GR could be wrong, and at z=10 the possibility that we've gotten gravity wrong is large enough that it could totally invalidate our conclusions.

I'm trying to argue that this is not the situation with supernova data.
 
  • #49
One other thing that may be a little confusing is that both Perlmutter and Riess reported their results in the language of GR. That doesn't mean that GR is essential for their results to be correct, any more than the fact that they used Earth-centered coordinates to report the positions of their objects means that they think the Earth is at the center of the universe. It so happens that those are the most convenient coordinates to report results in. We don't know if GR is correct. We do know that the "real theory of gravity" looks a lot like GR at short distances.

So the way I'd read the Riess and Perlmutter results is as "here are the numbers you get if GR were correct." Now, if you have another model of gravity, you can "translate" those numbers into an "effective omega." At that point the theorists go crazy and see if they can come up with a model of gravity that matches those numbers. You run your new model of gravity and check that "omega" in your model means the same thing as "effective omega" in GR, and that makes it easy to describe 1) how different your model is from GR and 2) how well your results match the SN results.

What happens when you try this exercise is that it's hard enough to match the supernova data with an alternative theory of gravity that gives different results that most theorists have given up trying. You can come up with lots of theories of alternative gravity, but every one I know of ends up concluding that the universe is accelerating at z < 1.0, and the name of the game right now is to come up with models that match GR at "short" distances, where we have a ton of data, and which might be very different at long distances, where you can make up anything you want because we don't have strong data. Everything that's been said here about the supernova data I would agree with if we were talking about WMAP, because there we don't have "direct" measurements of expansion, and everything is very heavily model dependent.

But my point is that even though they use a particular model to describe their results, it turns out that the results are not sensitively dependent on that model. They use geocentric coordinates to identify the objects they are observing, and they assume Newtonian gravity to describe brightnesses, but those are merely convenient coordinate systems, and you can see what happens if you use a different model and "translate" the results. It turns out that it doesn't make a difference.

The reason I'm jumping up and down is that it turns out that the supernova results *aren't* sensitive to gravity models. Which is different from the situation once you go outside of the SN results.
 
  • #50
What I mean by "expansion rate" is given by the scale factor in GR, i.e., a(t). This is responsible for the deceleration parameter q and the Hubble "constant" H in GR cosmology. If, as is done with SN, one produces luminosity distance as a function of redshift and I want to know whether or not that indicates accelerated expansion, I have to find the GR model that best fits the data and the a(t) for that model then tells me whether the universe is accelerating or decelerating. A GR model that produces a good fit to the SN data is the flat, matter-dominated model with a cosmological constant, and a(t) in this model says the universe was originally decelerating and is now accelerating.
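That decelerate-then-accelerate behavior of a(t) can be checked directly: in the flat matter-plus-Lambda model, the acceleration equation gives a_ddot = 0 when Omega_m (1+z)^3 = 2 Omega_Lambda, which fixes the transition redshift. The density values below are illustrative:

```python
# In the flat matter + Lambda model, the acceleration equation gives
# a_ddot = 0 when om * (1 + z)**3 = 2 * ol, so expansion switches from
# deceleration to acceleration at z_acc = (2*ol/om)**(1/3) - 1.
# Density values are illustrative concordance-style numbers.
def z_accel(om, ol):
    return (2 * ol / om) ** (1.0 / 3.0) - 1.0

print(z_accel(0.3, 0.7))  # roughly z ~ 0.7: decelerating before, accelerating after
```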

You're claiming (?) that I can skip the choice of cosmology model and render a definition of expansion rate in terms of ... luminosity distance and redshift directly? Ok, suppose you do that and claim the universe is undergoing accelerated expansion per your definition thereof. I'm willing to grant you that and concede you have a direct measurement of acceleration by definition. I'm not willing to grant you that it's a model-independent result. You've tacitly chosen a model via your particular definition of "acceleration" that involves luminosity distance and redshift.

Because, again, what GR textbooks define as q involves a(t), so someone could convert your luminosity distances and redshifts to proper distances versus cosmological redshifts in their cosmology model (as in GR cosmology) and obtain a resulting best fit model for which a(t) says the universe isn't accelerating. Thus, I have one model telling me the universe is accelerating and one that says it's decelerating, i.e., the claim is dependent on your choice of cosmology model.

Note by the way that, given your direct measurement of accelerated expansion, this ambiguity doesn't merely arise in some "crazy" or "inconceivable" set of circumstances. If we only had small z data and you employed your kinematics, you would conclude the universe is accelerating. However, the flat, matter-dominated GR model without a cosmological constant is a decelerating model that fits luminosity distance vs z data nicely for small z. Therefore, we would need the large z data to discriminate between the opposing kinematical conclusions.
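The small-z degeneracy described above can be illustrated with the standard low-redshift kinematic expansion D_L ~ (c/H0)[z + (1 - q0) z^2 / 2], in which the model enters only at second order through q0. The H0 value is an illustrative placeholder, and the expansion is only rough by z = 0.5:

```python
# Low-redshift kinematic expansion of luminosity distance:
# D_L ~ (c/H0) * (z + (1 - q0) * z**2 / 2); the model enters only at
# second order through q0. H0 is an assumed placeholder value, and the
# expansion is only a rough approximation by z = 0.5.
from math import log10

C, H0 = 299792.458, 70.0  # km/s and km/s/Mpc (H0 assumed)

def mu_lowz(z, q0):
    dl = (C / H0) * (z + (1 - q0) * z * z / 2)  # Mpc
    return 5 * log10(dl * 1e6 / 10)

for z in (0.05, 0.5):
    gap = mu_lowz(z, -0.55) - mu_lowz(z, 0.5)  # accelerating vs decelerating
    print(z, round(gap, 3))  # gap is tiny at small z, sizable at larger z
```

At z = 0.05 the two models differ by only a few hundredths of a magnitude, well inside typical photometric scatter, which is the point: only the high-z sample discriminates.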

Therefore, I disagree with your claim that your definition of acceleration puts your SN kinematical results on par with terrestrial physics. I do not run into an ambiguity with the definition of acceleration in intro physics.
 
