hedgehug
@Ibix Note that I only really need ##a(t)/a(t_0)=1/(z+1)##, because ##t_0=t_{rec}## and ##t=t_{emit}##.
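For concreteness, taking ##z = 1100## (presumably the CMB redshift, and the source of the factor ##1100 + 1## that comes up later in this exchange), the relation gives
$$\frac{a(t_{emit})}{a(t_{rec})}=\frac{1}{1+z}\approx\frac{1}{1101}.$$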
So by "normalized" you mean "less than or equal to 1"? Where are you getting that definition from?hedgehug said:The ratio ##a(t)/a_0## is normalized, because ##1/(z+1)\le 1##.
hedgehug said: In my opinion it's practically like normalizing the scale factor itself. This time even literally and explicitly.
I still don't see how this is "normalization". See above.
What is "equal" about two results that differ numerically by a factor ##\left( 1100 + 1 \right)##?hedgehug said:Equal.
hedgehug said: the ratio ##a(t)/a(t_0)\le 1##
This is only true if ##t \le t_0##. What if you're trying to compute something for a time ##t## that's later than now (##t_0##)?
hedgehug said: There has never been and there will never be another time for anyone.
If this were true, no ##t## other than ##t_0## would have any meaning and this whole discussion would be pointless.
PeterDonis said: What is "equal" about two results that differ numerically by a factor ##\left( 1100 + 1 \right)##?
Physical, proper distances are equal after substituting Ly/1101 for NLy.
PeterDonis said: If this were true, no ##t## other than ##t_0## would have any meaning and this whole discussion would be pointless.
I think you just need to have the last word. When I said that there has never been and there never will be another time for anyone, I meant the time when you are alive.
I think you have not thought through your position very carefully.
hedgehug said: When I said that there has never been and there never will be another time for anyone, I meant the time when you are alive.
Yes, but we still expect our physical models to cover other times besides those when we are alive. And that includes times to our future as well as to our past. For example, suppose we wanted to compute the radius of the observable universe five billion years from now, when the Sun is a red giant.
PeterDonis said: This is only true if ##t \le t_0##. What if you're trying to compute something for a time ##t## that's later than now (##t_0##)?
##a(t_{emit})/a(t_{rec})=1/(z+1)\le 1## for ##t_{emit}\le t_0## and ##t_{rec}>t_0## in the expanding universe.
PeterDonis said: So by "normalized" you mean "less than or equal to 1"? Where are you getting that definition from?
Bad habit from programming. Everyone I've worked with and I have been calling a variable divided by its maximum value "normalized".
hedgehug said: ##a(t_{emit})/a(t_{rec})=1/(z+1)\le 1##
So now you're changing the definition of what variable you're using? Or did you really mean ##t_{rec}## before when you wrote ##t_0##?
hedgehug said: I'm changing nothing.
You've been using ##t_0## all through this thread up until post #31, when you used ##t_{rec}##, and that was in response to @Ibix using it, and then again in post #39. That's a change. (And for that matter, you didn't use ##t_{emit}## until post #31 either; your OP in this thread gave an integral that starts at the Big Bang, ##t = 0##, which isn't the "time of emission" of anything.)
hedgehug said: Everyone I've worked with and I have been calling a variable divided by its maximum value "normalized".
Even if I accept this, the variable here is not the scale factor but the ratio of scale factors, and it only has a "maximum value" of ##1## because we put the earlier time in the numerator and because we insist on moving the time in the denominator to be the latest time we're considering. If we flip the ratio around (which makes more sense, since then we're basically just looking at the redshift), it has no maximum value.
PeterDonis said: Even if I accept this, the variable here is not the scale factor but the ratio of scale factors, and it only has a "maximum value" of ##1## because we put the earlier time in the numerator and because we insist on moving the time in the denominator to be the latest time we're considering. If we flip the ratio around (which makes more sense, since then we're basically just looking at the redshift), it has no maximum value.
Correct, for the expanding universe.
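To spell out the flipped ratio referred to in the quote above (using nothing beyond the thread's own relation ##a(t_{emit})/a(t_{rec})=1/(z+1)##):
$$\frac{a(t_{rec})}{a(t_{emit})}=1+z,$$
which has no upper bound in an expanding universe: the earlier the emission time, the larger ##z## becomes.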
hedgehug said: What really asserts the scale factor normalization is the proper distance calculation
But the scale factor does not have to be normalized to compute the proper distance; that's the point. It doesn't matter how the scale factor is defined.
hedgehug said: it cancels out after the integration
That doesn't mean it's normalized; it means it's irrelevant to the proper distance calculation because it cancels out.
PeterDonis said: But the scale factor does not have to be normalized to compute the proper distance; that's the point. It doesn't matter how the scale factor is defined. That doesn't mean it's normalized; it means it's irrelevant to the proper distance calculation because it cancels out.
That's the point. The formula for the proper distance always gives the same result as with the normalized scale factor, because the non-normalized value ##a(t_0)## cancels out.
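As a minimal sketch of the cancellation both sides are describing, assuming the standard proper-distance integral (the limits in the thread's own integral may differ):
$$D(t_0)=a(t_0)\int_{t_{emit}}^{t_0}\frac{c\,dt}{a(t)},$$
which depends only on the ratio ##a(t_0)/a(t)##, so rescaling the scale factor by any constant ##\lambda## (in particular, choosing the convention ##a(t_0)=1##) leaves ##D(t_0)## unchanged.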
hedgehug said: the normalized scale factor
I don't agree with you calling the definition ##a(t_0) = 1## "normalized", which is what you're doing here. It's a convenient convention, that's all. It's not normalized to anything.