# Data on time for fading of galaxy?

In summary, galaxies that are further than c/H metres from us have recessional velocities exceeding c, and thus they begin to fade away.

#### Einstein's Cat

Galaxies that are further than c/H metres from us have recessional velocities exceeding c, and thus they begin to fade away.

Is there any data on how long it takes a galaxy to fade away, from the perspective of an observer on Earth, where t = 0 is when the recessional velocity of the galaxy is c?

I am unable to locate any such data online. Also, if no such data exists, does other data similar to what I'm asking for exist?


Given that the scale will be asymptotic and "fading" is very ill-defined (subjective), I doubt that there IS anything like a solid answer to your question. Also, you have not even defined "observer". Human eyes? With binoculars? With a telescope? A space-based telescope? What?

Galaxies that are further than c/H metres from us have recessional velocities exceeding c, and thus they begin to fade away.
Their fading has nothing to do with their recession velocity being greater than c. It's primarily a matter of two features:
1. Objects that are further away appear dimmer.
2. As the universe expands, redshift increases, which both dims and reddens the light coming from far-away galaxies.

In the far future, the scale factor will change as:

$$a(t) = a(t=0) e^{H_\Lambda t}$$

...where $H_\Lambda$ is the expansion rate from the cosmological constant alone, or $H_\Lambda = c \sqrt{\Lambda /3}$.

What this means is that roughly every 13 billion years, the scale factor will double, and so will the redshift factor $1+z$, which cuts the energy of the incoming radiation from the object by a factor of $2^4 = 16$.
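As a rough numerical check of these figures (a minimal sketch; the value of $\Lambda$ below is an assumed round number close to the measured one, not something taken from this thread):

```python
import math

# Assumed round values (not from this thread).
LAMBDA = 1.1e-52          # cosmological constant, m^-2
C = 2.998e8               # speed of light, m/s
SECONDS_PER_GYR = 3.156e16

# Expansion rate from the cosmological constant alone: H_Lambda = c * sqrt(Lambda / 3).
H_lambda = C * math.sqrt(LAMBDA / 3.0)          # s^-1

# With a(t) ~ exp(H_Lambda * t), the scale factor doubles every ln(2) / H_Lambda.
doubling_time_gyr = math.log(2.0) / H_lambda / SECONDS_PER_GYR

# Per doubling of the redshift factor, the stated dimming of the incoming radiation.
flux_factor_per_doubling = 2 ** 4

print(f"H_Lambda ~ {H_lambda:.2e} per second")
print(f"scale-factor doubling time ~ {doubling_time_gyr:.0f} Gyr")
```

With these inputs the doubling time comes out near 12 Gyr, in the same ballpark as the "roughly 13 billion years" quoted above.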

Given that the scale will be asymptotic and "fading" is very ill-defined (subjective), I doubt that there IS anything like a solid answer to your question. Also, you have not even defined "observer". Human eyes? With binoculars? With a telescope? A space-based telescope? What?
And also the apparatus used by the observer is irrelevant, because the time at which the galaxy is no longer visible minus the time at which its recessional velocity is c is the same regardless of the apparatus used.
And what I mean by fading is not just the fading of the galaxy from reduced light intensity or redshifting (despite those effects, the galaxy is still visible while its recessional velocity is less than c). The fading I'm referring to is also due to the reduced number of photons reaching an observer, which becomes zero when the recessional velocity is around c (though always above c); without such an effect the galaxy would never fade away entirely. More specifically, I mean the fading when and after its recessional velocity is c.

Hopefully, this makes my question more valid!

Seriously? You think that your eyes can see the same thing as the Hubble telescope? If by "visible" you mean "visible by any means" then yes, but you didn't say that, as I pointed out.

Apologies, but yes, "visible by any means" is what I meant.

Good. Sorry to seem like a dick about it, but I find that poorly worded questions can evolve from poorly thought through questions so I always encourage people to be very specific with questions so as to get a useful answer.

No worries at all! In fact it helped; my initial threads are always vague.

Wouldn't that 'fading' take 'forever'? Once an object recesses faster than 'c' wouldn't the effect be the same as if it fell into a black hole? So the imprint of the galaxy would be visible for a long time, wouldn't it? Questions, questions...

And also the apparatus used by the observer is irrelevant, because the time at which the galaxy is no longer visible minus the time at which its recessional velocity is c is the same regardless of the apparatus used.
This isn't true. The time at which a galaxy is no longer visible has nothing to do with the recession velocity passing the speed of light. In fact, most of the galaxies in the observable universe are now and always have been receding at faster than the speed of light.

And what I mean by fading is not just the fading of the galaxy from reduction of light intensity or redshifting
That's the only fading that there is. Well, that and intervening matter blocking the light.

mfb
Wouldn't that 'fading' take 'forever'? Once an object recesses faster than 'c' wouldn't the effect be the same as if it fell into a black hole? So the imprint of the galaxy would be visible for a long time, wouldn't it? Questions, questions...
Eventually the wavelengths of the photons become longer than the Hubble distance, which makes them effectively impossible to detect.
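A back-of-envelope sketch of that timescale (all parameter values here are assumed round numbers, not figures from this thread): the photon's wavelength grows in proportion to the scale factor, while the Hubble distance levels off at $c/H_\Lambda$, so one can estimate when an optical photon emitted today overtakes it.

```python
import math

# Assumed round values (not from this thread).
H_LAMBDA = 1.8e-18        # late-time expansion rate from the cosmological constant, s^-1
C = 2.998e8               # speed of light, m/s
SECONDS_PER_GYR = 3.156e16
WAVELENGTH_EMIT = 5.0e-7  # m, a green optical photon emitted today

# The late-time Hubble distance is effectively constant: c / H_Lambda.
hubble_distance = C / H_LAMBDA     # metres

# Wavelength grows like a(t) = exp(H_Lambda * t); solve wavelength * a(t) = hubble_distance.
stretch_needed = hubble_distance / WAVELENGTH_EMIT
t_gyr = math.log(stretch_needed) / H_LAMBDA / SECONDS_PER_GYR

print(f"wavelength overtakes the Hubble distance after ~ {t_gyr:.0f} Gyr")
```

With these inputs the crossover lands on the order of a trillion years: a very long time, but not forever.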

This isn't true. The time at which a galaxy is no longer visible has nothing to do with the recession velocity passing the speed of light. In fact, most of the galaxies in the observable universe are now and always have been receding at faster than the speed of light.

That's the only fading that there is. Well, that and intervening matter blocking the light.
Once again may I refer to this thread
https://www.physicsforums.com/threads/reason-for-galaxy-fading.869954/. There you'll see that your second point is not correct: when a galaxy has a recessional velocity greater than c, fewer and fewer photons reach us until no photons reach us and the galaxy is no longer visible. Thus this factor also affects the fading. Please correct me, and add further justification to your first point, for no galaxies become invisible while their recessional velocities are smaller than c.

Chalnoth is right. Objects inside the event horizon remain observable forever, at least in principle. The frequency and intensity of their light go down, but they do so asymptotically, approaching 0 at infinite time in the future (which nets wavelengths longer than the observable universe much sooner than that). This becomes (hopefully) obvious if you look at a light cone graph of the evolution of the universe, such as those below:

Tracing light emitted from any galaxy located within the event horizon it can be seen that it'll necessarily reach the observer some time in the future - location within or without the Hubble sphere has little bearing on it.
(graphs taken from: http://arxiv.org/abs/astro-ph/0310808)

Please let us know if light cone graphs are not easily understandable. Also, if you could indicate which part of the thread you're referring to you understand as suggesting a different conclusion.

Apologies if I'm just being ignorant, but I am under the impression that the increasing stretching of the wavelength of light is irrelevant, because the reason for the fading of the galaxy until it is no longer visible is not this effect (if it was, it would always be visible). Instead, it's due to the fact that for galaxies with recessional velocities exceeding c, emitted photons will only reach us if they pass through a region of the universe that is receding from us slower than c; but as the space between us and the galaxy expands faster than light, any emitted photon will never reach us and the galaxy will never be visible.

Also I'm referring to the 1st page of the thread and thank you for your insights!

if it was it would always be visible
And they will be*, albeit in an increasingly dimmer, more redshifted, and more outdated fashion.

Keep in mind that the radius of the Hubble sphere depends on the value of the Hubble parameter, which has been and always will be falling, asymptotically approaching a constant value in the future as the density of matter in the universe vanishes. This translates to an increasing Hubble radius. This means that photons that today find themselves outside the Hubble sphere will later cross into it (or rather, the sphere will end up encompassing these photons). That's why you can see today stuff that never was inside the Hubble sphere. (Trace the line labelled 'lightcone' on the third graph, which shows signals reaching us today; every point along this line is an object we can now observe.)

*with the caveat mentioned earlier, that you need increasingly larger instruments to detect longer-wavelength signals, at some point exceeding the size of the universe.
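The growth of the Hubble radius described above can be made concrete with a minimal sketch, assuming flat $\Lambda$CDM with round values $\Omega_m = 0.3$, $\Omega_\Lambda = 0.7$ and $H_0 = 70$ km/s/Mpc (illustrative assumptions, not numbers from this thread):

```python
import math

# Assumed flat-LambdaCDM parameters (illustrative round numbers, not from this thread).
H0 = 70.0 / 977.8        # Hubble constant converted from km/s/Mpc to 1/Gyr
OMEGA_M, OMEGA_L = 0.3, 0.7
C = 1.0                  # speed of light in Gly/Gyr

def hubble_rate(a):
    """Expansion rate H(a) for flat matter + Lambda (Friedmann equation)."""
    return H0 * math.sqrt(OMEGA_M / a**3 + OMEGA_L)

def hubble_radius(a):
    """Proper Hubble radius c/H(a), in Gly."""
    return C / hubble_rate(a)

# As matter dilutes, H falls and the Hubble radius grows toward c/H_Lambda.
for a in (1, 2, 4, 8, 1000):
    print(f"a = {a:5g}: Hubble radius ~ {hubble_radius(a):.2f} Gly")
print(f"asymptotic limit c/H_Lambda ~ {C / (H0 * math.sqrt(OMEGA_L)):.2f} Gly")
```

With these inputs the radius climbs from about 14 Gly today toward a ceiling of roughly 16.7 Gly, which is the "sphere ends up encompassing these photons" behaviour described above.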

Also I'm referring to the 1st page of the thread

"farther away a galaxy the harder it is for it's light to reach us... at some point it won't be able to"

Also:
"If recession velocity at the location of a traveling photon were greater than the speed of light the entire time the photon from a distance galaxy were traveling, we would never observe the photon. A photon emitted from a galaxy moving away from us faster than light, initially is also receding from us. However, the photon may eventually get to a region of spacetime where recession from us is <c. In this case, the photon can reach us."

...as the space between us and the galaxy expands faster than light, any emitted photon will never reach us and the galaxy will never be visible.
This impression is incorrect.

The issue is that in General Relativity, there is no unique definition for the velocities of far-away objects. You can compare velocities at a single point, but once the objects are separated, velocity becomes ambiguous due to the curvature of space-time.

It's analogous to the following situation. Imagine two cars each heading north from the equator, but on opposite sides of the Earth. If you estimate their difference in velocity by measuring how much distance they have to close before they meet at the North Pole, then you could say that the two cars are moving towards one another. If instead you measure distance by how far each is from the South Pole, then you could say that they're moving away from one another. If you measure their distance across a line of constant latitude, then while they are at the equator they are neither moving closer nor further away.

There's fundamentally no way to say which of these definitions of velocity is right, and it's because we're measuring velocity across the curved surface of the Earth.

The velocities of far-away galaxies are similar, and this has an interesting consequence: there is no speed-of-light limit for far-away objects. An object can never outrun a light ray, but when you compare the speeds of two far-away objects there just isn't any way to say whether or not they're moving faster than light relative to one another, so the limit is meaningless.

"farther away a galaxy the harder it is for it's light to reach us... at some point it won't be able to"
This refers to the cosmological event horizon - it is indicated on the graphs. As you can see, it is a wholly different beast from the Hubble radius. Galaxies leaving the event horizon will never be able to send any more (i.e., 'new') signals to the observer. But the signals they sent before crossing the horizon will reach the observer - increasingly stretched the closer to the horizon the galaxy was when emitting, so that the last signal before crossing will only reach the observer at infinity. As you can see from the graphs, we will always be able to observe objects currently within ~63 Gly.
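That ~63 Gly figure can be sanity-checked by integrating the total comoving distance light covers from the Big Bang to the infinite future (a rough sketch assuming flat $\Lambda$CDM with $\Omega_m = 0.3$, $\Omega_\Lambda = 0.7$, $H_0 = 70$ km/s/Mpc and radiation neglected; these parameter choices are assumptions, not values from the thread):

```python
import math

# Assumed flat-LambdaCDM parameters (round illustrative values, not from this thread).
H0 = 70.0 / 977.8        # km/s/Mpc converted to 1/Gyr
OMEGA_M, OMEGA_L = 0.3, 0.7
C = 1.0                  # speed of light in Gly/Gyr

def simpson01(f, n=1000):
    """Composite Simpson's rule for the integral of f over [0, 1] (n even)."""
    h = 1.0 / n
    total = f(0.0) + f(1.0)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(i * h)
    return total * h / 3.0

# Past part: integral_0^1 da / (a^2 H(a)).  Substituting a = u^2 removes the
# integrable 1/sqrt(a) singularity at a = 0.
past = simpson01(lambda u: 2.0 / math.sqrt(OMEGA_M + OMEGA_L * u**6))

# Future part: integral_1^inf da / (a^2 H(a)).  Substituting a = 1/x maps it to [0, 1].
future = simpson01(lambda x: 1.0 / math.sqrt(OMEGA_M * x**3 + OMEGA_L))

# Total comoving distance (in Gly) light can ever cover: everything now within
# this radius will, in principle, eventually be observed.
chi_total = C * (past + future) / H0
print(f"comoving reach as t -> infinity ~ {chi_total:.0f} Gly")
```

With these round inputs the integral lands near 62 Gly, consistent with the ~63 Gly quoted above given the rounded parameters.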

"If recession velocity at the location of a traveling photon were greater than the speed of light the entire time the photon from a distance galaxy were traveling, we would never observe the photon. A photon emitted from a galaxy moving away from us faster than light, initially is also receding from us. However, the photon may eventually get to a region of spacetime where recession from us is <c. In this case, the photon can reach us."
The point is, the recession velocity at a given distance is not constant - e.g. at what is currently the Hubble sphere it goes down (and always will), which is equivalent to an increasing radius of the Hubble sphere, allowing light from farther away to eventually reach us.

The point is, the recession velocity is not constant.

I was under the impression that when the galaxy is at a distance d from us at t = 0, then for t > 0, d would have increased as the galaxy recedes farther from us, and thus its recessional velocity would have increased. Therefore, when t is zero and d is c/H, as t increases so will d, and thus so will the recessional velocity, and therefore the emitted photons will never reach us as the recessional velocity keeps increasing. Please correct me, however.

The issue is that in General Relativity, there is no unique definition for the velocities of far-away objects. ...
However, as the distance traveled by the galaxy as it recedes is equal to its displacement, the galaxy recedes from us along a one-dimensional axis, meaning it wouldn't be subject to any spacetime curvature and its velocity would therefore be definable.

I was under the impression that when the galaxy is at a distance d from us at t = 0, then for t > 0, d would have increased as the galaxy recedes farther from us, and thus its recessional velocity would have increased. Therefore, when t is zero and d is c/H, as t increases so will d, and thus so will the recessional velocity, and therefore the emitted photons will never reach us as the recessional velocity keeps increasing. Please correct me, however.
Photons have what bapowell described in his Insight article as peculiar velocity - i.e., they are moving towards us at c through their local space. So while a galaxy might be leaving the region we define as the Hubble sphere, the light it emits can still reach us.

I believe that is why galaxies don't fade away into invisibility at c but beyond it; any photon emitted after that point will never reach us, as the space between us and the galaxy expands faster than c.

You're correct that photons emitted at the Hubble sphere will reach us.

Not if the growth of the Hubble radius is fast enough. Consider a photon exactly at the Hubble radius: it will have zero recessional velocity. A photon slightly farther out (e.g. 1 m) will have a very small recession velocity - all you need is for the decrease of the Hubble parameter (i.e., the growth of the Hubble radius) to eventually overtake this recession velocity.

Of course! Do you know if this can or does happen?

@Einstein's Cat, one of the things you need to keep in mind that I think perhaps hasn't quite hit you fully is that the point is NOT at all how fast something is receding, but how far emitted light has to travel to get to us (including the distance added by expansion). That is, a point that is a measly 1 light year from us could be receding at MANY multiples of c but the light emitted at that point would still reach us even though the space between us and the start of the wave would be expanding really really quickly.

Einstein's Cat
I was under the impression that when the galaxy is at a distance d from us at t = 0, then for t > 0, d would have increased as the galaxy recedes farther from us, and thus its recessional velocity would have increased. Therefore, when t is zero and d is c/H, as t increases so will d, and thus so will the recessional velocity, and therefore the emitted photons will never reach us as the recessional velocity keeps increasing. Please correct me, however.
The problem with this is that H is not a constant. H decreases over time as the universe gets less dense.

Yes, if a galaxy is far enough away such that $Hd > c$ at the time the photon was emitted, then that photon, even though it is traveling in our direction, will get further away due to the expansion. But as H decreases, if the galaxy was close enough, eventually the photon starts to gain ground.
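This turnaround can be seen in a toy numerical integration (a sketch under assumed flat $\Lambda$CDM parameters, $\Omega_m = 0.3$, $\Omega_\Lambda = 0.7$, $H_0 = 70$ km/s/Mpc; all numbers here are illustrative assumptions): a photon aimed at us from just outside today's Hubble radius is initially dragged away, then gains ground as H falls, and finally arrives.

```python
import math

# Assumed flat-LambdaCDM parameters (illustrative, not from this thread).
H0 = 70.0 / 977.8      # 1/Gyr
OMEGA_M, OMEGA_L = 0.3, 0.7
C = 1.0                # speed of light in Gly/Gyr

def hubble(a):
    """Expansion rate H(a) for flat matter + Lambda."""
    return H0 * math.sqrt(OMEGA_M / a**3 + OMEGA_L)

# Photon emitted toward us today (a = 1) from 5% beyond the Hubble radius.
a = 1.0
d = 1.05 * C / hubble(a)     # proper distance to the photon, Gly
dt = 0.001                   # time step, Gyr
t, d_max, arrival = 0.0, d, None

while t < 80.0:
    h = hubble(a)
    d += (h * d - C) * dt    # recession of the photon's location minus its motion at c
    a += a * h * dt          # expansion of the universe
    t += dt
    d_max = max(d_max, d)
    if d <= 0.0:             # photon has reached us
        arrival = t
        break

print(f"farthest proper distance reached ~ {d_max:.1f} Gly")
if arrival is not None:
    print(f"photon arrives after ~ {arrival:.0f} Gyr")
```

Despite $Hd > c$ at emission, the photon turns around once the falling H pushes the Hubble radius out past it, and it reaches us after a few tens of Gyr; starting it far enough out (beyond the event horizon) would leave `arrival` as None.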

Thank you very much for all your replies. This concept has become a lot clearer to me.

The problem with this is that H is not a constant. H decreases over time as the universe gets less dense. ...
Is there an equation that describes how H changes with time?