# Star Light Fading - Question

## Main Question or Discussion Point

If the universe is expanding, then the brightness of distant stars should gradually decrease. Has anyone observed this phenomenon of fading starlight?

This wouldn't affect any of the stars we can see with the naked eye, since the expansion does not cause individual galaxies to expand. As for more distant objects, you're right in principle, but the effect accumulates so slowly that we wouldn't be able to detect it.

The rate of expansion is given by Hubble's constant, H = 71 km/s/Mpc.

So a galaxy at 1 Mpc will be receding at 71 km/s, a galaxy at 2 Mpc at 142 km/s, and so on. The recession velocity becomes very large at greater distances. So should there be visible fading of a star over time at many Mpc?

For example, if we record the brightness of a star now and measure it again after 5 or 10 years, shouldn't we see some difference, given the very high recession velocities at large distances?
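The recession velocities quoted here follow from Hubble's law, v = H · d. A minimal sketch in Python, assuming the value H = 71 km/s/Mpc used in this thread:

```python
# Hubble's law: recession velocity v = H * d,
# using the value quoted in this thread, H = 71 km/s/Mpc.
H = 71.0  # km/s per Mpc

def recession_velocity(distance_mpc):
    """Recession velocity (km/s) of a galaxy at distance_mpc megaparsecs."""
    return H * distance_mpc

print(recession_velocity(1))  # 71.0 km/s
print(recession_velocity(2))  # 142.0 km/s
```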

russ_watters
Mentor
No, because any fading would come from the ratio of the recession velocity to the distance, and that ratio is the same for every galaxy. More distant galaxies are moving away faster, but they are also farther away, so percentage-wise they still aren't opening up the distance at a rate we could detect like that.
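That distance-independence can be checked numerically. A rough sketch, assuming a simple Hubble law with H = 71 km/s/Mpc and the rounded conversion 1 Mpc ≈ 3 × 10^19 km used later in this thread:

```python
# The fractional growth of distance per second is v/d = H, the same at every distance.
H = 71.0         # km/s per Mpc
MPC_KM = 3.0e19  # km per Mpc (rounded, as used in this thread)

for d_mpc in (1, 2, 100):
    v = H * d_mpc                      # recession velocity, km/s
    frac_per_s = v / (d_mpc * MPC_KM)  # fractional distance increase per second
    print(d_mpc, frac_per_s)           # ~2.37e-18 at every distance
```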

Janus
Staff Emeritus
Gold Member
> The rate of expansion is given by Hubble's constant, H = 71 km/s/Mpc.
>
> So a galaxy at 1 Mpc will be receding at 71 km/s, a galaxy at 2 Mpc at 142 km/s, and so on. The recession velocity becomes very large at greater distances. So should there be visible fading of a star over time at many Mpc?
>
> For example, if we record the brightness of a star now and measure it again after 5 or 10 years, shouldn't we see some difference, given the very high recession velocities at large distances?

Note that at 1 Mpc, after 1 s, the galaxy will have receded an additional 71 km. Since 1 Mpc ≈ 3 × 10^19 km, it will have increased its distance by a factor of 1.00000000000000000237 and, since brightness falls off as the inverse square of distance, decreased in brightness by a factor of 0.999999999999999995267.

At 2 Mpc, after 1 s, the galaxy will have receded an additional 142 km. Thus it will have increased its distance by the same factor, 1.00000000000000000237, and decreased in brightness by the same factor, 0.999999999999999995267, as the galaxy at 1 Mpc.

No matter how far away the galaxy is, the relative decrease in brightness over a given interval is the same. Over, say, 5 years, this works out to a factor of about 0.99999999925.
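The arithmetic above can be reproduced in a few lines of Python. This is only a sketch under the same simplifications used in this thread (inverse-square brightness, the rounded value 1 Mpc ≈ 3 × 10^19 km, and no cosmological redshift effects). Note that the one-second brightness factor is too close to 1 to represent in double precision, so it is tracked as a decrease rather than a factor:

```python
# Reproducing the brightness arithmetic above (inverse-square law).
H = 71.0                 # km/s per Mpc
MPC_KM = 3.0e19          # km per Mpc (rounded, as above)
YEAR_S = 365.25 * 86400  # seconds per (Julian) year

# Fractional distance increase in 1 s; since v/d = H, this is distance-independent.
frac_d_1s = H / MPC_KM              # ~2.37e-18
# Brightness scales as 1/d^2, and (1+x)**-2 ~ 1 - 2x for tiny x:
brightness_drop_1s = 2 * frac_d_1s  # ~4.73e-18, i.e. a factor ~0.999999999999999995
# (computing (1 + frac_d_1s)**-2 directly would round to exactly 1.0 in doubles)

frac_d_5yr = frac_d_1s * 5 * YEAR_S      # ~3.73e-10
brightness_5yr = (1 + frac_d_5yr) ** -2  # ~0.99999999925
print(brightness_drop_1s, brightness_5yr)
```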

Thank you for explaining.

So are our instruments not capable of detecting a decrease in brightness by a factor of about 0.99999999925 over 5 years?

Is it possible to observe the decrease in brightness after 20, 30, or 40 years?
If so, has anyone detected it?

Comment on Bertrand Russell:
Was Bertrand Russell certain of his statement, or doubtful?
His statement contradicts itself: if he is certain, then he is the one described in the first part of the statement; if he is doubtful, then he cannot be correct, though he is wise.

russ_watters
Mentor
> So are our instruments not capable of detecting a decrease in brightness by a factor of about 0.99999999925 over 5 years?
>
> Is it possible to observe the decrease in brightness after 20, 30, or 40 years? If so, has anyone detected it?
No, no, and no.

Such a measurement isn't really feasible, as you'd need to be using the same detector 40 years after the first measurement. I doubt our technology today could measure a millionth of a percent change in brightness and we certainly couldn't 40 years ago.
> Comment on Bertrand Russell: was Bertrand Russell certain of his statement, or doubtful? His statement contradicts itself: if he is certain, then he is the one described in the first part of the statement; if he is doubtful, then he cannot be correct, though he is wise.
The enemy of my enemy is my friend.