Main Question or Discussion Point
If the universe is expanding, then the brightness of distant stars should gradually decrease. Has anyone observed this phenomenon of fading starlight?
The rate of expansion is given by Hubble's constant, H = 71 km/s/Mpc. Note that at 1 Mpc, after 1 s, a galaxy will have receded an additional 71 km. Since 1 Mpc ≈ 3 x 10^19 km, its distance will have increased by a factor of 1.00000000000000000237, and its brightness will have decreased by a factor of 0.999999999999999995267.
So a galaxy at 1 Mpc recedes at 71 km/s, a galaxy at 2 Mpc at 142 km/s, and so on; the recession velocity becomes very large at greater distances. So shouldn't there be visible fading of a star over time at those greater distances?
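A minimal sketch of this arithmetic, assuming H = 71 km/s/Mpc and the rounded 1 Mpc ≈ 3 x 10^19 km used above; the exact brightness ratio differs from 1 by only ~5 x 10^-18, which is below double precision, so the dimming is computed to first order:

```python
# Sketch of the arithmetic above, assuming H0 = 71 km/s/Mpc and the
# rounded 1 Mpc ~ 3e19 km used in the estimate.

H0 = 71.0               # Hubble constant, km/s per Mpc
KM_PER_MPC = 3.0e19     # rounded megaparsec, in km

def recession_velocity(d_mpc):
    """Hubble's law: v = H0 * d, in km/s."""
    return H0 * d_mpc

def fractional_dimming(d_mpc, seconds):
    """First-order fractional brightness drop after `seconds`.
    Flux ~ 1/d^2, so a small distance increase delta gives
    delta_flux/flux ~ 2*delta/d.  (The exact ratio differs from 1
    by ~1e-18 here, below double precision, hence the linear form.)"""
    delta_km = recession_velocity(d_mpc) * seconds
    return 2.0 * delta_km / (d_mpc * KM_PER_MPC)

for d in (1.0, 2.0, 10.0):
    print(f"{d:4.0f} Mpc: v = {recession_velocity(d):6.0f} km/s")

# At 1 Mpc, after 1 s: dimming ~ 4.73e-18, i.e. a brightness factor
# of 0.999999999999999995..., matching the numbers quoted above.
print(fractional_dimming(1.0, 1.0))
```

Note that the fractional change, delta/d = H * dt, is independent of distance, so the fractional dimming rate is the same whether the galaxy is at 1 Mpc or 100 Mpc; only the recession velocity itself grows with distance.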
If we note down the brightness of the star now and calculate the brightness again after 5 or 10 years, shouldn't we see some difference, considering the very high recession velocities at greater distances?
No, no, and no.

So are our instruments not capable of detecting a decrease in brightness by a factor of about 0.99999999925 after 5 years? (The distance itself grows by a factor of about 1.000000000373 over those 5 years; brightness falls as its inverse square.)
Is it possible to observe a decrease in brightness after 20, 30, or 40 years? If so, has anyone detected it?
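As a rough detectability check, the same first-order estimate can be run over years instead of seconds. This is a sketch under stated assumptions: the ~10 ppm photometric precision used for comparison is assumed (roughly the best achieved by space photometers such as Kepler) and is not a figure from this thread.

```python
# Sketch: fractional dimming of a source at 1 Mpc over years, compared
# with an ASSUMED best-case photometric precision of ~10 ppm (roughly
# Kepler-class; this figure is an assumption, not from the thread).

H0 = 71.0                     # Hubble constant, km/s per Mpc
KM_PER_MPC = 3.0e19           # rounded megaparsec, in km (as above)
SECONDS_PER_YEAR = 3.156e7

def fractional_dimming_after(years, d_mpc=1.0):
    """First-order fractional brightness drop (flux ~ 1/d^2)."""
    delta_km = H0 * d_mpc * years * SECONDS_PER_YEAR
    return 2.0 * delta_km / (d_mpc * KM_PER_MPC)

BEST_PRECISION = 1e-5         # assumed ~10 ppm instrument limit

for years in (5, 10, 20, 40):
    drop = fractional_dimming_after(years)
    verdict = "detectable with" if drop > BEST_PRECISION else "far below"
    print(f"{years:>3} yr: dimming ~ {drop:.2e} ({verdict} "
          f"the assumed {BEST_PRECISION:.0e} precision)")
```

Even after 40 years the fractional change is only ~6 x 10^-9, several orders of magnitude below the assumed precision, which is consistent with the flat "no" above.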
Comment on Bertrand Russell:
Was Bertrand Russell certain of his statement, or doubtful? (Presumably the remark widely attributed to Russell that fools and fanatics are always certain of themselves, while wiser people are full of doubts.) The statement contradicts itself: if he is certain of it, then he is the one described in the first part of the statement; if he is doubtful, then he cannot be correct, even though he is wise.