Shirish

Suppose we consider a huge, empty region of the universe. An object at one end of this void emits a blue light wave towards an observer at the other end. The space in the void itself is expanding, or "stretching", so the light wave will get "stretched" too. But will the observer notice any difference in wavelength between the stationary-space and expanding-space cases?

I have this scenario in mind: suppose I notice a small rock A suspended in space at rest w.r.t. me, and another rock B also at rest w.r.t. me but far enough from A that the gravitational attraction between them is negligible. I define the distance between A and B as "1 unit". If space isn't expanding, say n crests of the light wave fit between A and B. If space does expand, then even though the light wave gets "stretched", the rocks A and B also move apart, and again n crests of the stretched wave fit between A and B (*assuming uniform expansion of space everywhere*).
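The crest-counting step above can be checked numerically. This is a minimal arithmetic sketch, not a claim about the actual physics: the wavelength, the "1 unit" distance, and the expansion factor `a` are all illustrative values I've chosen, and the only assumption encoded is that uniform expansion multiplies *every* length by the same factor.

```python
# Arithmetic sketch of the crest-counting argument (illustrative values).
# Assumption: uniform expansion multiplies every length -- the A-B
# separation AND the wavelength -- by the same scale factor `a`.

wavelength = 450e-9               # blue light, metres (illustrative)
distance_AB = 1.0e6 * wavelength  # the "1 unit" distance, chosen so crests divide evenly

a = 2.5  # arbitrary expansion factor (made up for the example)

crests_before = distance_AB / wavelength                # n crests, no expansion
crests_after = (a * distance_AB) / (a * wavelength)     # after stretching both lengths

print(crests_before)
print(crests_after)  # the factor `a` cancels, so the count is unchanged
```

The cancellation of `a` is exactly the intuition in the paragraph above: if the "ruler" and the wave stretch together, the crest count between A and B cannot change.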

In summary: initially we have a finer grid of space and a small "ruler" to measure distances, and later we have a stretched grid of space and a "stretched ruler" to measure them. So the notion of distance won't be altered, which means the wavelength of the emitted light will also appear unchanged, right?

I just want to understand the flaw in the above argument and clear up my concepts.