Radio Communication Between Two Points at Different Gravitational Potentials: True?


Discussion Overview

The discussion revolves around the effects of gravitational time dilation on radio frequency signals transmitted from a source near a black hole to a receiver at a different gravitational potential. Participants explore the implications of this phenomenon on the frequency of the received signal and whether similar effects apply to light signals, considering the complexities of General Relativity.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant describes a scenario where a signal transmitted from a source near a black hole experiences significant time dilation, suggesting that the frequency received would be drastically lower than the transmitted frequency due to gravitational effects.
  • Another participant confirms the idea of gravitational redshift and provides a link to further information, indicating agreement with the initial premise.
  • A different participant questions the terminology used, suggesting that "stretched waves" may lead to confusion and emphasizes the importance of distinguishing between gravitational effects and velocity effects in the context of time dilation.
  • This participant also provides a mathematical expression for the frequency ratio between the received and transmitted signals, highlighting the role of gravitational potential differences.
  • One participant reiterates the original scenario, suggesting a change in terminology from "orbiting" to "hovering" to clarify the conditions under which the time dilation occurs, particularly in relation to stable orbits near black holes.

Areas of Agreement / Disagreement

Participants generally agree on the existence of gravitational time dilation affecting the frequency of signals, but there are nuances in terminology and interpretation of the scenario, indicating that multiple views remain on the specifics of the situation.

Contextual Notes

Participants express uncertainty regarding the implications of gravitational effects on both radio and light signals, and there are unresolved aspects concerning the stability of orbits near black holes and the definitions used in the discussion.

Hongo
A source that is orbiting close to the singularity of a black hole is transmitting a radio frequency signal that lasts 60 seconds and repeats indefinitely. The signal is transmitted using amplitude modulation (AM radio). Let's suppose that each minute passing at the transmitting source's location is equivalent to 100 minutes for the receiver, and that the distance between transmitter and receiver is time invariant. For this situation to be consistent, the electromagnetic wave must be stretched. Therefore, if the original signal was modulated on a 1000 kHz electromagnetic wave, the wave received must be at 10 kHz. That would be an effect similar to the Doppler effect, but with the frequency shift produced by gravity instead of velocity. Is this statement true? Would this also mean that the frequency of a light source will be different depending on the curvature of space-time at the locations of the source and the observer? My field is Chemistry, so I apologize in advance for my lack of expertise in General Relativity.
 
I guess you meant to write "orbiting close to the event horizon of a black hole"?
Hongo said:
For this situation to be consistent, it must mean that the electromagnetic wave is stretched. Therefore, if the original signal was modulated on a 1000 kHz electromagnetic wave, the wave received must be of 10 kHz.
Yes, but it is not good to view it as "stretched waves", because it tends to conjure up images of 'expanding space' between the source and receiver, which can be very confusing. It is simply gravitational (and velocity) time dilation due to speed and gravitational potential differences between the source and receiver.
If you ignore any relative orbital speeds between source and receiver, the frequency ratio is
$$\frac{f_r}{f_s}= \frac{\sqrt{1-2GM/(r_s c^2)}}{\sqrt{1-2GM/(r_r c^2)}}$$
where the subscripts ##s## and ##r## denote source and receiver, and ##r_s##, ##r_r## are their radial coordinates.
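As a quick numeric sketch of that ratio (not from the thread; the constants, the 10-solar-mass example, and the radii chosen are my own illustrative assumptions):

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8    # speed of light, m/s

def frequency_ratio(M, r_source, r_receiver):
    """Gravitational frequency ratio f_r / f_s for static source and
    receiver at radial coordinates r_source, r_receiver around mass M
    (Schwarzschild geometry, no orbital motion)."""
    rs = 2 * G * M / c**2  # Schwarzschild radius
    return math.sqrt(1 - rs / r_source) / math.sqrt(1 - rs / r_receiver)

# Illustrative example: 10 solar masses, source hovering just outside
# the horizon, receiver far away (its potential term is negligible).
M = 10 * 1.989e30
rs = 2 * G * M / c**2
ratio = frequency_ratio(M, 1.0001 * rs, 1e12)
print(ratio)  # ~0.01, i.e. a 1000 kHz signal arrives near 10 kHz
```

With the source at 1.0001 Schwarzschild radii the ratio comes out to roughly 1/100, matching the factor-100 dilation in the original post's scenario.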
 
Hongo said:
A source that is orbiting close to a singularity of a black hole is transmitting a radio frequency signal that lasts 60 seconds and is repeated infinitely. [...] Is this statement true?

Essentially, yes. Though I'd change "orbiting" to "hovering" - unless you have a rapidly spinning black hole like the one Kip Thorne imagined in "Interstellar", you won't get a stable orbit with that sort of time dilation, and keeping the distance / propagation delay to the receiver time invariant is easier, too.
 
