So, would a radio wave travel the distance from point A to point B in a constant time T, no matter what stands between these two points (buildings, trees, ...)?
Is there an estimate of how much, on average, the velocity is reduced when using radio waves in cities or populated regions (trees, hills, houses, etc.)?
Is this frequency-dependent?
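To make the numbers concrete, here is a rough sketch of what I'm asking; the 10 km path length and the effective refractive index values are just assumptions for illustration, not measured figures:

```python
# Rough sketch of the propagation-delay question.
# All path lengths and refractive indices below are assumed for illustration.

C = 299_792_458.0  # speed of light in vacuum, m/s

def travel_time(distance_m: float, refractive_index: float = 1.0) -> float:
    """Time for a radio wave to cover distance_m through a medium with an
    (assumed) effective refractive index n: t = d * n / c."""
    return distance_m * refractive_index / C

d = 10_000.0  # assumed 10 km path

# Free space vs. air: air's refractive index is only about 1.0003
print(f"free space     : {travel_time(d) * 1e6:.4f} us")
print(f"air (n=1.0003) : {travel_time(d, 1.0003) * 1e6:.4f} us")

# Even with a hypothetical n = 2 over the whole path (roughly glass-like),
# the delay stays in the tens-of-microseconds range for 10 km.
print(f"hypothetical n=2: {travel_time(d, 2.0) * 1e6:.4f} us")
```

So what I want to know is whether the real-world effect of buildings and trees on the travel time is anywhere near as large as these toy numbers, and whether it changes with frequency.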
Thanks,
Miha
First, hi to everyone!
I have a question:
Is there a (simple) way to reduce the speed of a radio wave at its source, the transmitter (e.g. to 100x or 1000x less than the speed of light)?
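For scale, here is a back-of-envelope sketch of what a 100x or 1000x slowdown would require in a simple idealized (lossless, non-dispersive) medium, using v = c / sqrt(eps_r * mu_r); the material values it prints are my own assumption for illustration, not something I have:

```python
# Back-of-envelope sketch: how much relative permittivity/permeability a
# simple idealized medium would need to slow a wave by a factor k.
# v = c / sqrt(eps_r * mu_r), so a factor-k slowdown needs eps_r * mu_r = k**2.

import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def phase_velocity(eps_r: float, mu_r: float = 1.0) -> float:
    """Phase velocity in an idealized medium (no loss, no dispersion)."""
    return C / math.sqrt(eps_r * mu_r)

for slowdown in (100, 1000):
    needed = slowdown ** 2  # required eps_r * mu_r product
    print(f"{slowdown}x slower -> eps_r * mu_r ~ {needed:,} "
          f"(v = {phase_velocity(needed) / 1000:.0f} km/s)")
```

I realize this only describes the wave inside some medium, not the transmitter itself, which is exactly why I'm asking whether anything can be done at the source.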
Thanks in advance.