
Wavelength & Distance Relation in MICROWAVE

  1. Jul 4, 2005 #1
    Hey all!

    I am basically a computer networks guy exploring telecom internals, so if I sound like a newbie, please forgive me! My question is this: is there any relation between the wavelength and the distance covered by a microwave when it is radiated through an antenna?

    Basically I am confused about what difference the wavelength makes in the transmission of a wave. For example, do longer-wavelength waves need less power to radiate, and vice versa?

    Also, what is the maximum range a microwave can travel? Or can it travel an unlimited distance (with increased radiation in the area as a drawback)?

    I would be grateful if someone could clear this up.

    Shakeel Ahmad
  3. Jul 4, 2005 #2



    Higher frequencies can make signals easier to transmit. A good (if imperfect) example: in the latter part of the 19th century, two companies were fighting to supply electricity, one with AC and the other with DC. The AC company eventually won, although strictly speaking the advantage was not the frequency itself (AC is 60 Hz in the US, DC is 0 Hz) but the fact that AC voltage can be stepped up with transformers, which cuts resistive losses over long transmission distances.

    Microwaves can travel unlimited distances just like light. They are both electromagnetic waves.

    The shorter the wavelength, the higher the frequency. Antennas are built taking this into consideration.
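The relation mentioned above is just wavelength = c / f. A quick illustration (the quarter-wave element size is a common antenna rule of thumb, not something stated in this thread):

```python
# Relation between frequency and wavelength: lambda = c / f
C = 299_792_458.0  # speed of light in m/s

def wavelength_m(freq_hz: float) -> float:
    """Free-space wavelength in metres for a given frequency in Hz."""
    return C / freq_hz

# 2.4 GHz Wi-Fi: the wavelength is about 12.5 cm, so a quarter-wave
# element (a common antenna size) is roughly 3 cm long.
wifi = wavelength_m(2.4e9)
print(f"2.4 GHz wavelength: {wifi * 100:.1f} cm")
print(f"Quarter-wave element: {wifi / 4 * 100:.1f} cm")
```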
  4. Jul 4, 2005 #3
    Thanks AMT!

    - You said microwaves can travel an unlimited distance. Practically speaking, to transmit a microwave signal over a much broader distance, we need to radiate it at high power, and that creates more heat and radiation exposure (SAR?), which can be harmful. Am I right?

    - Also, talking about 2.4 GHz wireless networks: how can a small wireless card transmit a signal powerful enough to send data back to a hotspot 1 km away? Or how can a small GSM handset send data to a base station 20 km away? Do they have enough power to transmit a signal over that distance?

    - Is there a formula with which we can find out that, if we apply so much current (power) to the microwave transmitter, it will cover so much distance?

    - Talking about microwave ovens: they work at about 2.45 GHz, so will a 2.4 GHz radio link transmitter have the same qualities, like giving heat burns to a human if touched while transmitting?

    From the above questions I know you can judge how confused I am. Please clear this up if you can.

    Shakeel Ahmad
    Lahore, Pakistan.
  5. Jul 4, 2005 #4

    Claude Bile

    Science Advisor

    I will (try to) answer your questions point by point.

    1. Essentially you are correct: to be able to detect a signal further away, we need to increase the radiated power. Also note that the information that can be sent over a channel is fundamentally limited by the signal-to-noise ratio, so increasing the power of our antenna can also (potentially) increase the rate at which information can be sent.

    Currently, in Australia, the radiation limit from microwave towers is something like 1 mW/cm^2. Most countries have such limits because the long-term effects of long-wavelength electromagnetic radiation are not well understood.

    2. Not my field of expertise, sorry!

    3. Radiation patterns vary greatly from antenna to antenna. I don't think there is a general relationship that depends on current only. One would also have to take into account the geometry of the antenna.

    4. Whether you get burnt or not depends on the power of the antenna. If the power is sufficiently high (and your hand is sufficiently close), then yes, you will get burnt.
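The signal-to-noise point in answer 1 is the Shannon-Hartley theorem, C = B log2(1 + S/N). A minimal sketch; the bandwidth and SNR numbers below are purely illustrative assumptions, not values from this thread:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative assumption: a 20 MHz channel at 20 dB SNR (linear ratio 100).
cap = shannon_capacity_bps(20e6, 100)
print(f"Capacity limit: {cap / 1e6:.0f} Mbit/s")
```

Raising transmit power raises S/N, which raises the achievable rate, but only logarithmically; doubling the power does not double the capacity.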

  6. Jul 4, 2005 #5



    1 - Yes, there are SAR limits. I think the limit is something like 1.6 W/kg in the USA. Getting too close to a high-wattage transmitter can be very dangerous. I have heard of accidents in which people were seriously hurt.

    2 - Cellphones transmit at about 1 W of power, which is enough to reach a base station roughly 10-20 km away, but that is all they will cover. This is the basis for 'cell phone' technology: each cell (an area in a town or city) is about 10 km in radius.

    3 - Yes, there are formulas to calculate the power needed to transmit over a certain distance. In free space the power of the signal drops to one quarter each time the distance doubles; this is the inverse-square law. I don't remember much more right now and will have to open my textbooks to review. Here is a link with the formulas: https://ewhdbks.mugu.navy.mil/one-way.htm [Broken]

    4- Already answered in 1.
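The inverse-square law in point 3 can be made concrete with the Friis free-space transmission equation, Pr = Pt * Gt * Gr * (lambda / (4 pi d))^2. A sketch with illustrative numbers; the 1 W handset, unity-gain antennas, and 900 MHz frequency are assumptions for the example:

```python
import math

def friis_received_power_w(p_tx_w: float, g_tx: float, g_rx: float,
                           freq_hz: float, distance_m: float) -> float:
    """Friis free-space equation: Pr = Pt*Gt*Gr*(lambda/(4*pi*d))**2."""
    lam = 299_792_458.0 / freq_hz
    return p_tx_w * g_tx * g_rx * (lam / (4 * math.pi * distance_m)) ** 2

# Illustrative: a 1 W handset, unity-gain antennas, 900 MHz, 10 km away.
pr = friis_received_power_w(1.0, 1.0, 1.0, 900e6, 10_000)
# Doubling the distance quarters the received power (inverse-square law).
pr2 = friis_received_power_w(1.0, 1.0, 1.0, 900e6, 20_000)
print(f"Pr at 10 km: {pr:.3e} W")
print(f"Ratio at 2x distance: {pr / pr2:.1f}")  # -> 4.0
```

Received powers this small (picowatts) are still usable because base-station receivers are extremely sensitive, which is part of the answer to question 2 above.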
  7. Jul 4, 2005 #6


    Gold Member

    Yes, it will be harmful if you absorb too much power. One of my professors used to work as a navy technician. He told us one day about how someone was fixing a radar, stepped in front of it, and someone else accidentally turned the radar on... ouch.

    A wireless card would not be able to do that unless the hotspot were incredibly sensitive (not a normal commercial product). You could put more power into the wireless card, but it probably wouldn't work and some components might fry inside. A card could of course be built to do the job, but it would probably cost a lot more, and thus would not be commercially viable for ordinary users. It's all possible; the reason we don't see such cards is most likely the cost (they may exist, but from specialist companies at a high price).

    As someone said, I believe, microwaves will propagate forever. The problem is that they diminish in strength with distance. On top of that, whatever is in between the transmitter and receiver will absorb some of the energy! So a transmitter may be able to send a signal 30 miles to a receiver, but if you put a big wall in the way, the signal may well fail to penetrate even 10 feet of that wall, depending on what it is made of. There are materials used nowadays for corporate security that completely block microwave signals from leaving a building's walls; I think they put something in the paint that absorbs everything.
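The two effects described above (spreading loss with distance plus absorption by obstacles) are usually combined in a link budget, where losses expressed in dB simply add. A rough sketch; the transmit power and per-wall attenuation figures below are assumptions for illustration, not measured values:

```python
import math

def free_space_path_loss_db(freq_hz: float, distance_m: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d / lambda)."""
    lam = 299_792_458.0 / freq_hz
    return 20 * math.log10(4 * math.pi * distance_m / lam)

# Link budget in dB: received = transmitted - path loss - obstacle losses.
tx_dbm = 20.0                              # assumed 100 mW Wi-Fi transmitter
fspl = free_space_path_loss_db(2.4e9, 50)  # 50 m of free-space spreading
walls_db = 2 * 6.0                         # two interior walls at ~6 dB each (assumed)
rx_dbm = tx_dbm - fspl - walls_db
print(f"Path loss at 50 m: {fspl:.1f} dB")
print(f"Estimated received power: {rx_dbm:.1f} dBm")
```

A thick or absorptive wall can easily add tens of dB, which is why obstacles matter as much as raw distance.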

    Yes. The thing is, though, that a little router anyone can buy at CompUSA, Best Buy, or any electronics store transmits at maybe 100 mW total? I dunno... something really small like that. Microwave ovens, however, pump out microwaves at hundreds of watts into an enclosed cavity.

    Don't take my word for it; let's see if someone can verify what I'm talking about.
  8. Jul 4, 2005 #7
    I really appreciate the replies from you all! They really helped me clear up the concepts. :)

    Shakeel Ahmad
    Lahore, Pakistan.
  9. Jul 5, 2005 #8


    Gold Member

    Well, you're welcome :D Come back when you have additional questions.