Microwave ovens versus WiFi signals

  • Context: High School
  • Thread starter: JimiJams
  • Tags: Microwave, Signals, WiFi

Discussion Overview

The discussion centers on the safety and effects of microwave radiation from microwave ovens compared to WiFi signals, both of which operate at similar frequencies. Participants explore the reasons for differing safety perceptions and the mechanisms of how these waves interact with matter, particularly in relation to heating water molecules in the body.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • Some participants note that microwave ovens emit 2.45 GHz frequencies and are designed to heat water molecules, raising concerns about safety when in proximity to them.
  • Others argue that the wattage of microwave ovens (around 1,000 watts) is significantly higher than that of WiFi devices (less than 1 watt), suggesting this difference in power output accounts for the safety concerns.
  • A participant mentions that unless the shielding of a microwave oven is damaged, the risk of harm is practically zero, even when standing close to it.
  • There is a discussion about the nature of electromagnetic (EM) waves, with some participants questioning why microwaves are considered more dangerous than infrared waves, despite infrared having higher energy per photon.
  • One participant explains that microwaves are classified as non-ionizing radiation and cannot cause ionization, but they can still heat internal organs without the immediate pain response that skin burns from infrared might cause.
  • Another participant raises the point that while infrared waves have more energy per photon, microwaves can penetrate the skin differently, leading to confusion about their relative dangers.
  • Some participants express uncertainty about the relationship between wavelength, frequency, and penetration of EM waves, indicating a need for further exploration of electromagnetic theory.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the safety of microwaves versus WiFi signals, with multiple competing views on the mechanisms of interaction between EM waves and biological tissues remaining unresolved.

Contextual Notes

The discussion highlights limitations in understanding the interaction of different EM waves with matter, particularly regarding the nuances of penetration and energy transfer, which are not fully resolved.

JimiJams
Hi, just a simple question. We've always been told that microwave ovens are dangerous and to not stand near them when they're on, obviously because they have the ability to heat up water molecules. I just read that the microwave frequencies they emit are 2.45 GHz. Wifi signals consist of 2.4 GHz frequencies. Why are we not concerned with wifi signals? Is it because the concentration of waves is so much lower than in a microwave oven?
 
JimiJams said:
Hi, just a simple question. We've always been told that microwave ovens are dangerous and to not stand near them when they're on, obviously because they have the ability to heat up water molecules.

Unless the shielding is damaged you are at practically zero risk even when standing next to them, on top of them, or under them.

I just read that the microwave frequencies they emit are 2.45 GHz. Wifi signals consist of 2.4 GHz frequencies. Why are we not concerned with wifi signals? Is it because the concentration of waves is so much lower than in a microwave oven?

Pretty much. My microwave puts out 1,000 watts of power as microwaves. A standard wifi device puts out less than 1 watt as microwaves, and it isn't bottled up in a small box where the microwaves bounce around until absorbed. Your body only gets exposed to a very small percentage of this small amount, since the transmitter is mostly omnidirectional, and of the small amount that does pass through you, very little is actually absorbed. Since microwaves are far too low in energy per photon to ionize molecules, they don't do any damage to you when absorbed. They act similarly to visible light in that they simply heat you up an absurdly tiny amount. The light bulb in my lamp across the room heats me up more than my wifi router does.

It's comparable to having a campfire in your oven vs a candle on your desk.
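The comparison above can be put into a rough back-of-the-envelope calculation. This is a sketch only: the 0.1 W wifi transmit power, the 1 m distance, and the perfectly omnidirectional antenna are illustrative assumptions, not measured values.

```python
import math

# Rough comparison of microwave oven output vs. wifi exposure.
# All numbers are order-of-magnitude assumptions for illustration.

oven_power_w = 1000.0   # magnetron output, confined inside the oven cavity
wifi_power_w = 0.1      # typical wifi transmit power (often 100 mW or less)
distance_m = 1.0        # assumed distance from the router

# An omnidirectional transmitter spreads its power over a sphere,
# so the power density falls off as 1/r^2.
wifi_density = wifi_power_w / (4 * math.pi * distance_m**2)  # W/m^2

print(f"Wifi power density at {distance_m} m: {wifi_density * 1000:.1f} mW/m^2")
print(f"Oven output / wifi output: {oven_power_w / wifi_power_w:.0f}x")
```

At one metre the wifi power density works out to around 8 mW/m², and only a fraction of that intersects your body at all, let alone gets absorbed.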
 
Thanks that's just what I was thinking. Although, drakkith, I remember my chemistry book distinguishing microwaves as a dangerous form of low-energy radiation. More dangerous than infrared and visible light because they have the ability to heat up water molecules inside our bodies. That never made sense to me because if all EM waves are essentially the same thing but at different frequencies, how can a microwave be more dangerous than the waves with more energy? Wouldn't x amount of infrared more effectively heat up a given amount of water than x amount of microwaves?
 
JimiJams said:
Thanks that's just what I was thinking. Although, drakkith, I remember my chemistry book distinguishing microwaves as a dangerous form of low-energy radiation. More dangerous than infrared and visible light because they have the ability to heat up water molecules inside our bodies. That never made sense to me because if all EM waves are essentially the same thing but at different frequencies, how can a microwave be more dangerous than the waves with more energy? Wouldn't x amount of infrared more effectively heat up a given amount of water than x amount of microwaves?

EM waves of different frequencies have different photon energies. A photon is a little packet of energy through which an EM wave interacts with matter. It can only interact in these packets, passing along exactly that amount of energy. Very high frequency EM waves, such as UV and X-rays, have photons so energetic that each packet deposits enough energy all at once to knock electrons loose from molecules and atoms, aka ionization. This is the reason UV and higher frequencies are dangerous. They can completely destroy molecular bonds, which in living organisms leads to things like DNA damage and other effects.

Now, down at the microwave and infrared range things are similar, but the photons have so little energy that they can't deposit enough energy to a single electron all at once to cause ionization. Thus they are called non-ionizing radiation. The reason microwaves are considered more dangerous is this:

Consider a scenario in which you place an infrared emitter putting out 10,000 watts right in front of your stomach. All of those 10,000 watts are deposited into your skin, causing it to heat up very quickly and start to burn. BUT everything else behind your skin is untouched. (Unless you just stand there for a bit of course)

Now, if we had a 10,000 watt microwave emitter, those same 10,000 watts would be beamed through you in the form of microwaves. Instead of your skin taking the full force of the heating, most of your internal organs would be blasted by microwaves and heating up. The kicker is that inside your organs you don't have nerves that detect heat (not that I know of, at least), so not only are your internal organs being baked, you don't even really realize it's happening. Sure, you may feel some pain and discomfort, but it's not the horrible burning sensation you get when your skin is being blasted with infrared radiation.

In reality people don't stand in front of 10 kW emitters like this, but they do sometimes accidentally stand in front of high power transmitters like radars. Since they may not feel extreme pain immediately, they may not realize the transmitter is turned on and that they are being injured, which can lead to severe internal injuries and death.
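The photon-energy argument above can be checked with E = hf. A quick sketch (the infrared and UV frequencies below are representative values chosen purely for illustration):

```python
# Photon energy E = h * f for a few parts of the EM spectrum, in electron-volts.

PLANCK_J_S = 6.626e-34      # Planck's constant, J*s
EV_PER_J = 1 / 1.602e-19    # conversion from joules to electron-volts

def photon_energy_ev(freq_hz):
    """Energy carried by a single photon at the given frequency."""
    return PLANCK_J_S * freq_hz * EV_PER_J

for name, freq in [("microwave (2.45 GHz)", 2.45e9),
                   ("infrared (~100 THz)", 1e14),
                   ("ultraviolet (~1 PHz)", 1e15)]:
    print(f"{name}: {photon_energy_ev(freq):.2e} eV")

# Typical molecular ionization energies are several eV, so a 2.45 GHz photon
# (~1e-5 eV) falls short by roughly five orders of magnitude.
```

This is why no amount of microwave power can ionize anything: each individual packet is far too small, no matter how many packets arrive per second.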
 
All radiating equipment (cell phones, radios, microwaves, x-rays, etc.) has safe threshold limits. These limits take into consideration power, length of exposure, and leakage.
 
That's what I don't get, Drakkith. If infrared waves contain more energy per photon than microwaves, why is it that microwaves can penetrate our skin but infrared cannot? It seems to make microwaves the oddball of the EM spectrum.
 
JimiJams said:
That's what I don't get, Drakkith. If infrared waves contain more energy per photon than microwaves, why is it that microwaves can penetrate our skin but infrared cannot? It seems to make microwaves the oddball of the EM spectrum.

Why's that? Did you know that everything of a lower frequency than infrared penetrates our skin, including radio waves? (And to be honest, your skin isn't a perfectly opaque material; I can see a flashlight through the skin on my hand and fingers.) Once we get into the X-ray range and above, waves start to penetrate again. The reason is extremely complicated and involves EM theory; I couldn't hope to explain it, as I don't understand the details myself. Just remember that it's the wavelength that determines both the energy per photon and the interaction between the EM wave and matter.
 
The wavelength and frequency are inversely proportional, so you could probably say either is responsible. So lower frequency waves will penetrate our skin, then IR, visible, and UV do not, but everything higher than that begins to penetrate our skin again? That's very strange; I'll have to do some reading on EM waves. I thought there was a simple trend to these waves (shorter wavelength = more energy), but it seems there's more going on.
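That inverse relation, λ = c/f, is easy to check numerically (using the approximate speed of light):

```python
# Wavelength from frequency: lambda = c / f.
C_M_PER_S = 3.0e8          # speed of light, approximate

wifi_freq_hz = 2.45e9      # wifi / microwave oven frequency
wavelength_m = C_M_PER_S / wifi_freq_hz

print(f"2.45 GHz corresponds to a wavelength of about {wavelength_m * 100:.1f} cm")
```

A 2.45 GHz wave is about 12 cm long, which is comparable to the size of a human torso, whereas infrared wavelengths are on the order of micrometres. That difference in scale is part of why the two interact with tissue so differently.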
 
