## Shouldn't the electromagnetic fields in waves be 90 degrees out of phase?

Hi. All the books I have read depict the electric and magnetic fields in electromagnetic waves as being in phase, meaning they reach their maxima and minima at the same time. The books also say that a changing electric field creates a magnetic field and a changing magnetic field creates an electric field.

If the electric field is described by a sine function, then its rate of change is greatest when the field passes through zero and smallest when it reaches its maximum. A larger rate of change of the electric field should produce a larger magnetic field, so as the electric field approaches zero (fastest change), the magnetic field should approach its maximum. That leads me to think the two fields should be 90 degrees out of phase with each other in an electromagnetic wave.

I have also read that near a transmitter the fields are 90 degrees out of phase, but that as they move farther from the transmitter they come into phase. Can someone explain why?
For plane waves (what you get a long distance from the source), they are in phase, and the characteristic impedance ($Z_0$) of free space (or whatever medium they are in) is the scaling factor between them.
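As a quick numeric check of that scaling (a minimal Python sketch; the field amplitude `H_peak` is a made-up example value, not from the thread):

```python
import math

# Vacuum permeability and permittivity (SI values)
mu_0 = 4 * math.pi * 1e-7        # H/m
eps_0 = 8.8541878128e-12         # F/m

# Characteristic impedance of free space: Z_0 = sqrt(mu_0 / eps_0)
Z_0 = math.sqrt(mu_0 / eps_0)
print(f"Z_0 = {Z_0:.2f} ohms")   # ~376.73 ohms

# For an in-phase plane wave, |E| = Z_0 * |H| at every instant,
# so the two amplitudes peak together.
H_peak = 0.01                    # A/m (arbitrary example amplitude)
E_peak = Z_0 * H_peak            # V/m
print(f"E_peak = {E_peak:.2f} V/m")
```

The key point is that $Z_0$ is a real (resistive) number, so it rescales the fields without shifting their phase.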
Hi rbj. Thanks for answering my question. Frankly, I don't really understand it, but it seems I am not the only one. I searched around and found a paper, probably written by a graduate student, who seems convinced that the electric and magnetic fields travel 90 degrees out of phase in electromagnetic waves. Here is part of what he wrote:

> The Poynting Vector expects the E and H fields to be in phase to get Watts/m2 average real power flow. But how can this be when it is believed that the two fields support each other as they travel through space? The energy travelling in space would have sinusoidal variations in amplitude. If this is the case, then where is the energy coming from and going to as it travels through free space? Does this violate the conservation of energy law? I maintain that there is a phase difference of 90 degrees between the E and H fields. Further to that, I believe that EM energy moves between the E and H fields supporting each other as they travel. And lastly, that the sum of the E and H field energy densities would be constant over complete cycles and decrease by 1/r2 from the energy source. See the equations and waveforms below: (Note that in figures 1A and 1B, E0 and H0 represent the magnitudes of the Electric and Magnetic fields respectively, whereas the red U0 and blue U0 represent the magnitudes of the Electric and Magnetic field energy densities)

The full paper can be found at: http://myweb.tiscali.co.uk/teslatutorial/EMCAH.doc — it's only 3 pages long. I find it interesting. I hope to hear other people's opinions on it. Thanks


pizzadude - Take another look at Maxwell's equations in free space. The relationship between the electric and magnetic fields involves the time derivative of one and the curl (a kind of spatial derivative) of the other. The spatial derivative of a sine function is a cosine, but so is its time derivative, so the two cosines match up. The plane-wave solutions of electromagnetic radiation have the fields in phase.
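That argument can be checked symbolically. Below is a minimal sketch (my own illustration, assuming a 1D plane wave $E_y(x,t) = E_0 \sin(kx - \omega t)$ and the 1D form of Faraday's law, $\partial B_z/\partial t = -\partial E_y/\partial x$):

```python
import sympy as sp

x, t = sp.symbols('x t', real=True)
k, w, E0 = sp.symbols('k omega E_0', positive=True)

# Plane-wave electric field E_y(x, t)
E = E0 * sp.sin(k * x - w * t)

# Faraday's law in 1D: dB_z/dt = -dE_y/dx
dBdt = -sp.diff(E, x)

# Integrate in t to recover B_z (dropping any static constant)
B = sp.integrate(dBdt, t)

# B/E reduces to the constant k/omega: the fields are IN phase,
# because the spatial derivative AND the time integral each
# shift the phase by 90 degrees, and the two shifts cancel.
ratio = sp.simplify(B / E)
print(ratio)  # k/omega
```

This is the resolution of the original question: differentiating in space turns the sine into a cosine, but integrating in time turns it right back into a sine, so $B$ tracks $E$ with no phase offset.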
