pizzadude
Hi. All the books I read depict the electric fields and magnetic fields in electromagnetic waves as being in phase - meaning they reach their max or their min at the same time. The books also say that the changing electric field creates magnetic field and changing magnetic field creates electric field.
If the electric field is described by a sine function, then its rate of change is greatest where it crosses zero and smallest at its extremes. A larger rate of change of the electric field should produce a larger magnetic field, so as the electric field passes through zero (its fastest rate of change), the magnetic field should be at its maximum. This leads me to think the two fields should be 90 degrees out of phase in an electromagnetic wave.
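The derivative argument above can be checked numerically: sampling E(t) = sin(t) and its analytic derivative cos(t) (a minimal sketch, not a field solution) shows the rate of change is near zero exactly where the field peaks, i.e. a quarter-cycle offset.

```python
import math

# Sample E(t) = sin(t); its time derivative is cos(t), which peaks
# where sin(t) crosses zero -- a 90-degree phase offset.
ts = [i * 2 * math.pi / 1000 for i in range(1000)]
E = [math.sin(t) for t in ts]
dE = [math.cos(t) for t in ts]  # analytic derivative of sin(t)

# Find the sample where E is largest; dE/dt should be ~0 there.
i_peak = max(range(len(E)), key=lambda i: E[i])
print(abs(dE[i_peak]) < 1e-2)  # prints True
```

This only confirms the mathematical relationship between a sinusoid and its derivative; whether the in-phase picture in the books is right depends on how both of Maxwell's curl equations are satisfied together, which is what the question is asking about.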
I have also read that near a transmitter the fields are 90 degrees out of phase, but that farther from the transmitter they come into phase. Can someone explain why?