
Non radiative magnetic field in the MHz

  1. Oct 17, 2011 #1
    I came across an article about wireless energy transfer.

    In the article, it spoke of a non-radiative magnetic field of several MHz.

    As far as I was aware, once you oscillate a magnetic field above a few kilohertz it radiates. How is a non-radiative field of several megahertz possible?

    Or was this an old April Fools' joke? If so, then more fool them, because they have started a company.

  3. Oct 17, 2011 #2



    Here is a Wikipedia article that gives a fairly decent description of the near and far fields of an antenna.
    The system you cite uses the near field.

    I have added a paragraph from the Wikipedia article:
    The "far-field", which extends from about two wavelengths distance from the antenna to infinity, is the region in which the field acts as "normal" electromagnetic radiation. The power of this radiation decreases as the square of distance from the antenna, and absorption of the radiation has no effect on the transmitter.

    By contrast, the "near-field", which is inside about one wavelength distance from the antenna, is a region in which there are strong inductive and capacitative effects from the currents and charges in the antenna, which cause electromagnetic components that do not behave like far-field radiation. These effects decrease in power far more quickly with distance than does the far-field radiation power.

    Also, in the part of the near field closest to the antenna (called the "reactive near-field," see below), absorption of electromagnetic power in the region by a second device has effects which feed back to the transmitter, increasing the load on the transmitter that feeds the antenna by decreasing the antenna impedance that the transmitter "sees." Thus, the transmitter can sense that power has been absorbed from the closest near-field zone, but if this power is not absorbed by another antenna, the transmitter does not supply as much power to the antenna, nor draw as much from its own power supply.

    Finally, the "transition zone" between these near and far field regions, extending over the distance from one to two wavelengths from the antenna, is the intermediate region in which both near-field and far-field effects are important. In this region, near-field behavior dies out and ceases to be important, leaving far-field effects as the dominant interaction.
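    To put rough numbers on the one- and two-wavelength rules in that quote, here is a back-of-the-envelope sketch (my own illustration, not from the article; the 10 MHz figure is just an example of a "several MHz" system):

    ```python
    # Estimate the near-field / far-field boundaries for a given operating
    # frequency, using the ~1 wavelength and ~2 wavelength rules of thumb
    # from the Wikipedia quote above.

    C = 299_792_458.0  # speed of light, m/s

    def field_regions(freq_hz):
        """Return (wavelength, near-field edge, far-field start) in metres."""
        wavelength = C / freq_hz
        return wavelength, 1.0 * wavelength, 2.0 * wavelength

    # An example "several MHz" system at 10 MHz:
    lam, near_edge, far_start = field_regions(10e6)
    print(f"wavelength ~ {lam:.1f} m")                     # ~30 m
    print(f"near field out to ~ {near_edge:.0f} m, far field beyond ~ {far_start:.0f} m")
    ```

    So at a few MHz the near field extends over room-scale distances, which is exactly the regime these wireless-power systems exploit.
    
    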
  4. Oct 17, 2011 #3
    Thank you

    So it's not that they don't have a radiative component to their system; it's just that any radiative component is considered a loss.

    That would mean, if my assumptions are correct, that when a device to be charged is in the near field, most of the energy supplied by the source (antenna) would be coupled to the sink (device), and therefore most of the energy is absorbed by the sink. But remove the sink from the near field, and some, if not most, of the energy from the source becomes radiative and is wasted.

    It is easy to see that, via a feedback system, the power could be reduced (or shut off) when the sink moves out of the near field, to minimize radiative losses. I mean over and above the reduction in power from reactive changes to the system.
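    A hypothetical sketch of that feedback idea (my own toy model, not anything from the article): the transmitter watches how much of its output is actually absorbed by the load, which it can infer from the change in antenna impedance it "sees", and throttles the drive when coupling drops.

    ```python
    # Toy model of the proposed feedback: scale the drive power with the
    # fraction of output absorbed by the sink, and shut off entirely when
    # coupling falls below a threshold (i.e. the sink has left the near
    # field), so little power is left to radiate away.  All names and
    # numbers here are illustrative assumptions.

    def drive_level(coupled_fraction, full_power=10.0, threshold=0.2):
        """coupled_fraction: fraction of output absorbed by the sink (0..1)."""
        if coupled_fraction < threshold:
            return 0.0                       # sink gone: shut off to avoid radiative loss
        return full_power * coupled_fraction # otherwise scale drive to the load

    print(drive_level(0.8))  # sink well coupled: drive near full power
    print(drive_level(0.1))  # sink out of range: drive shut off
    ```
    
    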

    Why has it taken so long for something like this to become a commercial system?
    This seems like pretty old knowledge. Essentially it's the gap between a transformer and an antenna.
    It's now obvious from my reading of Tesla's work that the near field is what he relies on in many of his experiments, in particular his wireless power system.

    If you use the Earth as your antenna, then everything on the surface is in its near field.

    I wonder what the health implications of a (strong) near field are.

  5. Oct 18, 2011 #4



    Quote from the article:

    So 60% of the power is being lost. I bet that is why it has not been used before: the power wasted. But since "wireless" is a catchphrase nowadays, they are attempting to follow the flow of technology. Mind you, I do not like a mess of wires, so I would prefer that the system be perfected.

    Very good question. Would that be like having your own MRI at home? What frequency do those work at?
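    To answer the MRI aside with a quick calculation: clinical scanners operate at the proton Larmor frequency, f = gamma * B0, where gamma is about 42.58 MHz per tesla for hydrogen nuclei.

    ```python
    # Proton Larmor frequency: the RF frequency an MRI scanner works at
    # for a given static field strength B0.

    GAMMA_MHZ_PER_T = 42.58  # proton gyromagnetic ratio / 2*pi, MHz/T

    def larmor_mhz(b0_tesla):
        """RF operating frequency (MHz) for a scanner with field B0 (T)."""
        return GAMMA_MHZ_PER_T * b0_tesla

    print(larmor_mhz(1.5))  # typical 1.5 T scanner: ~63.9 MHz
    print(larmor_mhz(3.0))  # 3 T scanner: ~127.7 MHz
    ```

    So MRI scanners do indeed sit in the tens-of-MHz range, the same order as these wireless-power systems, though the field geometries and intensities are very different.
    
    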