**edlin** (#1)


**Energy Carried by Electromagnetic Waves- why is it wrong?**

Hi! I have this homework problem, and I just don't understand why my answer is wrong.

**A monochromatic light source emits 110 W of electromagnetic power uniformly in all directions.**

(a) Calculate the average electric-field energy density 3.00 m from the source.

(b) Calculate the average magnetic-field energy density at the same distance from the source.

(c) Find the wave intensity at this location.

First of all, I know that **(a)** and **(b)** will be the same. I have also calculated part **(c)**, which is correct: **0.9726 W/m^2**.

I have used several equations that should work for this type of problem.

**1.** The average Poynting vector S_av, written in terms of the magnetic and electric fields. I always get the same answer, which is **6.4*10^-8**. Then I use the energy density equation, which gives me **3*10^-9**.

**2.** I use S_av = cu. Since I have S_av and I know c, it seemed logical to use this. My answer is the same: **3*10^-9**.
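For reference, here is a short script that reproduces the arithmetic described above (the constants c and mu0 are assumed at their usual SI values; this just checks the numbers in the post, it is not the full homework solution):

```python
import math

# Assumed standard SI constants
c = 3.0e8                  # speed of light, m/s
mu0 = 4 * math.pi * 1e-7   # vacuum permeability, T*m/A

P = 110.0                  # emitted electromagnetic power, W
r = 3.00                   # distance from the source, m

# (c) Intensity of an isotropic point source: I = P / (4*pi*r^2)
I = P / (4 * math.pi * r**2)
print(f"I     = {I:.4f} W/m^2")    # ~0.9726, the value quoted in the post

# Total average energy density from S_av = c*u
u = I / c
print(f"u     = {u:.3e} J/m^3")    # ~3.2e-9, the value the post reports

# rms magnetic field consistent with that density: u = B_rms^2 / mu0
B_rms = math.sqrt(u * mu0)
print(f"B_rms = {B_rms:.2e} T")    # ~6.4e-8, the intermediate value above
```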

But it's WRONG, and I don't understand **WHY**! Please help me.

Thank you!