Ippolitos
I have a problem understanding the intensity of light. Let's assume monochromatic light of a single wavelength. The energy of the wave is constant and equal to h*f, where h is Planck's constant and f is the frequency of the wave. The intensity of this wave is its energy divided by a given area.
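For concreteness, here is a minimal sketch (in Python) of the quantities I mean: the per-photon energy h*f, and an intensity computed as energy delivered per second over an area. The frequency, photon rate, and area are hypothetical numbers chosen only for illustration:

```python
# Minimal sketch of the quantities in the question, in SI units.
# The frequency (~red light), photon rate, and area are hypothetical.

h = 6.626e-34  # Planck's constant, J*s
f = 4.3e14     # frequency of the monochromatic wave, Hz

E_photon = h * f  # energy of one quantum of this wave, E = h*f

photons_per_second = 1e18  # hypothetical emission rate of the source
area = 1e-4                # hypothetical illuminated area, m^2 (1 cm^2)

power = photons_per_second * E_photon  # energy per second, W
intensity = power / area               # W/m^2

print(f"Energy per photon: {E_photon:.3e} J")
print(f"Intensity: {intensity:.3e} W/m^2")
```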
What I can't understand is how two sources emitting the same monochromatic wave toward the same area can have different intensities. The area is the same, h is a constant, and if we change f then we are talking about a different wave.
What am I missing?