# Homework Help: Two Source Interference

1. Sep 7, 2007

### Funktimus

Hello, hope you all had a good summer

The question posed is:

Why are the two-source interference equations not valid for light from an incandescent bulb that shines onto a screen with a single slit, then from that slit onto a screen with two slits, and finally from the two slits onto a nearby screen?

and the choices are:
1. not monochromatic sources
2. incoherent sources
3. observed from a distance similar to or smaller than the separation between the sources

my thoughts:

I think 3 is one of the correct answers, since the question mentions "onto a nearby screen", but as for the other two I'm lost. My professor and book both said that the distance to the viewing screen has to be much greater than the separation of the two slits.

I doubt 2 is a correct choice, since nothing I've read or heard suggests the waves would be out of phase when they leave the "two" sources.

And as for 1, "not monochromatic sources": would the waves being diffracted at the single slit before reaching the two slits give them significantly different wavelengths?

I'm leaning toward 1 and 3, but with more confusion than confidence. Thoughts, please?
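Just to put numbers on the "much greater" condition: the usual fringe-spacing formula assumes the screen is far away. Here's a quick Python sketch (the slit spacing, screen distance, and wavelength are made-up illustrative values, not from the problem):

```python
import math

# Two-slit fringe spacing in the far-field approximation:
#   delta_y = wavelength * L / d, valid only when L >> d.
# Illustrative numbers (assumed, not given in the problem):
wavelength = 550e-9  # green light, metres
d = 0.2e-3           # slit separation, metres
L = 1.0              # slit-to-screen distance, metres

fringe_spacing = wavelength * L / d
print(fringe_spacing)  # a few millimetres between bright fringes

# The formula treats sin(theta) ~ tan(theta) ~ y/L, which only holds
# when the fringe angles are tiny, i.e. when L is much larger than d:
theta = math.atan(fringe_spacing / L)
print(math.sin(theta) / math.tan(theta))  # very close to 1 here
```

With L comparable to or smaller than d (choice 3), that small-angle step breaks down and the simple equations stop applying.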

Regards
Funky

2. Sep 7, 2007

### Dick

A source can be emitting many photons, which may be of different frequencies (non-monochromatic) or of different phases (incoherent). Either effect will wipe out interference effects. The distance to the viewing screen being large has more to do with the validity of the geometric approximations than with the actual presence of interference, and in any event it has nothing to do with the source being incandescent. Now what do you think?
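If it helps, you can see the non-monochromatic wash-out numerically. A quick sketch (the slit spacing, screen distance, and wavelength band are all made-up illustrative values):

```python
import math

# Far-field two-slit intensity at screen position y for wavelength lam:
#   I(y) proportional to cos^2(pi * d * y / (lam * L))
# Assumed illustrative geometry:
d, L = 0.2e-3, 1.0

def intensity(y, lam):
    return math.cos(math.pi * d * y / (lam * L)) ** 2

# A band of visible wavelengths, 400-700 nm, standing in for "white" light:
lams = [400e-9 + k * 10e-9 for k in range(31)]

def averaged(y):
    # Different wavelengths add in intensity (they can't interfere with
    # each other), so the observed pattern is the average over the band.
    return sum(intensity(y, lam) for lam in lams) / len(lams)

# Fringe contrast = (Imax - Imin) / (Imax + Imin), sampled well away
# from the central fringe, where the colours have drifted out of step:
ys = [20e-3 + i * 1e-4 for i in range(30)]

white = [averaged(y) for y in ys]
white_contrast = (max(white) - min(white)) / (max(white) + min(white))

mono = [intensity(y, 550e-9) for y in ys]
mono_contrast = (max(mono) - min(mono)) / (max(mono) + min(mono))

print(mono_contrast)   # essentially 1: deep fringes for one wavelength
print(white_contrast)  # much smaller: broadband fringes are washed out
```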

Last edited: Sep 7, 2007
3. Sep 8, 2007

### Funktimus

I'm confused differently now. You say...

"A source can be emitting many photons, which may be of different frequencies (non-monochoromatic) or of different phases (incoherent). Either effect will wipe out interference effects."

It's been my understanding that a single source (such as a light bulb, laser, or the sun) would not prevent interference. My reasoning is: isn't that what Young did back in 1801? He let sunlight through two slits and it made a bunch of dark and light fringes on a screen; the waves interfered. So why would a light bulb be any different? Is there something significant about light from an incandescent source? Or does the single-slit diffraction before the double-slit interference cause the source to be non-monochromatic (and maybe incoherent)?

4. Sep 8, 2007

### Dick

I'm now confused as well. I misspoke, sorry. I don't think incoherence would be an issue for this type of experiment. The first slit ensures that there are only two well-defined paths to take to the screen, and the difference in path length together with the wavelength determines the interference at any given point. But I don't know how he dealt with the monochromatic issue. Do you? I would think sunlight would be enough of a mix of colors to make interference difficult to observe. Am I wrong? You could always pass the sunlight through a prism before it hits the first slit to improve that.
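One way to see why the mix of colours matters: each wavelength puts its m-th bright fringe in a different place, so only the central fringe lines up for all colours. A quick sketch (the geometry values are my own assumptions, not from the thread):

```python
# Bright fringes satisfy d * sin(theta) = m * lam; for small angles the
# m-th fringe sits at y_m ~ m * lam * L / d, which scales with wavelength.
# Assumed illustrative geometry:
d, L = 0.2e-3, 1.0

def fringe_position(m, lam):
    return m * lam * L / d

y_violet = fringe_position(1, 400e-9)  # first violet bright fringe
y_red = fringe_position(1, 700e-9)     # first red fringe, further out
print(y_violet, y_red)

# Only the central (m = 0) fringe is wavelength-independent, which is
# why a white-light pattern has a white centre with coloured fringes:
assert fringe_position(0, 400e-9) == fringe_position(0, 700e-9) == 0.0
```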

5. Sep 8, 2007

### Funktimus

Yeah, I don't think incoherence is an issue either. I remember my professor saying Young did the two-slit experiment with sunlight because it was the only way he could ensure the waves would be in phase, since it's just one source "separated into two."

But it does in fact turn out to be non-monochromatic.

Thanks for the help.

6. Apr 23, 2009

### chocolatier

It turns out that they are coherent, but apparently not monochromatic... which worries me because of what you said about Young. I'm not entirely sure why they would be considered coherent but not monochromatic - nothing in the question indicates as much. Basically, I'm as confused as ever, but the answer would be choices 1 and 3.