Yesterday our lab members discussed the diffraction of an incoherent beam. I argued that if the beam is completely incoherent, it will not produce diffraction patterns that follow a sinc- or Bessel-like function. My argument is that the width of the diffraction pattern is determined by the light's coherence area: if the coherence area is small, the diffraction pattern will be wider than that of perfectly coherent light. (We were talking about far-field diffraction using a lens.) If the coherence area → 0, the light would be spread all across the focal plane of the lens. I think such an effect is consistent with the van Cittert–Zernike theorem.

However, the other lab members did not agree with me. They said that such a conclusion contradicts our everyday experience, giving sunlight and fluorescent tube light as examples: if we focus such incoherent light, we do not see the effect I argued above.

I calculated the effect quantitatively today. It seems that the coherence area of sunlight (~0.02 mm) is too large to see the effect I described, at least if we use a lens with a focal length of 10 cm or so. That means that even though the intensity distribution at the focal plane of such a lens appears focused, it is still consistent with my argument.

Anyway, I'm not sure whether my method of calculating the effect is correct. I modeled partially coherent light as many independent wave packets, each with a small coherence area, and treated the coherence area like a slit width, so that the width of the diffraction pattern is determined by the coherence area. I don't know exactly how to calculate the pattern width for partially coherent light. I'm reading Born and Wolf on this subject, but I'm still confused. Is there a compact explanation of this kind of effect?
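For reference, here is a minimal back-of-the-envelope sketch of my estimate, assuming λ ≈ 550 nm, the Sun's angular diameter θ ≈ 9.3 mrad, and the transverse coherence radius convention d_coh ≈ λ/(πθ) from the van Cittert–Zernike theorem (these exact values and prefactors are my own assumptions; conventions differ):

```python
import math

# Assumed parameters (typical order-of-magnitude values, not precise data)
lam = 550e-9        # mean wavelength of sunlight [m]
theta_sun = 9.3e-3  # angular diameter of the Sun [rad] (~0.53 degrees)
f = 0.10            # lens focal length [m]

# van Cittert-Zernike estimate of the transverse coherence radius of sunlight.
# One common convention is d_coh ~ lambda / (pi * theta); prefactors vary by definition.
d_coh = lam / (math.pi * theta_sun)

# Treat the coherence area as an effective "slit width": the coherence-limited
# spot width at the focal plane is then of order lambda * f / d_coh.
spot_coherence = lam * f / d_coh   # algebraically equal to pi * f * theta_sun

# Compare with the geometric image of the Sun formed by the same lens.
spot_geometric = f * theta_sun

print(f"coherence radius       ~ {d_coh * 1e3:.3f} mm")   # ~0.02 mm, as quoted above
print(f"coherence-limited spot ~ {spot_coherence * 1e3:.2f} mm")
print(f"geometric sun image    ~ {spot_geometric * 1e3:.2f} mm")
```

Note that λf/d_coh = πfθ, so in this model the coherence-limited spot and the geometric image of the Sun come out the same order of magnitude (a few mm vs. ~1 mm), which is why focused sunlight can still "look" focused while remaining consistent with the argument.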