# Spherical waves, Optics, phase, homework help please

• heycoa
In summary, the conversation discusses a problem involving a point source emitting a spherical wave with a wavelength of 500 nm. The observer is located far away and only interacts with the light across a small area, approximating the local wave as a plane wave. The question is how far the observer must be from the source so that the phase of the wave deviates by less than 36° over an illuminated spot 3.9 cm in diameter. The conversation covers the concept of phase difference and path difference, and the solution involves finding the distance d between the source and the illuminated spot, which is calculated to be approximately 3,802.5 m.
The question is as follows:

A point source emits a spherical wave with λ = 500 nm. If an observer is far away from the source and is only interacting with the light across a small area, one can approximate the local wave as a plane wave. How far from the source must the observer be so that the phase of the wave deviates by less than 36° over an illuminated spot 3.9 cm in diameter?

Hello, heycoa. Try to give us some idea of what you understand and don't understand about the problem. Do you see why the phase of the wave differs for different points of the illuminated spot?

Well, I don't really understand how the phase differs. It says in the problem that the wave can be treated as a plane wave. I was thinking that the phase should remain constant throughout the lit spot.

heycoa said:
Well, I don't really understand how the phase differs. It says in the problem that the wave can be treated as a plane wave.
That's not all it says... it says that it can be treated as a plane wave across a small area. This bit is key.

The problem also tells you what kind of wave it is.
What kind of wave is it?

Can you sketch wave-fronts for such a wave? Do so.
The wavefronts are lines of equal phase.

If you draw a line some distance away from the source of the fronts - you can see how different parts of the line experience a different phase of the wave (not all the line is on the same wavefront).

OK, so it's basically like flat, square wave-fronts, but when they reach the circular area, the center of the wave-front hits the area first?

Yes, that's right. For which points of the illuminated spot is there the greatest phase difference?

To me it's very difficult to visualize. I am trying to work this out on paper, but I just cannot make the connection between the circular area and the phase difference. I do not know what denotes a difference in phase.

I am just making a guess, but is the difference in phase greatest when the wave is bisecting the circular area, because more of the wave is covered up than the rest? This seems to be a very abstract way of calculating a phase difference. Maybe I am just slow :(

Have you studied the connection between "path difference" and phase difference? Which point(s) of the illuminated spot is (are) closest to the source? Which is (are) farthest from the source?

If I set an origin in the center of the illuminated area, then the point closest to the source is -1.95 cm, and the furthest would be 1.95 cm. I have not come across "path difference" yet.

Path difference just refers to the difference in distance that light travels along two different paths. The attached picture shows the illuminated spot from a point source S. Which point of the spot is closest to S and which point(s) is farthest?

#### Attachments

• Phase difference.jpg
13.2 KB
A is closest and B is furthest. It seems that I was totally confusing the setup. Also, I guess I did not realize that I could be using ray tracing. For some reason I was stuck with waves in my head.

Thank you, by the way, for spending so much time with me!

So this means that the longer path will result in a phase shift, since that light travels further. I basically have to find the correct r that gives me a phase difference of 36 degrees. I was totally overthinking the problem!

Actually, I need to find the correct distance from the illuminated area.

so can I use the angle between the hypotenuse and d? Does setting that angle to 36 degrees produce the correct phase shift?

heycoa said:
so can I use the angle between the hypotenuse and d? Does setting that angle to 36 degrees produce the correct phase shift?

No, that's not going to get it.

The attached drawing shows a red wavefront just reaching the center of the spot and the path difference is shown in blue.

If the path difference h - d happened to equal one-half wavelength, then the phase difference between the light reaching A and the light reaching B would be 1/2 of 360° = 180°. (One full wavelength corresponds to a phase difference of 360°.)

What fraction of a wavelength would the path difference need to be to get a phase difference of 36°?

#### Attachments

• Phase difference path diff.jpg
15.5 KB
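The proportionality being used here can be written out compactly (a standard relation, added for reference rather than quoted from the thread): the phase difference is the path difference expressed as a fraction of a wavelength,

$$\frac{\Delta\phi}{360^\circ} = \frac{h - d}{\lambda}.$$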
1/10 of the 500 nm wavelength

Right. So you need to find the distance d such that h - d = λ/10. Hint: express h in terms of d and r using geometry.
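The geometry hint can be carried through explicitly (a standard binomial approximation, not spelled out in the original thread): with spot radius $r$,

$$h = \sqrt{d^2 + r^2} \;\approx\; d + \frac{r^2}{2d}, \qquad h - d \approx \frac{r^2}{2d} = \frac{\lambda}{10} \;\Rightarrow\; d \approx \frac{5\,r^2}{\lambda}.$$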

I don't know if I did this right, but I considered the path difference to be only 50 nm, then I used the Pythagorean theorem as follows: x^2 + 0.0195^2 = (x + 50*10^-9)^2. I solved for x, and I got 3,802.5 m. Does this appear correct to you?

In this case x = d and (x + 50*10^-9) = h.

heycoa said:
I don't know if I did this right, but I considered the path difference to be only 50 nm, then I used the Pythagorean theorem as follows: x^2 + 0.0195^2 = (x + 50*10^-9)^2. I solved for x, and I got 3,802.5 m. Does this appear correct to you?

Yes, I think that's right. (I'm a bit surprised at how large the distance d turns out to be. Over 2 miles! But the calculation appears correct.)

Good work.
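The arithmetic above is easy to verify with a short script (a sketch I added to the thread; the variable names are my own):

```python
wavelength = 500e-9        # wavelength, m
r = 0.039 / 2              # spot radius (half of 3.9 cm), m
delta = wavelength / 10    # path difference h - d for a 36 degree phase shift

# Exact: d**2 + r**2 = (d + delta)**2  =>  d = (r**2 - delta**2) / (2 * delta)
d_exact = (r**2 - delta**2) / (2 * delta)

# Approximation: h - d ~ r**2 / (2 d)  =>  d ~ 5 * r**2 / wavelength
d_approx = 5 * r**2 / wavelength

print(d_exact, d_approx)   # both come out to about 3802.5 m
```

Because the path difference (50 nm) is tiny compared with d, the exact and approximate answers agree to well under a millimeter.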

I cannot thank you enough for your time, patience, and energy in helping me. Thank you very very much, I really appreciate it. You helped me learn something that I should have already and stuck with it. Thanks again TSny!

P.S. How do I give you points or the medal for helping me?

Glad to help. ( No points or medals around here. Wouldn't know what to do with 'em if I got 'em )

## 1. What are spherical waves in optics?

Spherical waves in optics are waves that emanate from a point source and propagate outward as concentric spherical wavefronts. They can be described mathematically using Huygens' principle, which states that every point on a wavefront acts as a source of secondary spherical wavelets.
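For reference, the standard mathematical form of a scalar spherical wave (not given in the thread itself) is

$$E(r, t) = \frac{A}{r}\cos(kr - \omega t), \qquad k = \frac{2\pi}{\lambda},$$

where the $1/r$ falloff in amplitude keeps the power flowing through each spherical wavefront constant.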

## 2. How is phase related to spherical waves in optics?

Phase in optics describes where a point sits within the wave's cycle at a given instant. For a spherical wave, the surfaces of constant phase (the wavefronts) are spheres centered on the source, so the phase determines the position of the wavefront, which is important in understanding how the wave will propagate and interfere with other waves.

## 3. What is the difference between coherent and incoherent spherical waves?

Coherent spherical waves refer to waves that have a constant phase relationship, meaning that their wavefronts are always in sync. Incoherent spherical waves, on the other hand, have varying phase relationships and their wavefronts do not stay in sync.

## 4. How can spherical waves be used in optics applications?

Spherical waves have a variety of applications in optics, including in holography, diffraction, and imaging. They can also be used to study the behavior of light in different mediums and to understand the principles of interference and diffraction.

## 5. Can you provide any tips for completing homework on spherical waves and optics?

When completing homework on spherical waves and optics, it is important to understand the fundamental concepts and equations. Practice drawing diagrams and solving problems to solidify your understanding. Additionally, try to relate the concepts to real-world applications to better understand their significance.
