[Ray Tracing] Wavefronts & Reception Sphere

AI Thread Summary
Wavefronts in ray tracing are the local plane waves associated with each ray; they are perpendicular to the rays themselves. The reception sphere is a construct that decides whether a launched ray is received: a sphere is built around the receiver with a radius set by the angular separation between launched rays and the unfolded distance the ray has traveled, and a ray that passes within it contributes to the received field. Only one ray per actual propagation path should contribute to the total received power. The double count problem arises when the reception sphere catches two neighboring rays whose wavefront patches overlap, so the same path is counted twice. Understanding these concepts is important for applying ray tracing correctly.
whitenight541
Hi all,

I'm confused about the concept of wavefronts in ray tracing. Is each ray considered a wavefront, or what exactly is a wavefront in ray tracing?

For the reception sphere, it is said that only one ray should be received from an actual path. I don't get it. Does this mean that if a ray is received, and after some further tracing it reflects and reaches the receiver again, it shouldn't contribute to the total received power a second time?

Some papers also describe the double count problem. I don't understand what this problem is about; I think it has something to do with wavefronts (which I'm confused about).

Thanks in advance
 
I'm a little confused by your terms; I don't know what a 'reception sphere' is.

In geometrical optics the rays are normal to the wavefront, but the wavefront itself is usually not something you work with in geometrical optics. Aberrations are treated differently in ray optics vs. wave optics.
 
Each ray represents a "local" plane wave. The wavefront is simply a plane wave that is normal to the ray, with an area defined by the ray tube (which spreads out geometrically as the ray travels).
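To make the ray-tube idea concrete, here is a rough sketch (the ray count and distance are purely illustrative, and it assumes rays launched uniformly over the full sphere): the patch of wavefront carried by each ray has an area of (solid angle per ray) times d squared, so it grows with the square of the distance traveled.

```python
# Rough sketch: area of the wavefront patch ("ray tube" cross-section) carried
# by one ray, assuming num_rays rays launched uniformly over the full sphere.
# The patch area grows as distance**2 -- the geometric spreading mentioned above.
import math

def ray_tube_patch_area(num_rays: int, distance: float) -> float:
    """Approximate wavefront-patch area of one ray at the given range (metres)."""
    solid_angle_per_ray = 4.0 * math.pi / num_rays   # steradians per launched ray
    return solid_angle_per_ray * distance ** 2       # valid for small patches

# Illustrative numbers: 40,962 rays (a common icosphere subdivision) at 100 m
print(ray_tube_patch_area(40962, 100.0))   # ~3.07 m^2 per ray
```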

I am not sure what this reception sphere is, or how you expect a ray to contribute to the total power. If I recall correctly, no ray is used directly for the observables. The rays are used to find the excited surface currents on your scatterer; then you integrate those currents with the dyadic Green's function to find the scattered fields. The direct field is a separate problem, which I guess you could also use a "ray" to figure out, but really you define the excitation at the beginning, so it is known and the direct field is a separate, easier problem.

I can't remember what double counting is; I read about it in the documentation but I can't remember the details.
 
The reception sphere is a technique to determine which rays are actually received by a receiver. A sphere is constructed around the receiver with a radius proportional to the angular separation between launched rays and to the total unfolded distance traveled by the ray. If the ray passes within the sphere, it is received and contributes to the total field at that receiver.
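As a sketch of how I understand the test (the radius formula r = α·d/√3, with α the angular separation between launched rays and d the unfolded path length, is the one I've seen quoted in ray-launching papers; the function names here are just mine):

```python
# Sketch of a reception-sphere test. Assumes the commonly quoted radius
# r = alpha * d / sqrt(3), with alpha the angular separation between launched
# rays (radians) and d the total unfolded path length of the ray.
import numpy as np

def reception_sphere_radius(alpha: float, unfolded_length: float) -> float:
    """Radius of the reception sphere for a ray that has traveled unfolded_length."""
    return alpha * unfolded_length / np.sqrt(3.0)

def ray_is_received(ray_point, ray_dir, unfolded_length, alpha, rx_pos) -> bool:
    """True if the ray's current segment passes within the receiver's sphere.

    ray_point is a point on the ray's straight segment and ray_dir its unit
    direction; the receiver's perpendicular distance to that line is compared
    with the reception-sphere radius."""
    to_rx = np.asarray(rx_pos, dtype=float) - np.asarray(ray_point, dtype=float)
    ray_dir = np.asarray(ray_dir, dtype=float)
    perp = to_rx - np.dot(to_rx, ray_dir) * ray_dir
    return np.linalg.norm(perp) <= reception_sphere_radius(alpha, unfolded_length)

# Illustrative: 1 degree launch separation, 100 m unfolded length,
# receiver 0.5 m off the ray's line -> received (radius is about 1 m)
alpha = np.deg2rad(1.0)
print(ray_is_received([0, 0, 0], [1, 0, 0], 100.0, alpha, [100, 0.5, 0]))
```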

I think I understood the double count problem:

Apart from the ray picture, we can imagine the waves emitted from the source as spherical waves growing in size as they move away from the source; the wavefront in that case is a sphere. If we divide the wavefront at distance r (a sphere) into hexagons, I think each of these hexagons represents the wavefront of one ray. Each ray then has a well-defined wavefront that does not overlap with those of the neighboring rays.

If we return to the reception sphere concept, we construct the sphere about the receiver and say that a ray is received if it passes within that sphere. We can reverse things a little and say that a ray is received if the receiver lies within the wavefront of the ray. The wavefront patch is hexagonal while the reception sphere is obviously spherical; approximating the hexagonal patch by a sphere is what causes the double count problem, since parts of neighboring wavefronts now overlap and the receiver can catch two adjacent rays for the same path.
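If that picture is right, the usual fix I've seen is: when two neighboring rays with the same interaction history (same sequence of walls hit) both land in the reception sphere, keep only the one that passes closest to the receiver. A toy sketch of that filtering (the data structures are just made up for illustration):

```python
# Toy sketch of double-count removal: among received rays that followed the
# same interaction history (same sequence of surface IDs), keep only the ray
# that passed closest to the receiver. All names and fields are illustrative.
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class ReceivedRay:
    path_id: Tuple[int, ...]   # sequence of surfaces the ray interacted with
    miss_distance: float       # perpendicular distance to the receiver (m)
    complex_field: complex     # this ray's field contribution

def remove_double_counts(rays: List[ReceivedRay]) -> List[ReceivedRay]:
    """Keep one ray per physical path: the one closest to the receiver."""
    best: Dict[Tuple[int, ...], ReceivedRay] = {}
    for r in rays:
        kept = best.get(r.path_id)
        if kept is None or r.miss_distance < kept.miss_distance:
            best[r.path_id] = r
    return list(best.values())

# Two adjacent rays caught for the same wall sequence (1, 4): only one is kept,
# so that path is not counted twice in the field sum.
rays = [ReceivedRay((1, 4), 0.30, 1e-3 + 2e-3j),
        ReceivedRay((1, 4), 0.85, 1.1e-3 + 1.9e-3j),
        ReceivedRay((2,), 0.10, 5e-4 - 1e-4j)]
print(sum(r.complex_field for r in remove_double_counts(rays)))
```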

Does this make any sense? :D
 