How do cameras focus on reflected images?

AI Thread Summary
Modern cameras with CMOS sensors can focus on reflected images in mirrors using edge-detection (contrast) autofocus algorithms. These algorithms identify sharp edges in the reflection, allowing the camera to determine focus accurately. The camera perceives light rays diverging from the reflection as if the object were positioned behind the mirror, rather than treating the mirror as a flat plane. In addition, cameras often make an initial rough estimate with an infrared beam before refining focus with edge detection. This combination of techniques enables effective autofocus on reflections.
GeorgeV
I can remember from my distant school physics days that the image in a mirror lies beyond the mirror. But how does a modern camera sensor "know" this? If I face a mirror with my Canon camera, which has a CMOS sensor and a USM lens, it autofocuses correctly on the part of the reflection I select. Why does the sensor not see the mirror as a flat plane of light with varying degrees of intensity? I have done multiple Google searches and can't find an answer.
Any light on this subject appreciated.
 
Automatic cameras use optical edge-detection algorithms. When the edges are sharp, the image is assumed to be in focus.
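For the curious, here is a minimal sketch of that idea (not any manufacturer's actual algorithm): sweep the lens through focus positions, score each frame by how strong its edges are, and keep the position with the highest score. The functions `set_focus_position` and `capture_frame` are hypothetical stand-ins for the camera interface.

```python
import numpy as np

def sharpness(image: np.ndarray) -> float:
    """Sharpness metric: mean squared gradient of a grayscale frame.
    Sharp edges produce large intensity differences between neighbouring
    pixels, so this value peaks when the frame is in focus."""
    gy, gx = np.gradient(image.astype(float))
    return float(np.mean(gx**2 + gy**2))

def contrast_detect_autofocus(set_focus_position, capture_frame, positions):
    """Sweep the lens through candidate focus positions and return the one
    whose frame has the highest sharpness score (contrast-detect AF)."""
    best_pos, best_score = None, -1.0
    for pos in positions:
        set_focus_position(pos)   # hypothetical lens-control hook
        frame = capture_frame()   # hypothetical grayscale capture hook
        score = sharpness(frame)
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos
```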
 
GeorgeV said:
Why does the sensor not see the mirror as a flat plane of light with varying degrees of intensity?

The same reason we don't see the mirror as a flat plane of light. The light rays (or waves) arriving at the camera lens or our eye diverge in exactly the same way as if we had removed the mirror and placed a (spatially-reversed) copy of the object in the space behind the mirror.
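To put a number on that: for a plane mirror the virtual image sits as far behind the mirror as the object sits in front of it, so the camera has to focus at the camera-to-mirror distance plus the mirror-to-object distance. A small illustrative sketch using the thin-lens equation (the distances are made up for the example):

```python
# Plane-mirror focusing: the camera "sees" a virtual image located the
# object-to-mirror distance *behind* the mirror, so the effective subject
# distance is the sum of the two legs. Distances in metres, illustrative only.
camera_to_mirror = 1.5
object_to_mirror = 1.0
subject_distance = camera_to_mirror + object_to_mirror   # 2.5 m, not 1.5 m

# Thin-lens equation 1/f = 1/d_o + 1/d_i gives the lens-to-sensor distance
# the autofocus drive must set for a lens of focal length f.
f = 0.050                                                 # 50 mm lens
d_i = 1.0 / (1.0 / f - 1.0 / subject_distance)
print(f"focus as if the subject were {subject_distance} m away "
      f"(image plane {d_i * 1000:.2f} mm behind the lens)")
```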
 
Doug Huffman said:
Automatic cameras use optical edge-detection algorithms. When the edges are sharp, the image is assumed to be in focus.
Makes sense. Thanks.
 
Doug Huffman said:
Automatic cameras use optical edge-detection algorithms. When the edges are sharp, the image is assumed to be in focus.

Edge detection, to my understanding, is actually the "fine-tuning" step. Cameras often make a quick-and-dirty approximation with an infrared beam and then follow it up with edge detection.
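A hedged sketch of that two-step idea: an active step (for example an IR rangefinder) gives a coarse distance estimate, then a narrow contrast sweep around it does the fine-tuning. Here `estimate_range_ir`, `set_focus_position`, and `capture_frame` are hypothetical hooks, and the sharpness metric is the same mean-squared-gradient idea as in the earlier sketch.

```python
import numpy as np

def sharpness(image: np.ndarray) -> float:
    """Mean squared gradient; peaks when edges are sharpest."""
    gy, gx = np.gradient(image.astype(float))
    return float(np.mean(gx**2 + gy**2))

def two_step_autofocus(estimate_range_ir, set_focus_position, capture_frame,
                       span=0.2, steps=11):
    """Coarse-to-fine autofocus sketch.

    1. Active step: a quick range estimate (e.g. from an IR rangefinder)
       gives an approximate focus distance.
    2. Passive step: sweep a narrow band of focus positions around that
       estimate and keep the one with the highest edge sharpness.
    """
    coarse = estimate_range_ir()                      # rough distance guess
    candidates = np.linspace(coarse * (1 - span), coarse * (1 + span), steps)
    best_pos, best_score = coarse, -1.0
    for pos in candidates:
        set_focus_position(pos)
        score = sharpness(capture_frame())
        if score > best_score:
            best_pos, best_score = pos, score
    set_focus_position(best_pos)
    return best_pos
```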
 