Is perspective distortion mathematically modeled?

In summary, the conversation discussed perspective distortion in vision and cameras. Using lenses of different focal lengths was mentioned as a way to control this distortion, but some argue that only the distance between the camera and the subject affects it. The original poster wanted to know whether there is a mathematical model for perspective distortion and its relation to focal length. The response explained the concept of image magnification and how it is affected by object distance, and noted that telecentric lenses have no parallax error. The original poster then clarified their confusion and their need for a mathematical form for perspective. However, it was suggested that perspective is a qualitative phenomenon and that it may be more useful to focus on calculating the distance to an object using methods such as photogrammetry.
  • #1
chastiell
Hi guys!

I'm trying to understand the perspective distortion that can be observed in our vision and in cameras. Many photography pages on the web say that perspective distortion can be controlled by using lenses of different focal lengths, while a smaller group of pages states that focal length has no effect on perspective, insisting that the only factor that can modify this effect is the distance between the camera and the subject. Here is one such page:
http://photography-on-the.net/forum/showthread.php?t=672913
Its arguments are quite good, but like most photography pages it doesn't present any mathematical model that could settle future discussions on the topic. So I'm asking whether there is a mathematical model of perspective distortion, or any non-paraxial model of camera lenses and ray tracing, that can explain perspective distortion and its relation to focal length.

Thanks in advance for your answers
 
  • #2
At its most basic level, this is just a matter of image magnification, which for a simple lens is given by image distance / object distance. Using an example from the article you linked, say you are taking someone's portrait. If you are up close, the object distance of the ears is significantly larger than the object distance of the nose on a relative (percentage) basis, so the size of the nose relative to the ears is exaggerated. Using a longer focal length lens will increase the overall magnification but will not change the relative magnification of the nose and ears if you shoot from the same location.

If you step back, the object distances of the nose and ears become more similar on a relative (percentage) basis, so the nose is no longer exaggerated.
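The nose/ears argument above can be made numeric with a quick pinhole-style sketch. A minimal example, where the 0.10 m nose-to-ears depth is an assumed illustrative value:

```python
# Pinhole-camera sketch of why perspective depends only on camera-to-subject
# distance: apparent (image) size of a feature scales as 1 / object distance,
# so the nose-to-ears size ratio is set entirely by where the camera stands.

NOSE_TO_EARS_DEPTH = 0.10  # metres, assumed for illustration

def relative_exaggeration(camera_to_nose: float) -> float:
    """Ratio of the nose's magnification to the ears' magnification."""
    d_nose = camera_to_nose
    d_ears = camera_to_nose + NOSE_TO_EARS_DEPTH
    return d_ears / d_nose  # magnification ~ 1/d, so the ratio inverts

for d in (0.5, 1.0, 3.0):
    print(f"camera at {d:.1f} m: nose magnified {relative_exaggeration(d):.3f}x "
          f"relative to the ears")
```

Stepping back from 0.5 m to 3 m shrinks the exaggeration from 20% to about 3%, independent of whatever focal length is used to fill the frame.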
 
  • #3
I probably didn't mention this in my post, but I want to write a program about this (for triangulation purposes in the real world, which is why I don't want paraxial models for camera lenses). To achieve that I need equations, not just words or arguments like those shown in my link.
 
  • #4
Your original post seemed to imply some confusion about the fundamental cause of perspective distortion and what influences it, hence my response. Now you are saying that you don't need words but want to write a program about "it" and calculate something "for triangulation purposes", and it's not clear to me what you mean by that. What exactly do you want to calculate?
 
  • #5
Sorry for the confusion, and thanks for your answer. My real problem is perspective: I understand that it's affected by distance, but I don't know how. I was hoping there is a known mathematical relation between magnification and object distance with good predictive power, but I can't find one with those features. I did derive a relation earlier using the pinhole camera model, but its predictions are not correct, at least not for distant objects. In general, what I need is a mathematical form for perspective, or a non-paraxial model for lenses from which such a relation could be derived.
 
  • #6
chastiell said:
Sorry for the confusion, and thanks for your answer. My real problem is perspective: I understand that it's affected by distance, but I don't know how. I was hoping there is a known mathematical relation between magnification and object distance with good predictive power, but I can't find one with those features. I did derive a relation earlier using the pinhole camera model, but its predictions are not correct, at least not for distant objects. In general, what I need is a mathematical form for perspective, or a non-paraxial model for lenses from which such a relation could be derived.

Unfortunately, it's not such a simple relation. The problem you are describing is basically what is referred to as "parallax error". Telecentric lenses have no parallax, and as a result their image magnification is independent of object distance:

https://www.edmundoptics.com/resources/application-notes/imaging/advantages-of-telecentricity/

Telecentric lenses place the entrance pupil at infinity to achieve this. Your eye (and most lenses) do not, and so there is parallax error/perspective distortion. I figured it would be easy to find a calculation that computes the distortion relative to human vision based on the focal length, but no luck. I had some success searching the photogrammetry literature, but nothing tidy.

https://www.robots.ox.ac.uk/~vgg/publications/1999/Criminisi99b/criminisi99b.pdf
https://www.rgs.org/NR/rdonlyres/C30C5A86-5364-49FE-9A58-58EE27315F9C/0/Chapter9GeocorrectionandPhotogrammetry.pdf
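To make the contrast concrete, here's a small sketch (my own, not from the links above) comparing an ordinary thin lens, whose magnification m = f / (d − f) follows from 1/f = 1/d_o + 1/d_i and falls off with object distance, against an object-space telecentric lens, whose magnification is fixed by design. The 50 mm focal length and the 0.5x telecentric magnification are assumed example values:

```python
# Ordinary (entocentric) thin lens: magnification varies with object distance.
# Telecentric lens: magnification is a design constant, independent of distance.

def thin_lens_magnification(f: float, d: float) -> float:
    """|m| for a thin lens: m = f / (d - f), valid for object distance d > f."""
    return f / (d - f)

def telecentric_magnification(d: float) -> float:
    """Telecentric system: magnification does not depend on d (assumed 0.5x)."""
    return 0.5

f = 0.05  # 50 mm focal length, assumed for illustration
for d in (0.5, 1.0, 2.0):
    print(f"d = {d:.1f} m: thin lens m = {thin_lens_magnification(f, d):.4f}, "
          f"telecentric m = {telecentric_magnification(d):.4f}")
```

The thin-lens column shrinks as d grows (the source of perspective/parallax effects), while the telecentric column stays flat.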
 
  • #7
chastiell said:
in general what I need is a mathematical form for perspective

As far as I know, "perspective" is not a quantitative measurement; it is a qualitative phenomenon. If you need a mathematical statement about how to calculate something, what specifically do you wish to calculate? If your goal is to calculate the distance to an object, what data do you have about that object? One image? A series of images? Are you dealing with objects across the street, or with galaxies that are light-years away?
 
  • #8
Stephen Tashi said:
As far as I know, "perspective" is not a quantitative measurement; it is a qualitative phenomenon. If you need a mathematical statement about how to calculate something, what specifically do you wish to calculate? If your goal is to calculate the distance to an object, what data do you have about that object? One image? A series of images? Are you dealing with objects across the street, or with galaxies that are light-years away?

It can be made quantitative (photogrammetry).
 
  • #9
Andy Resnick said:
Telecentric lenses place the entrance pupil at infinity to achieve this. Your eye (and most lenses) do not, and so there is parallax error/perspective distortion. I figured it would be easy to find a calculation that computes the distortion relative to human vision based on the focal length, but no luck. I had some success searching the photogrammetry literature, but nothing tidy.

Thank you for the answer and the links. Such a relation seems not to exist :/ , or at least it isn't as simple as I thought.
 
  • #10
Stephen Tashi said:
As far as I know, "perspective" is not a quantitative measurement; it is a qualitative phenomenon. If you need a mathematical statement about how to calculate something, what specifically do you wish to calculate? If your goal is to calculate the distance to an object, what data do you have about that object? One image? A series of images? Are you dealing with objects across the street, or with galaxies that are light-years away?

Perspective distortion is a change in the apparent size of objects with distance. Size is a quantitative measurement, and so are apparent size and distance; the phenomenon that relates those things cannot be a qualitative one.
 
  • #11
chastiell said:
Thank you for the answer and the links. Such a relation seems not to exist :/ , or at least it isn't as simple as I thought.

Yeah, I burrowed down the rabbit hole for a bit; here's what I learned:

Perspective distortion is governed by the angle of the principal ray (U) in image space: for telecentric systems, the principal ray emerges parallel to the optical axis (U = 0), and thus there is no variation of magnification with image height. In the human eye, for comparison, the principal ray emerges (I think) at about a 5-degree maximum angle. Lenses with a smaller U will have less perspective distortion than human eyes; lenses with a larger U will have more. It seems that longer focal length lenses have smaller values of U than short focal length lenses; unfortunately, Smith's "Modern Lens Design" doesn't provide values of U.

However, there is no fixed relationship between U and focal length, nor is there a clear relationship between U and the location of the aperture stop. Certainly, if you have a design you can ray trace to determine U (or adjust the design to produce a particular value of U), but I couldn't find a 'coarse-grained' relationship.

And it gets worse: shifting the lens also alters U, which is one reason tilt-shift (perspective control) lenses are used. Again, if you have the design you can compute U, or if you have the lens you can measure U in the lab.

Bleah...
 
Last edited:
  • #12
chastiell said:
Perspective distortion is a change in the apparent size of objects with distance. Size is a quantitative measurement, and so are apparent size and distance; the phenomenon that relates those things cannot be a qualitative one.

Then, since you're asking about mathematical models, can you give a quantitative definition of "perspective distortion"?
 
  • #13
Stephen Tashi said:
Then, since you're asking about mathematical models, can you give a quantitative definition of "perspective distortion"?

The principal ray angle U. If U = 0 the lens is telecentric and angular magnification does not vary with object distance. If you like, I hereby define the amount of perspective distortion P = U.
 

1. What is perspective distortion?

Perspective distortion is the apparent change in the size, shape, or proportions of an object that depends on the vantage point from which it is viewed or photographed. It can make objects appear larger or smaller, or closer or farther away, than they really are.

2. How is perspective distortion mathematically modeled?

Perspective distortion is commonly modeled with a projective transformation: a matrix, applied in homogeneous coordinates, that accounts for the distance between the viewer and the object, the angle of the viewer's line of sight, and the position of the object within the frame. This matrix can then be applied to the image to correct for perspective distortion.
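A minimal sketch of such a model is the pinhole projection matrix: a 3×4 matrix maps a 3D point (X, Y, Z, 1) in homogeneous coordinates to image coordinates (f·X/Z, f·Y/Z) after dividing by the third component. The focal length f = 1.0 and the test points below are illustrative assumptions:

```python
# Pinhole projection in homogeneous coordinates: apply
# P = [[f, 0, 0, 0], [0, f, 0, 0], [0, 0, 1, 0]] and dehomogenize.

def project(point3d, f=1.0):
    """Project a 3D point (X, Y, Z) to 2D image coordinates (f*X/Z, f*Y/Z)."""
    X, Y, Z = point3d
    # Matrix-vector product written out: u = f*X, v = f*Y, w = Z
    u, v, w = f * X, f * Y, Z
    return (u / w, v / w)  # perspective divide

# Two points at equal off-axis offset but different depth: the farther one
# projects smaller -- this is perspective distortion in matrix form.
print(project((1.0, 1.0, 2.0)))   # -> (0.5, 0.5)
print(project((1.0, 1.0, 10.0)))  # -> (0.1, 0.1)
```

The division by Z is what makes the mapping non-linear in depth, which is exactly the distance dependence discussed in the thread above.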

3. Why is it important to mathematically model perspective distortion?

Mathematically modeling perspective distortion allows for accurate correction of distortions in images, making them appear more realistic and true to life. It is especially important in fields such as computer graphics and photography, where precision and accuracy are crucial.

4. Can perspective distortion be completely eliminated?

No, perspective distortion cannot be completely eliminated as it is a natural phenomenon that occurs when viewing objects from different angles. However, it can be minimized and corrected through mathematical modeling and techniques such as using different lenses or adjusting the position of the camera.

5. How does perspective distortion affect visual perception?

Perspective distortion can affect visual perception by altering the perceived size, shape, and distance of objects in an image. It can also create a sense of depth and dimension, adding to the overall composition and impact of the image.
