Hi, everyone! To be honest, I don't know whether this is a math question, or what field of science or math can help me understand it, so this is my first attempt at asking others about it. Here it goes.

For a while now, I've been trying to understand visual perspective. Things look smaller as they move away, that's true, but at what rate do they appear to shrink? One thing I felt was necessary was establishing a baseline, such as holding a 1-meter-long object 1 meter away from my eyes. From there, you could say that, at d meters beyond the baseline (d > 0), the 1-meter object appears to be l meters long, where l < 1. The only problem on paper is that I realize I have no way of establishing visual angles. If I draw out my baseline, I can't accurately determine how much of my field of vision a 1-meter object takes up. 20°, 30°, 40°? I don't know.

One thing that comes out of this is the idea of apparent length: how long something looks at a certain distance. For example, if you look at the sun (with PROPER radiation-blocking equipment!), it appears to be a very tiny circle even though it is gargantuan. But if I hold a penny in front of my face, 2 feet away, how much smaller does the sun look than the penny?

I don't really know whether this has any function in everyday life, but it seems to have some astronomical application that may already be in use; otherwise, we couldn't determine the incredibly massive sizes of stars that are nearly as large as our solar system. That's why I'm wondering whether this belongs in physics or astronomy as well.

Does anyone have an idea what I'm talking about? Surely someone has attempted such concepts before me. If I need to draw any of this out on paper, let me know.
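For what it's worth, the quantity being described here is what astronomers call the angular diameter: an object of physical size s seen face-on at distance d subtends an angle θ = 2·arctan(s / 2d). Here is a minimal Python sketch of the sun-versus-penny comparison; the solar figures (diameter ≈ 1.392 × 10⁹ m at ≈ 1 AU) and the penny size (a US penny, 19.05 mm) are standard values assumed for illustration:

```python
import math

def angular_size_deg(size_m, distance_m):
    """Full angular diameter, in degrees, of an object of physical
    size `size_m` seen face-on at distance `distance_m`."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

# Baseline from the post: a 1 m object held 1 m from the eye.
baseline = angular_size_deg(1.0, 1.0)       # ~53.13 degrees

# Sun: diameter ~1.392e9 m at ~1.496e11 m (1 AU).
sun = angular_size_deg(1.392e9, 1.496e11)   # ~0.53 degrees

# US penny: 19.05 mm diameter held 2 ft (0.6096 m) away.
penny = angular_size_deg(0.01905, 0.6096)   # ~1.79 degrees

print(baseline, sun, penny, penny / sun)    # penny looks ~3.4x wider than the sun
```

For small angles, arctan(x) ≈ x, so θ ≈ s/d: apparent size falls off roughly as 1/distance, which is one answer to the "rate of shrinking" question above.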