colinven
This thought came from staring at telephone poles and light posts along roads and walkways. I noticed that these objects all had more or less the same height and were regularly spaced. The poles appeared to reduce in height (following some mathematical relationship) as the distance between me and the object increased. When I first thought about this, I would hold my fingers up close to my face and, with my pointer finger and thumb, encase the height of a telephone pole between my two fingers. The height between my two fingers might have been only a few centimeters (this is the height that the telephone pole seemed to be from my distance to it), but the real height of the telephone pole was much larger. I thought: how could the perceived height and real height be related? Is there some way in which I could simply encase any object at some distance and, by measuring the height between my fingers, know the real height of the object encased?
What I found was a simple ratio: [itex]\frac{h}{l}[/itex]=[itex]\frac{H}{L}[/itex]
where h is the height between my fingers or the perceived height, l is the distance my fingers are from my eye, H is the real height of the object, and L is the distance from my fingers to the object.
Using this formula I only had to know my distance to the object, the height between my two fingers and how far my fingers were from my face in order to know the real height of any object.
The perceived height of telephone poles as they fade into the distance is proportional to [itex]1/L[/itex]; in other words, the perceived height is inversely proportional to your distance from each pole. This is true if you are standing directly in line with the poles; however, we know from experience that if you are standing directly in line you will not observe this telescoping series. You have to stand some distance off the line of telephone poles in order to observe this phenomenon. To account for this I found the following equation using simple geometry for right triangles: [itex]h[/itex]=[itex]\frac{Hl}{\sqrt{n^{2}d^{2}+L^{2}}}[/itex]
The above equation assumes you are standing parallel to the line of telephone poles and directly across from the first one in your series. Thus, L is the perpendicular distance to the first telephone pole, d is the pole spacing, and n is the index of the telephone pole in the series (the first telephone pole is n=0, the second n=1, ...).
The whole denominator in the above equation acts as a way to indirectly measure the distance to each successive telephone pole using the Pythagorean theorem. The above equation is bounded: at n = 0 it reduces to the first equation, h → [itex]\frac{Hl}{L}[/itex], and even as L → 0 the perceived height of any pole with n ≥ 1 stays finite at [itex]\frac{Hl}{nd}[/itex]. The first equation, by contrast, has no bound: as L → 0, h → ∞. In the other direction, as L → ∞, [itex]\sqrt{n^{2}d^{2}+L^{2}}[/itex] approximates L and the difference between the two equations becomes negligible.
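The behavior in both limits can be checked numerically. A minimal sketch (the function name and the sample values H, l, L, d are my own illustrative choices, not from any measurement):

```python
import math

def perceived_height(H, l, L, d, n):
    """Perceived height of the n-th pole (n = 0 is the pole directly
    across from you), for poles of real height H spaced d apart,
    viewed with fingers a distance l from the eye while standing a
    perpendicular distance L from the first pole."""
    return H * l / math.sqrt(n**2 * d**2 + L**2)

H, l, L, d = 10.0, 0.6, 30.0, 50.0  # illustrative values, meters

# n = 0 recovers the simple ratio h = H*l/L:
print(perceived_height(H, l, L, d, 0), H * l / L)

# For large n, the n*d term dominates L, so h ≈ H*l/(n*d):
# the perceived heights fall off inversely with distance along the line.
print(perceived_height(H, l, L, d, 1000))
print(H * l / (1000 * d))
```

The last two printed values agree to many digits, which is just the statement that [itex]\sqrt{n^{2}d^{2}+L^{2}}[/itex] ≈ nd once nd ≫ L.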
I offer this as a start to a discussion on how our perception forms the space around us. Is it possible that by our observation we impose restrictions on the allowed heights of an object? Does our observation create a potential curve like [itex]1/L[/itex]? Where does distance come from: from us, or is it a priori?