Dimensions and relative magnitude

AI Thread Summary
Objects appear smaller at greater distances due to the angles formed between the viewer's eyes and the object's edges, affecting how much of the field of view they occupy. This phenomenon is distinct from Lorentz contraction, which pertains to objects moving close to the speed of light and is unrelated to size perception based on distance. The brain interprets size based on angular information received from photons, which can be misleading without contextual references. Without nearby objects for comparison, it is challenging to discern an object's true size. Ultimately, the perception of size is rooted in basic three-dimensional physics and the brain's processing of visual information.
Nano-Passion
Why is it that things appear smaller the farther away they are and bigger the closer they are? It is almost as if the magnitude of their dimensions (length, width, and height) changes, but in reality it is still the same object.

My cup, a distance x closer to me than the door, may appear one-sixth the size of the door itself. It is not as if the object actually got bigger, but it seems that way.

Why do things seem bigger or smaller at different distances through space? It is interesting because things also seem to change size close to the speed of light (Lorentz contraction).
 
The two effects are not related. The difference in size due to distance is simply caused by angles. Drawing two lines from your eye to opposite sides of an object, you can see that the closer an object is, the greater the angle between the lines, meaning it takes up more of your field of view.

Lorentz contraction is the result of an object not being able to travel faster than c. I can't explain it very well, but I'm sure that a search on Lorentz contraction here on the forums would yield plenty of good results.
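Numerically, the angle argument looks like this (the door height and the distances below are made-up illustration values, not anything from the thread): an object of height h seen face-on from distance d subtends a full angle of 2·arctan(h/(2d)), which falls off roughly as 1/d once d is a few times larger than h.

```python
import math

def angular_size_deg(height_m, distance_m):
    """Full angle, in degrees, subtended by an object of the given
    height seen face-on from the given distance."""
    return math.degrees(2 * math.atan(height_m / (2 * distance_m)))

# Illustration: a 2 m door viewed from progressively larger distances.
for d in (1, 2, 4, 8, 16):
    print(f"door at {d:2d} m -> {angular_size_deg(2.0, d):6.2f} deg")
# Once the door is a few metres away, the subtended angle roughly
# halves each time the distance doubles: the door covers half as much
# of the field of view, even though its physical size never changes.
```

For contrast, Lorentz contraction goes as L = L0 * sqrt(1 - v^2/c^2): it depends on the object's speed relative to the observer, not on how far away the observer stands, which is why the two effects are unrelated.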
 
Drakkith said:
The two effects are not related. The difference in size due to distance is simply caused by angles. Drawing two lines from your eye to opposite sides of an object, you can see that the closer an object is, the greater the angle between the lines, meaning it takes up more of your field of view.

Lorentz contraction is the result of an object not being able to travel faster than c. I can't explain it very well, but I'm sure that a search on Lorentz contraction here on the forums would yield plenty of good results.

I'm aware of the triangle-and-angle explanation, but it does not satisfy me. It feels too simple; there might be more to it.

For example, what causes you to interpret different sizes? First you must see the objects, and for that your neurons have to register photons. But how do photons arriving from different angles cause the object to appear larger or smaller?

Is this just simple three-dimensional physics, or are there other implications of it...
 
The apparent linear size of an object is judged relative to the background or other objects nearby.
Your eye only provides information about the angular size of the object. When there is no reference around, you cannot tell whether you are seeing a big object far away or a small one much closer.
So there is no direct information about the linear size of objects in the raw signal from the eyes, and even less in the photons themselves. It's all in the software (and the training of it).
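A small sketch of that ambiguity (the cup and door sizes below are arbitrary illustration values): two very different objects can subtend exactly the same angle, so the angular information alone cannot pin down the linear size.

```python
import math

def angular_size_deg(height_m, distance_m):
    """Full angle, in degrees, subtended by an object of the given
    height seen face-on from the given distance."""
    return math.degrees(2 * math.atan(height_m / (2 * distance_m)))

# A 0.1 m cup at 0.5 m and a 2 m door at 10 m have the same
# height-to-distance ratio, so they subtend the same angle ...
print(angular_size_deg(0.1, 0.5))   # ~11.42 deg
print(angular_size_deg(2.0, 10.0))  # ~11.42 deg
# ... and from the angle alone the eye cannot tell them apart.
# Recovering linear size takes extra cues: nearby reference objects,
# familiarity with typical sizes, stereo vision at short range, etc.
```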
 
Is this just simple three-dimensional physics, or are there other implications of it...

I would say simple 3D physics.
 