Why is visible light invisible?

Hi,

You will see white light if you are looking at the light source: the white arrow would be shooting into your eye. But if you are looking at the floor, you will see blue, because the blue arrows are shooting into your eye. If there is no air or dust to scatter any of the light and you look elsewhere (not at the source and not at the floor), you will see black.

You cannot see a beam of light from the side unless it scatters. Suppose I shoot a laser across the room: it is invisible to you unless there is some dust that scatters the light into different directions (some of which end up headed toward your eye), or unless I shoot the laser directly into your eye. That is why when people use a laser pointer, you don't see a beam of light like a Jedi lightsaber; you only see the dot where it hits the wall.
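To put the scattering point in concrete terms, here is a toy Python sketch (the numbers and the uniform-scattering assumption are purely illustrative, not a real dust model): a sideways observer only "sees" the photons that both hit a dust grain and happen to be redirected into the tiny solid angle of the eye.

```python
import random

def photons_seen_from_side(n_photons=1_000_000, scatter_prob=0.0,
                           eye_fraction=1e-4, seed=0):
    """Toy model: each photon travels along the beam; it can only be seen
    from the side if it scatters off something AND its new, random
    direction happens to point into the small solid angle of the eye."""
    rng = random.Random(seed)
    seen = 0
    for _ in range(n_photons):
        if rng.random() < scatter_prob:        # hits a dust grain at all?
            if rng.random() < eye_fraction:    # redirected toward the eye?
                seen += 1
    return seen

for p in (0.0, 0.01, 0.5):
    print(f"scatter probability {p}: {photons_seen_from_side(scatter_prob=p)} "
          "photons reach a sideways observer")
```

With scatter probability 0 (a clean room or a vacuum) nothing at all reaches the sideways observer, which is why the beam itself is invisible and only the dot on the wall is seen.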
 
Thanks, that's helpful, but if reflected or incident rays of light shining directly into our eyes are what allow us to see, then how does the object/light source appear to be at a distance? I know two eyes allow us to perceive depth, but why does the space in between appear void of anything visible if a constant stream of light is reflecting straight into our eyes?
 
You don't need two eyes to perceive depth. You could have just one eye: when you move sideways, things that are far away appear to barely move, while nearby things shift a lot.
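As a rough worked example of that motion-parallax cue (the distances and the 10 cm head movement are just illustrative numbers), the apparent angular shift of an object falls off roughly as 1/distance:

```python
import math

def parallax_shift_deg(head_move_m, distance_m):
    """Angular shift of an object when the viewer moves sideways by
    head_move_m, for an object distance_m away (shift ~ baseline/distance)."""
    return math.degrees(math.atan2(head_move_m, distance_m))

for d in (0.5, 2, 10, 100, 1000):  # metres
    print(f"object {d:>6} m away shifts by {parallax_shift_deg(0.1, d):8.4f} deg "
          "for a 10 cm sideways head movement")
```

The half-metre object swings through more than ten degrees while the kilometre-away one barely moves at all, and that difference is the one-eyed depth cue.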

Some animals cannot comprehend a 2D projection of the 3D world; they cannot understand photographs. It is the human brain that lets you perceive depth when you watch TV (light of different colors displayed on a flat surface). Some animals would not understand it.

Actually I don't quite understand what you are asking. :smile:
 
Spenakis said:
I know two eyes allow us to perceive depth, but why does the space in between appear void of anything visible if a constant stream of light is reflecting straight into our eyes?
What space? As said, if a stream of photons (light) is hitting your eyes, you see them.
 
Spenakis said:
Why can we see the blue floor but not the incident and reflected rays of light?
Because the blue floor absorbs the other colors from the light, reflecting only blue light. If the floor reflected some of the other colors as well, it would appear somewhat like a "blue tinted" mirror. If the light shone at the floor didn't include any "blue" light, for example "red" or "green" light (near-monochromatic, or at least not a blend of colors), the blue floor would appear black.
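A minimal sketch of that idea in RGB terms (the reflectance numbers are made up for illustration): what you see is the incoming light multiplied, channel by channel, by what the surface reflects.

```python
def reflected_rgb(light_rgb, reflectance_rgb):
    """Channel-by-channel product: the floor can only send back the part of
    each colour that both arrives and gets reflected."""
    return tuple(l * r for l, r in zip(light_rgb, reflectance_rgb))

white_light = (1.0, 1.0, 1.0)
red_light   = (1.0, 0.0, 0.0)
blue_floor  = (0.0, 0.0, 0.9)   # reflects only blue (illustrative value)

print(reflected_rgb(white_light, blue_floor))  # (0.0, 0.0, 0.9) -> looks blue
print(reflected_rgb(red_light, blue_floor))    # (0.0, 0.0, 0.0) -> looks black
```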

Reflection of light is "subtractive": colors are removed during the reflection process. Light itself, such as the light from a CRT, is additive: the colors are combined and stimulate all 3 or 4 color receptors in the eye (most humans have 3, some have 4), so the combination appears the same as a single color that would stimulate the receptors in the same ratio (except that some perceived colors can't be produced by a single frequency of light).

http://en.wikipedia.org/wiki/Color
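And a companion sketch for the additive case described above (again with illustrative 0..1 display-style values): light from different sources simply adds, so red plus green stimulates the eye's receptors in roughly the same ratio as a single yellow, even though no "yellow" wavelength is present.

```python
def add_lights(*rgb_sources):
    """Additive mixing: light arriving from several sources adds up,
    clipped here to a 0..1 display-style range."""
    return tuple(min(1.0, sum(chan)) for chan in zip(*rgb_sources))

red   = (1.0, 0.0, 0.0)
green = (0.0, 1.0, 0.0)
blue  = (0.0, 0.0, 1.0)

print(add_lights(red, green))        # (1.0, 1.0, 0.0) -> perceived as yellow
print(add_lights(red, green, blue))  # (1.0, 1.0, 1.0) -> perceived as white
```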
 