Why do shadows become darker when light intensity increases?

SUMMARY

The discussion centers on the phenomenon of shadow darkening with increased light intensity, specifically when using a torch. Participants clarify that while the shadow appears darker, this is due to increased contrast rather than an actual increase in shadow darkness. The brain interprets the heightened contrast as a darker shadow, influenced by the visual cortex's adaptation to varying light levels. The conversation also touches on how cameras replicate this effect, with exposure settings impacting perceived brightness and shadow detail.

PREREQUISITES
  • Understanding of basic optics and light behavior
  • Familiarity with human visual perception and the role of the brain in interpreting light
  • Knowledge of camera exposure settings and image processing
  • Basic principles of contrast and brightness in visual systems
NEXT STEPS
  • Research the principles of light and shadow in physics, focusing on light intensity and contrast
  • Explore human visual perception, particularly how the brain processes light and shadow
  • Learn about camera exposure settings and how they affect image brightness and shadow detail
  • Investigate the use of photodiodes and light meters for measuring light intensity objectively
USEFUL FOR

This discussion is beneficial for physicists, photographers, visual artists, and anyone interested in the interplay between light, shadow, and human perception.

stalin
I observed the following phenomenon today: in my room, with some lights on, there is a stuffed toy placed in front of a wall. The ambient lighting does not cast any visible shadow on the wall. Then I switched on a torch, pointed it at the toy, and a shadow appeared on the wall. When I increased the intensity of the torch light falling on the toy, the shadow on the wall became darker. However, I did not observe this when I switched off all the lights in my room and repeated the experiment.



As far as I understand, the directed light from the torch tries to cut off the ambient light on the wall: the more intense the torch light, the more ambient light gets cut off.

I would be really grateful if somebody could formally explain this phenomenon to me with the physics laws involved, preferably with some links or illustrations on the topic.
 
stalin said:
As far as I understand, the directed light from the torch tries to cut off the ambient light on the wall: the more intense the torch light, the more ambient light gets cut off.
It does not. Just the contrast increases, which our brain perceives as a "darker" shadow. Use a photodiode or some other objective measurement and you'll see the shadow does not get darker.
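One rough way to put numbers on "the contrast increases" (an illustrative back-of-the-envelope, not a full photometric treatment): with only the ambient light, the shadow region and the surrounding wall are both at the ambient luminance $I_a$, so there is nothing to see. Switching on the torch adds $I_t$ everywhere on the wall except inside the toy's shadow, so a Weber-style contrast is

$$C = \frac{I_{\text{surround}} - I_{\text{shadow}}}{I_{\text{surround}}} = \frac{(I_a + I_t) - I_a}{I_a + I_t} = \frac{I_t}{I_a + I_t},$$

which grows toward 1 as the torch intensity $I_t$ increases, even though the shadow itself stays at $I_a$. With all the room lights off ($I_a = 0$) the contrast is already 1 for any torch intensity, which is why the effect disappears in that case.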
 
mfb said:
It does not. Just the contrast increases, which our brain perceives as a "darker" shadow. Use a photodiode or some other objective measurement and you'll see the shadow does not get darker.
I would be grateful if this phenomenon could be explained to me formally and in detail. I mean, why is this happening at all, and why does our brain behave like that?
 
You are really asking about how the visual cortex in our brain operates. AFAIK there is no completely correct, testable model for this problem. From a very, very general perspective: our vision is adapted to be very efficient. It evolved two primary vision systems: color vision for daylight conditions, and 'black and white' low-light (night) vision. There are intermediate light conditions where both can be turned on and functioning: that is what you saw when there was clear vision in both shadow and light.

Have you ever walked out of a very dark room and then been blinded by daylight? This happens because the biochemistry of your eye has to change to the new light level, switching over from using only night vision.

When there is a large disparity in light levels, your eye adjusts to use the bright 'setting' only and turns off the low-light (night) system. Your brain knows the lowest signals are turned off, so it fills in those areas with black. BTW, most of what you see peripherally is the same kind of fill: the brain knows there are few signals to process, so it makes up what was there before. Ever read something on a sign intently, then look up and see that the people or cars in your field of view have instantly changed? Your brain got a new dose of information for what was previously just fill-in.

Think of vision this way: your brain is creating a megapixel image using a slow input channel, so it has to fill in lots of parts of what you see. If this weren't true, magicians would be out of business, for example.

See: https://en.wikipedia.org/wiki/Night_vision
 
A closely related factoid is that sunspots are not really black.

https://van.physics.illinois.edu/QA/listing.php?id=28682&t=are-sunspots-really-black
http://image.gsfc.nasa.gov/poetry/workbook/sunspot.html

NASA site said:
Sunspots are actually several thousand degrees cooler than the 5,770 K surface of the Sun, and contain gases at temperature of 3000 to 4000 K. They are dark only by contrast with the much hotter solar surface. If you were to put a sunspot in the night sky, it would glow brighter than the Full Moon with a crimson-orange color!

 
jim mcnamara said:
You are really asking about how the visual cortex in our brain operates. AFAIK there is no completely correct, testable model for this problem. From a very, very general perspective: our vision is adapted to be very efficient. It evolved two primary vision systems: color vision for daylight conditions, and 'black and white' low-light (night) vision. There are intermediate light conditions where both can be turned on and functioning: that is what you saw when there was clear vision in both shadow and light.

Have you ever walked out of a very dark room and then been blinded by daylight? This happens because the biochemistry of your eye has to change to the new light level, switching over from using only night vision.

When there is a large disparity in light levels, your eye adjusts to use the bright 'setting' only and turns off the low-light (night) system. Your brain knows the lowest signals are turned off, so it fills in those areas with black. BTW, most of what you see peripherally is the same kind of fill: the brain knows there are few signals to process, so it makes up what was there before. Ever read something on a sign intently, then look up and see that the people or cars in your field of view have instantly changed? Your brain got a new dose of information for what was previously just fill-in.

Think of vision this way: your brain is creating a megapixel image using a slow input channel, so it has to fill in lots of parts of what you see. If this weren't true, magicians would be out of business, for example.

See: https://en.wikipedia.org/wiki/Night_vision
What I have understood after reading the other answers as well: "less bright objects surrounded by a brighter area appear darker, and if the brightness of the brighter area increases further, then the less bright objects appear even darker."

Am I right? The same thing also happens when we take photos with a camera. Is it easier to explain this if we consider a camera instead of a human eye?
 
Here are some guessing questions. Which is brighter?

(1)
- the surface of a light bulb (like http://www.dailygreen.de/wp-content/uploads/2009/08/gluehbirne-pixelio.jpg) at night
- a white house in direct sunlight

(2) which circled square in this image?
(3) A or B in this image?

(1) It depends a bit on the parameters, but they are very similar. It does not appear so, because the light bulb surface is by far the brightest object, while the street is darker than many other surfaces seen at the same time.

2+3: they are exactly the same. Use a graphics program to move them next to each other if you don't believe me.

stalin said:
What I have understood after reading the other answers as well: "less bright objects surrounded by a brighter area appear darker, and if the brightness of the brighter area increases further, then the less bright objects appear even darker."
Right.

Cameras are accurate - they record whatever they get (but their exposure time will depend on the light conditions). Humans looking at the pictures face the same problem as before.
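If you want to try the graphics-program check without eyeballing a color picker, here is a minimal sketch in Python with Pillow. The filename and the two pixel coordinates are only placeholders for whichever illusion image and patches you are comparing.

```python
# Minimal sketch (Python + Pillow): sample two patches of a checker-shadow
# style illusion image and compare their pixel values. The filename and the
# pixel coordinates below are placeholders; substitute your own image and points.
from PIL import Image

img = Image.open("checker_shadow.png").convert("RGB")

point_a = (120, 80)    # a pixel inside square "A" (hypothetical coordinates)
point_b = (160, 200)   # a pixel inside square "B" (hypothetical coordinates)

print("A:", img.getpixel(point_a))
print("B:", img.getpixel(point_b))
# If the two squares really are the same shade, the RGB triples match,
# even though the eye insists one looks darker than the other.
```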
 
A single point-like light source makes very sharp shadow edges. Ambient lighting contains a much higher proportion of reflected light arriving from many directions, so the shadow edge is much less sharp; there might not be any visible shadow at all.
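A rough geometric sketch of this, by similar triangles: if the light source has diameter $s$, sits a distance $d_1$ from the object, and the wall is a further distance $d_2$ behind the object, the penumbra (the fuzzy border of the shadow) has a width of roughly

$$ w \approx s\,\frac{d_2}{d_1}. $$

A small torch held well away from the toy gives a narrow penumbra and a crisp shadow, while diffuse ambient light acts like an enormous source, so the "penumbra" can be wider than the shadow itself and no distinct shadow is visible.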
 
mfb said:
Here are some guessing questions. Which is brighter?

(1)
- the surface of a light bulb (like http://www.dailygreen.de/wp-content/uploads/2009/08/gluehbirne-pixelio.jpg) at night
- a white house in direct sunlight

(2) which circled square in this image?
(3) A or B in this image?

(1) It depends a bit on the parameters, but they are very similar. It does not appear so, because the light bulb surface is by far the brightest object, while the street is darker than many other surfaces seen at the same time.

2+3: they are exactly the same. Use a graphics program to move them next to each other if you don't believe me.

Right.

Cameras are accurate - they record whatever they get (but their exposure time will depend on the light conditions). Humans looking at the pictures face the same problem as before.
Well, I used a program to check the pixel intensities of the marked portions in (2) and (3). They were exactly the same! However, these images are artificially created.
If we use a camera to capture the two scenes I am talking about, the shadow appears to get darker even in the images. That means the camera might also be doing something similar to the eye! So can someone explain this phenomenon in terms of a camera (assuming that is easier than the human eye)?
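To make that camera test objective, something like the following sketch (Python with Pillow and NumPy) could compare the same shadow patch across the low-torch and high-torch photos. The filenames and the region box are placeholders, and the photos should be taken with identical manual exposure, because auto exposure rescales the whole image (see the exposure point below).

```python
# Minimal sketch (Python + Pillow + NumPy): compare the average brightness of
# the same shadow patch in two photos, one at low and one at high torch
# intensity. Filenames and the region box are placeholders. For a meaningful
# comparison the photos need identical manual exposure (same shutter speed,
# aperture, and ISO); otherwise the camera rescales the whole image.
import numpy as np
from PIL import Image

def shadow_mean(path, box):
    """Mean grayscale value inside `box` = (left, upper, right, lower)."""
    gray = Image.open(path).convert("L")
    return np.asarray(gray.crop(box), dtype=float).mean()

box = (300, 200, 360, 260)  # a patch lying inside the shadow in both photos
print("low torch :", shadow_mean("torch_low.jpg", box))
print("high torch:", shadow_mean("torch_high.jpg", box))
# With exposure locked, the shadow patch should read about the same in both
# photos; only the lit wall around it gets brighter.
```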
 
stalin said:
If we use a camera to capture the two scenes I am talking about, the shadow appears to get darker even in the images.
Did you check with a program or by eye?

Exposure time of the camera is another thing to consider.
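To illustrate that point, here is a toy calculation (all numbers invented) of what an auto-exposing camera does: the shadow's illuminance is fixed by the ambient light, but its recorded 8-bit value drops when the torch brightens the rest of the frame, because the camera shortens the exposure for the brighter scene.

```python
# Toy model (numbers are invented) of why an auto-exposed photo can show the
# shadow getting "darker": the shadow's actual illuminance never changes, but
# the camera shortens the exposure when the torch brightens the rest of the
# scene, so the shadow's recorded 8-bit value drops.
ambient = 50.0            # illuminance of the shadow region (arbitrary units)

for torch in (100.0, 400.0):
    brightest = ambient + torch          # lit wall next to the shadow
    exposure = 255.0 / brightest         # auto exposure maps the brightest area to 255
    shadow_pixel = ambient * exposure    # recorded value of the shadow region
    print(f"torch={torch:>5}: shadow pixel = {shadow_pixel:.0f} / 255")
# The shadow drops from ~85 to ~28 even though `ambient` never changed.
```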
 
mfb said:
Did you check with a program or by eye?

Exposure time of the camera is another thing to consider.
I also checked with a program; there is an intensity difference of around ~15 (8-bit images, with intensity ranging from 0 to 255). By the naked eye, one can quite easily observe this.
 
You can validate actual light levels with a light meter (some cell phones have one built into the camera hardware), a camera, or other light-sensing equipment.
You can NOT do this reliably with your eyes. Your brain enhances and plays with images, as do the light-processing cells in your retina.

Example: we can 'see' a lot of colors and intermediate color values. There are three light-sensing pigments in your eyes, so you should be able to see only a limited number of colors, assuming the pigments were the only perceptual mechanism. They are not - the retina and the brain mess around hugely with the raw color input, as they do with all of your visual input data.

BTW - your eyes are actually derived from the same precursor cells as your brain's, and can be thought of as part of the brain. Some models of human vision use that concept to teach vision theory, which is very complex.

Boy, Biology loses something when I try to make it Physics-palatable. Sorry.
 
