The eye and pixels on a television screen

In summary: the eye has only three cone types, and their overlapping responses mean a mixture of red and green light excites the cones in the same way as pure spectral yellow light. Perceived color is constructed from the pattern of cone activity, not read directly from the light's spectrum.
  • #1
entropy1
How is it possible that when one watches a color television screen, one perceives the three base colors of the pixels as if it were a mix tone of these base frequencies, proportional to the relative intensity of the different pixels? It puzzles me! Hope someone knows how this is possible.
 
  • #2
Your eye doesn't see the visible spectrum, at least not in the sense that your ear hears the auditory spectrum. Your ear has tens of thousands of tiny hairs that vibrate in response to a fairly narrow part of the sound spectrum. Your eye has three kinds of cones. While the three types of cones do have distinct peak response areas (roughly, red, green, and blue), the frequency responses overlap a great deal. As a result, we don't see colors to anywhere close to the fidelity that we hear sound frequencies.

Example: Aim a yellow laser at a white wall and you will see a yellow spot on the wall. Even though our eyes don't have a detector for "yellow" light, we see "yellow" because the yellow light excites both the cones that detect low frequency (reddish) and mid frequency (greenish) light. The exact same response can be elicited by instead aiming a red laser and a green laser at the wall. The reflected light looks "yellow" to us even though there is no yellow component in that reflected light.
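This metamerism can be sketched numerically. The cone sensitivity curves below are crude Gaussian stand-ins with assumed peak wavelengths (565 nm for L, 535 nm for M) and width, not measured data; the point is only that a red+green mixture can reproduce the exact cone response of a single yellow wavelength.

```python
import numpy as np

# Crude Gaussian models of the L ("red") and M ("green") cone
# sensitivities; peaks and width are illustrative assumptions.
def cone(wavelength_nm, peak_nm, width_nm=50.0):
    return np.exp(-((wavelength_nm - peak_nm) / width_nm) ** 2)

def lm_response(wavelength_nm):
    """(L, M) cone excitation for a monochromatic light."""
    return np.array([cone(wavelength_nm, 565.0),   # L cone
                     cone(wavelength_nm, 535.0)])  # M cone

yellow = lm_response(580.0)                # a single "yellow" wavelength
red, green = lm_response(650.0), lm_response(532.0)

# Solve for the red/green intensities that reproduce the yellow response.
weights = np.linalg.solve(np.column_stack([red, green]), yellow)
mixture = weights[0] * red + weights[1] * green

print(weights)                        # both positive: physically realizable
print(np.allclose(mixture, yellow))   # True: the cones cannot tell them apart
```

Since the cones are the only wavelength detectors we have, two stimuli that produce identical cone responses are, to the visual system, the same color.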
 
  • #3
D H said:
[..]
The reflected light looks "yellow" to us even though there is no yellow component in that reflected light.

So if I understand you correctly, yellow light excites both the red and green cones, just as a combination of red light and green light does, while in the latter case there is no 'yellow' light at all?
 
  • #4
Correct.

In the example of a spot on a wall illuminated by red and green lasers, there is no yellow frequency component to the reflected light. The only components are red and green. The same applies when you see some color that looks yellow on your computer monitor. There is no yellow frequency component to that light, yet it appears to be yellow to your eyes.
 
  • #5
D H said:
[..]

Example: Aim a yellow laser at a white wall and you will see a yellow spot on the wall. Even though our eyes don't have a detector for "yellow" light, we see "yellow" because the yellow light excites both the cones that detect low frequency (reddish) and mid frequency (greenish) light. The exact same response can be elicited by instead aiming a red laser and a green laser at the wall. The reflected light looks "yellow" to us even though there is no yellow component in that reflected light.

Why do you consider that different from playing say a sound containing 200, 300 and 400 Hz tones and hearing a sound that has the same pitch as a 100 Hz tone?
 
  • #6
atyy said:
Why do you consider that different from playing say a sound containing 200, 300 and 400 Hz tones and hearing a sound that has the same pitch as a 100 Hz tone?
I would consider that to be fundamentally different from seeing yellow in response to a combination of red and green light, or from seeing yellow in response to a yellow light.

The yellow responses happen at the lowest level of seeing, the responses of the cones to the incoming light. There is no difference at the lowest level of seeing between a pure spectral yellow and an equivalent combination of pure spectral red + pure spectral green. Now consider your example of hearing the missing fundamental. Hearing the missing fundamental is a high-level response. It happens in the brainstem and in the brain, not in the inner ear.

Note that the yellow we perceive in response to a red laser and a green laser pointed at the same spot on a wall is intermediate in frequency between that of the red and green sources. In comparison, the missing fundamental we hear is lower in frequency than any of the source tones. Another difference: we still hear those 200, 300, and 400 Hz tones. We don't see red+green. We see yellow.
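The missing-fundamental example can be checked directly: the sum of 200, 300 and 400 Hz tones is periodic at their greatest common divisor, 100 Hz, even though no 100 Hz component is present. A short numpy sketch:

```python
import numpy as np

# A chord of 200, 300 and 400 Hz tones. Their greatest common divisor
# is 100 Hz, so the summed waveform repeats every 1/100 s, which is
# why the ear assigns it the pitch of the absent 100 Hz fundamental.
fs = 48_000                        # sample rate, Hz
t = np.arange(0, 0.1, 1 / fs)      # 100 ms of signal
signal = sum(np.sin(2 * np.pi * f * t) for f in (200, 300, 400))

period = 1 / 100                   # candidate period: 10 ms
shift = int(round(period * fs))    # 480 samples

# The waveform shifted by 10 ms matches itself exactly...
print(np.allclose(signal[:-shift], signal[shift:]))            # True

# ...but a 5 ms shift (the 200 Hz period) does not, because the
# 300 Hz component flips sign over half its odd number of cycles.
print(np.allclose(signal[:-shift // 2], signal[shift // 2:]))  # False
```

So the 100 Hz pitch is real structure in the waveform; what is "invented" is the brain's decision to report that period as the pitch.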
 
  • #7
D H said:
I would consider that to be fundamentally different from seeing yellow in response to a combination of red and green light, or from seeing yellow in response to a yellow light.

But the problem here is that you have picked out the case of yellow which is precisely the one where colour experience is constructed from the weaker responses of two receptors. The other three primaries are identified with the peak responses of "their" three receptors. Yellow is the primary without its own receptor.

So your general point - colour experience is constructed from the mixture of activity - is correct. But yellow is the example which exaggerates the case. It is not the typical story.

Primate colour vision has only three cones, but needs four primaries to allow opponent-channel processing. Red has green as its opponent. But blue needed the invention of yellow, and that happens by the neural mixing of red and green responses.

The essential point is important though - vision samples the world in a remarkably narrow way. Just three cone pigments to conjure up a million hues. Whereas hearing is more a continuous map of its frequency spectrum.

This seems important for the psychology of perception - the contrasting nature of the qualia. All sounds sound much the same (when it comes to pure notes). But all colours seem strongly different (even as pure frequencies).

This would seem to be directly because colours are much more "invented".

It's an interesting question whether any kind of opponent channel process even applies in hearing. It must in some form, given the brain relies on creating just such perceptual contrasts. But I don't think it applies in such a fundamental way as it does in colour vision - which is more invented right from the start.
 
  • #8
entropy1 said:
How is it possible that when one watches a color television screen, one perceives the three base colors of the pixels as if it were a mix tone of these base frequencies, proportional to the relative intensity of the different pixels? It puzzles me! Hope someone knows how this is possible.

Missing from the discussion is the fact that your eye does not resolve the individual pixels. If it did (for example, watching TV through a magnifying lens), you would indeed see each individual pixel, just like looking at a magnified color print reveals the individual dots of (discrete) colored ink.
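The resolution argument is easy to put numbers on. The eye resolves roughly 1 arcminute; a pixel subtending less than that fuses with its neighbours. The pixel pitch and viewing distances below are illustrative assumptions (about 0.315 mm is the pitch of a 55-inch 4K panel).

```python
import math

# The eye resolves roughly 1 arcminute (~0.00029 rad); a pixel
# subtending a smaller angle cannot be seen individually.
acuity_rad = math.radians(1 / 60)

def pixel_angle(pitch_m, distance_m):
    """Angle subtended by one pixel (small-angle approximation)."""
    return pitch_m / distance_m

pitch = 0.315e-3   # ~0.315 mm pixel pitch, roughly a 55" 4K TV
for d in (0.5, 1.0, 2.0, 3.0):
    resolved = pixel_angle(pitch, d) > acuity_rad
    print(f"{d:.1f} m: {'pixels visible' if resolved else 'pixels fuse'}")
```

This is why the magnifying-lens test works: the lens enlarges the angle each pixel subtends past the acuity limit, and the subpixel structure reappears.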
 
  • #9
Andy Resnick said:
Missing from the discussion is the fact that your eye does not resolve the individual pixels. If it did (for example, watching TV through a magnifying lens), you would indeed see each individual pixel, just like looking at a magnified color print reveals the individual dots of (discrete) colored ink.
That is probably exactly what I mean: On what level and how are the colors blended together?
 
  • #10
Your lens isn't perfect; it can't be, because its size is finite. This means that a pure point source will not focus to a point on the retina. The image will instead be a spot with a non-zero size. Pack a bunch of near-point sources together and your eye will not be able to distinguish the individual sources.
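The size of that spot is set by diffraction: a circular aperture of diameter D spreads a point source into a disk of angular radius about 1.22 λ/D. Plugging in a typical daylight pupil diameter and a green wavelength (both illustrative values):

```python
import math

# Diffraction limit of a finite aperture: even a perfect lens spreads
# a point source into a spot of angular radius ~1.22 * lambda / D.
wavelength = 550e-9   # green light, m
pupil = 3e-3          # typical daylight pupil diameter, m

theta = 1.22 * wavelength / pupil     # radians
arcmin = math.degrees(theta) * 60
print(f"diffraction-limited spot: {arcmin:.2f} arcmin")
```

The result is a bit under 1 arcminute, which is why the eye's actual acuity of about 1 arcminute is close to the best physics allows for a pupil of that size.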
 
  • #11
entropy1 said:
That is probably exactly what I mean: On what level and how are the colors blended together?

You want a simple answer to a complex question. And first you have to separate the issue of optic resolution from colour perception.

Speaking crudely, colour perception takes place in V4, a higher cortical region of the brain. It is not until you go a long way up the processing hierarchy that the responses of neurons correlate with perceived hue rather than with wavelength information.

The brain in fact is discounting actual wavelength information by this stage, as illustrated by the Land effect - http://en.wikipedia.org/wiki/Color_constancy

Your eyes see one thing, and your brain corrects so you see the hue that is "really there". :wink:

Resolution is a second issue - at what point does vision fuse/discriminate these "pixels of information". And again the answer is as much psychological as mechanical. See for example phi effects - http://en.wikipedia.org/wiki/Phi_phenomenon

Visual experience is surely the single most complicated process in the known universe. In your first five minutes of studying psychophysics, you should learn that the eye is just like a camera. Then you should spend the rest of your career understanding all the ways it is in fact not.

But essentially, the blending of wavelength information is not a matter of shedding information (mechanical blurring due to lossy resolution) but the synthesis of a rich experience from a surprisingly narrow sampling of the available information.

And it is a beautiful case of less is more.

Well, the simplest case is single cone vision – monochromacy – which gives us 200 shades of gray. We can distinguish that many luminance levels.

And two is better. Dichromacy, employing a long wave and short wave cone, swells our visual experience geometrically to about 10,000 distinguishable shades.

But three gives us a vast range of easily discriminated hues. Trichromacy, the addition of a third red-green opponent channel, multiplies the total number of shades to several million.
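The combinatorics behind those figures: each added cone type multiplies, rather than adds to, the number of distinguishable shades. The naive product below overshoots the ~10,000 quoted for dichromacy (real channels are correlated and noisy), but it shows the geometric growth the post describes.

```python
# Each independent channel multiplies the count of distinguishable
# shades. 200 luminance steps per channel is the rough figure quoted.
levels_per_channel = 200

monochromacy = levels_per_channel        # one cone type
dichromacy = levels_per_channel ** 2     # two cone types
trichromacy = levels_per_channel ** 3    # three cone types

print(monochromacy)   # 200
print(dichromacy)     # 40000 -> same order as the ~10,000 quoted
print(trichromacy)    # 8000000 -> "several million"
```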
 
  • #12
And just to stir the pot some, :tongue:, like Aperion was saying, it's complex. It gets even more complex where women are concerned (familiar themes and all that, :biggrin:, no offense ladies), because the genes for the green and red cone pigments are carried on the X chromosome. It is possible for women to carry multiple green/red genes, each with a different peak absorbance. In effect, some women may have much greater discriminatory prowess where colors are concerned.

By the same unfortunate placement of pigment genes, this is also what leads to red-green colorblindness being so common in males :cry:

So the next time your lady tells you "that's not orange, its fulvous!" fellas, you should listen :rofl:
 

1. How do pixels work on a television screen?

Pixels on a television screen are tiny dots of color that combine to create an image. Each pixel is made up of three subpixels, one red, one green, and one blue. These subpixels can be turned on or off to create different colors and shades, which make up the image on the screen.

2. How many pixels are on a television screen?

The number of pixels on a television screen can vary depending on the resolution of the screen. However, a standard 1080p HD television typically has over 2 million pixels, while a 4K television can have over 8 million pixels.
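The arithmetic behind those figures:

```python
# Pixel counts for the two standard resolutions quoted above.
hd = 1920 * 1080    # "1080p" full HD
uhd = 3840 * 2160   # "4K" ultra HD

print(f"{hd:,}")    # 2,073,600 -> "over 2 million"
print(f"{uhd:,}")   # 8,294,400 -> "over 8 million"
```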

3. How does the eye perceive pixels on a television screen?

The eye perceives pixels on a television screen by detecting the different colors and shades that make up the image. The retina in the eye contains millions of light-sensitive cells called rods and cones, which work together to create an image from the light that enters the eye.

4. Why do pixels appear as squares on a television screen?

Pixels appear as squares on a television screen because they are arranged in a grid pattern, with each pixel representing a small square on the screen. This grid pattern is necessary for creating a clear and detailed image on the screen.

5. Can the number of pixels on a television screen affect the quality of the image?

Yes, the number of pixels on a television screen can significantly affect the quality of the image. The more pixels a screen has, the higher the resolution and the clearer and more detailed the image will be. Higher resolution screens are especially important for larger screens, as the image can become blurry or pixelated if there are not enough pixels to fill the screen.
