
The eye and pixels on a television screen

  1. Dec 11, 2011 #1
    How is it possible that, when one watches a color television screen, one perceives the three base colors of the pixels as if they were a single mixed tone of these base frequencies, proportional to the relative intensities of the different pixels? It puzzles me! Hope someone knows how this is possible.
     
  3. Dec 11, 2011 #2

    D H

    User Avatar
    Staff Emeritus
    Science Advisor

    Your eye doesn't see the visible spectrum, at least not in the sense that your ear hears the auditory spectrum. Your ear has tens of thousands of tiny hairs that vibrate in response to a fairly narrow part of the sound spectrum. Your eye has three kinds of cones. While the three types of cones do have distinct peak response areas (roughly, red, green, and blue), the frequency responses overlap a great deal. As a result, we don't see colors to anywhere close to the fidelity that we hear sound frequencies.

    Example: Aim a yellow laser at a white wall and you will see a yellow spot on the wall. Even though our eyes don't have a detector for "yellow" light, we see "yellow" because the yellow light excites both the cones that detect low frequency (reddish) and mid frequency (greenish) light. The exact same response can be elicited by instead aiming a red laser and a green laser at the wall. The reflected light looks "yellow" to us even though there is no yellow component in that reflected light.
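    This metamerism can be sketched numerically. The sketch below is not from the thread: it assumes idealized Gaussian cone sensitivities with made-up peak wavelengths and widths (real cone curves are asymmetric and their peaks vary between individuals), but it shows how a red+green mixture can reproduce the cone excitations of a single yellow wavelength:

    ```python
    import numpy as np

    def cone(wl, peak, width=60.0):
        """Idealized Gaussian cone sensitivity (real curves are asymmetric)."""
        return np.exp(-((wl - peak) / width) ** 2)

    # Hypothetical peak wavelengths (nm) for the L ("red") and M ("green") cones
    L_PEAK, M_PEAK = 565.0, 535.0

    def response(wl, intensity=1.0):
        """(L, M) cone excitation for monochromatic light of a given intensity."""
        return np.array([intensity * cone(wl, L_PEAK),
                         intensity * cone(wl, M_PEAK)])

    yellow = response(580.0)  # a single yellow laser

    # Solve for red+green intensities that give the same (L, M) excitation pair
    A = np.column_stack([response(650.0), response(530.0)])
    red_i, green_i = np.linalg.solve(A, yellow)
    mix = response(650.0, red_i) + response(530.0, green_i)

    # Same cone excitations, completely different spectra
    print(np.allclose(mix, yellow))  # True
    ```

    The two stimuli are physically different, but since the cones deliver only this excitation pair to the rest of the visual system, the two lights are indistinguishable.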
     
  4. Dec 11, 2011 #3
    So if I understand you correctly, yellow light excites both red and green cones, just as red light combined with green light does, while in the latter case there is no 'yellow' light?
     
  5. Dec 12, 2011 #4

    D H

    User Avatar
    Staff Emeritus
    Science Advisor

    Correct.

    In the example of a spot on a wall illuminated by red and green lasers, there is no yellow frequency component to the reflected light. The only components are red and green. The same applies when you see some color that looks yellow on your computer monitor. There is no yellow frequency component to that light, yet it appears to be yellow to your eyes.
     
  6. Dec 12, 2011 #5

    atyy

    User Avatar
    Science Advisor

    Why do you consider that different from playing, say, a sound containing 200, 300 and 400 Hz tones and hearing a sound that has the same pitch as a 100 Hz tone?
     
    Last edited: Dec 12, 2011
  7. Dec 12, 2011 #6

    D H

    User Avatar
    Staff Emeritus
    Science Advisor

    I would consider hearing that missing fundamental to be fundamentally different from seeing yellow in response to a combination of red and green light rather than a pure yellow light.

    The yellow responses happen at the lowest level of seeing, the responses of the cones to the incoming light. There is no difference at the lowest level of seeing between a pure spectral yellow and an equivalent combination of pure spectral red + pure spectral green. Now consider your example of hearing the missing fundamental. Hearing the missing fundamental is a high-level response. It happens in the brainstem and in the brain, not in the inner ear.

    Note that the yellow we perceive in response to a red laser and a green laser pointed at the same spot on a wall is intermediate in frequency between that of the red and green sources. In comparison, the missing fundamental we hear is lower in frequency than any of the source tones. Another difference: we still hear those 200, 300, and 400 Hz tones. We don't see red+green. We see yellow.
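    The missing-fundamental example can be made concrete with a short numerical check. This sketch (my own illustration, not from the thread) sums the 200, 300, and 400 Hz partials and verifies that the waveform repeats every 10 ms, i.e. with the period of a 100 Hz tone, even though the spectrum contains no 100 Hz component:

    ```python
    import numpy as np

    fs = 48000                         # sample rate (Hz)
    t = np.arange(0, 0.1, 1 / fs)      # 100 ms of signal
    partials = [200, 300, 400]         # harmonics of a missing 100 Hz fundamental
    s = sum(np.sin(2 * np.pi * f * t) for f in partials)

    # The waveform repeats every 1/100 s, the period the brain assigns a pitch to
    shift = fs // 100                  # samples in one 10 ms period
    print(np.allclose(s[:-shift], s[shift:]))  # True

    # Yet the spectrum has no energy at 100 Hz (bin spacing here is 10 Hz)
    spec = np.abs(np.fft.rfft(s))
    print(spec[10] < 1e-6 * spec[20])  # True: nothing at 100 Hz vs 200 Hz
    ```

    The 100 Hz pitch is inferred from this periodicity by higher-level auditory processing; it is not present in the stimulus itself.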
     
  8. Dec 12, 2011 #7

    apeiron

    User Avatar
    Gold Member

    But the problem here is that you have picked out the case of yellow which is precisely the one where colour experience is constructed from the weaker responses of two receptors. The other three primaries are identified with the peak responses of "their" three receptors. Yellow is the primary without its own receptor.

    So your general point - colour experience is constructed from the mixture of activity - is correct. But yellow is the example which exaggerates the case. It is not the typical story.

    Primate colour vision has only three cones, but needs four primaries to allow opponent-channel processing. Red is opposed by green. But blue needed the invention of yellow as its opponent. And that happens through the neural mixing of the red and green responses.

    The essential point is important though - vision samples the world in a remarkably narrow way. Just three cone pigments to conjure up a million hues. Whereas hearing is more a continuous map of its frequency spectrum.

    This seems important for the psychology of perception - the contrasting nature of the qualia. All sounds sound much the same (when it comes to pure notes). But all colours seem strongly different (even as pure frequencies).

    This would seem to be directly because colours are much more "invented".

    It's an interesting question whether any kind of opponent channel process even applies in hearing. It must in some form, given the brain relies on creating just such perceptual contrasts. But I don't think it applies in such a fundamental way as it does in colour vision - which is more invented right from the start.
     
  9. Dec 13, 2011 #8

    Andy Resnick

    User Avatar
    Science Advisor
    Education Advisor
    2016 Award

    Missing from the discussion is the fact that your eye does not resolve the individual pixels. If it did (for example, watching TV through a magnifying lens), you would indeed see each individual pixel, just like looking at a magnified color print reveals the individual dots of (discrete) colored ink.
     
  10. Dec 13, 2011 #9
    That is probably exactly what I mean: On what level and how are the colors blended together?
     
  11. Dec 13, 2011 #10

    D H

    User Avatar
    Staff Emeritus
    Science Advisor

    Your lens isn't perfect; it can't be, because its size is finite. This means that a pure point source will not focus to a point on the retina. The image will instead be a spot with a non-zero size. Pack a bunch of near-point sources close together and your eye will not be able to distinguish the individual sources.
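    A rough worked example of this limit (my own numbers, not from the thread, using round hypothetical values: green light, a 3 mm pupil, a viewer 3 m from the screen):

    ```python
    # Diffraction-limited angular resolution via the Rayleigh criterion,
    # theta = 1.22 * wavelength / aperture. All numbers are illustrative.
    wavelength = 550e-9   # m, green light
    pupil = 3e-3          # m, a typical daytime pupil diameter
    distance = 3.0        # m, assumed viewing distance

    theta = 1.22 * wavelength / pupil   # radians
    separation = theta * distance       # smallest resolvable gap at that distance

    print(f"{theta:.2e} rad -> {separation * 1e3:.2f} mm")  # about 0.67 mm
    ```

    Pixel pitch on a typical living-room TV is of roughly this order or smaller, and the sub-pixels are smaller still, so at normal viewing distances they fall below this limit and their light blends on the retina. (The real eye is aberration-limited as well as diffraction-limited, so the practical figure is somewhat worse.)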
     
  12. Dec 13, 2011 #11

    apeiron

    User Avatar
    Gold Member

    You want a simple answer to a complex question. And first you have to separate the issue of optic resolution from colour perception.

    Speaking crudely, colour perception takes place in V4, a higher cortical region of the brain. It is not until you go a long way up the processing hierarchy that the responses of neurons correlate with perceived hue rather than with wavelength information.

    The brain in fact is discounting actual wavelength information by this stage, as illustrated by the Land effect - http://en.wikipedia.org/wiki/Color_constancy

    Your eyes see one thing, and your brain corrects so you see the hue that is "really there". :wink:

    Resolution is a second issue - at what point does vision fuse/discriminate these "pixels of information". And again the answer is as much psychological as mechanical. See for example phi effects - http://en.wikipedia.org/wiki/Phi_phenomenon

    Visual experience is surely the single most complicated process in the known universe. In your first five minutes of studying psychophysics, you should learn that the eye is just like a camera. Then you should spend the rest of your career understanding all the ways it is in fact not.

    But essentially, the blending of wavelength information is not a matter of shedding information (mechanical blurring due to lossy resolution) but the synthesis of a rich experience from a surprisingly narrow sampling of the available information.

    And it is a beautiful case of less is more.

    Well, the simplest case is single cone vision – monochromacy – which gives us 200 shades of gray. We can distinguish that many luminance levels.

    And two is better. Dichromacy, employing a long wave and short wave cone, swells our visual experience geometrically to about 10,000 distinguishable shades.

    But three gives us a vast range of easily discriminated hues. Trichromacy, the addition of a third red-green opponent channel, multiplies the total number of shades to several million.
     
  13. Dec 13, 2011 #12

    bobze

    User Avatar
    Science Advisor
    Gold Member

    And just to stir the pot some, :tongue:, like apeiron was saying, it is complex. It gets even more complex where women are concerned (familiar themes and all that, :biggrin:, no offense ladies), because the genes for the green and red cones are carried on the X chromosome. It is therefore possible for some women to carry multiple green/red genes, each with a different peak absorbance. In effect, some women may have much greater discriminatory prowess where colors are concerned.

    By the same unfortunate placement of the pigment genes, this is also why red-green colorblindness is so common in males :cry:

    So the next time your lady tells you "that's not orange, it's fulvous!" fellas, you should listen :rofl:
     