
B In night vision equipment, how does 'information' transfer?

  1. Aug 10, 2018 #1
    So I read a couple of explanations of how night-vision equipment works. Unfortunately I don't remember the sources, but my recollection/understanding is that infrared light is converted into electrons and then into visible light. So my question is: how does the information (by which I mean the shapes, the visible movement, and the shades) transfer between two different, though as I understand it linked, particles: the photon and the electron?
     
  3. Aug 10, 2018 #2

    davenn

    Science Advisor
    Gold Member
    2017 Award

    Pretty much the same way it does in any other digital imaging.
    Light falls on the image sensor, the resulting signal is digitised, and then it can be saved or reproduced on the camera's rear LCD panel.

    Do some googling on how a digital camera works.

    BTW, unless you really are a university postgraduate, I don't think your thread deserves an A tag :wink:

    and if you are a postgrad, then you should be well used to doing research on a subject :smile:

    Dave
     
  4. Aug 10, 2018 #3

    phinds

    Gold Member

    [Attached image: what he said (very small).jpg]
     
  5. Aug 10, 2018 #4
    I was unaware that digital cameras and night vision rest on the same concept, and I'll admit that I don't really understand how digital cameras work, so I'm unlikely to gravitate towards them very quickly. I filled out my educational history on my profile, so you could quite easily read it there. Because the concept seemed a lot more complicated to me than it actually is, I presumed it was at a higher level than it was.
     
  6. Aug 10, 2018 #5

    phinds

    Gold Member

    Yes, but you selected A for the subject level. We assume you know what you are doing when you make that selection, and thus there is no reason for us to look at your profile. I see that you're new, so I'm guessing that you are not yet used to the levels.
     
  7. Aug 10, 2018 #6

    Drakkith

    Staff Emeritus
    Science Advisor

    Essentially all digital imaging sensors use similar concepts. They absorb incoming photons and use that energy to make electrons do something. The details of how each type of sensor works are slightly different, but the basic idea is the same.

    There are two basic pieces for any imaging device. You have the optics portion, and you have the sensor portion. The optics portion is virtually the same for every device. A series of optical components (lenses and/or mirrors) gathers incoming light from the target region and focuses it so that light from any single point in object space (the region you're looking at with the optics) is concentrated at a single point on the image plane (or focal plane). This collection of points containing focused light is called an 'image'.

    The light then falls onto a sensing device placed at the image plane and is absorbed. The shape, color, and all other information you can gather from an image is contained in the image (though your sensor may not be able to record all of this different information at once). The sensor then takes that light and uses its energy to do something else. For most digital sensors, a small pixel measures the intensity of light falling on it. Fitting together the measurements from millions of pixels gives you a 'map' of the image. By 'map' I mean that the original information has been taken from the image and transferred to another medium, in this case a large array of intensity measurements, just like a map of a city contains the information about the position and layout of buildings, roads, and other features you would encounter if you went to that city. You can think of a digital image as a large Excel spreadsheet with each cell holding a number that indirectly represents the intensity of the light in that region of the image. A traditional photograph from film holds the information in the layout of the dyes used to make the image.
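
    To make that 'spreadsheet' picture concrete, here is a minimal sketch (the tiny 4x4 NumPy array and its intensity values are made up purely for illustration, not from any real sensor): a grayscale digital image is just a grid of numbers, and the position of each number in the grid mirrors the position of that patch of light on the sensor.

        # A minimal sketch of the "map" idea: a grayscale image as a 2D grid of
        # per-pixel intensity measurements. The values below are invented.
        import numpy as np

        # Each number is the light intensity one pixel measured (0 = dark, 255 = bright).
        image = np.array([
            [ 10,  12,  11,  10],
            [ 12, 200, 210,  11],
            [ 11, 205, 215,  12],
            [ 10,  11,  12,  10],
        ], dtype=np.uint8)

        # The bright 2x2 block in the middle "is" the object: its position in the
        # array mirrors its position on the sensor, which mirrors its position in the scene.
        row, col = 1, 2
        print(f"Pixel ({row}, {col}) intensity: {image[row, col]}")   # 210
        print(f"Brightest pixel value: {image.max()}")                # 215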

    The exact mechanisms that take the information contained in the light and move it over to electrons in a digital sensor vary, like I said above. But the most common sensors are CCD and CMOS sensors. I'll post a link to both of these devices below, along with a few other links that may help. Night-vision devices often use something called an image intensifier, which works a little differently from CCD and CMOS sensors.

    Here you are:

    https://en.wikipedia.org/wiki/Image_sensor
    https://en.wikipedia.org/wiki/Digital_imaging
    https://en.wikipedia.org/wiki/Charge-coupled_device
    https://en.wikipedia.org/wiki/Active_pixel_sensor
    https://en.wikipedia.org/wiki/Image_intensifier
    https://www.explainthatstuff.com/hownightvisionworks.html
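
    And, as a rough toy model of the image-intensifier idea behind the original question (all of the numbers below, the quantum efficiency, the microchannel-plate gain, and the phosphor yield, are invented just to show the structure): each pixel-sized patch of the photocathode converts its incoming photons to electrons, the microchannel plate multiplies those electrons, and the phosphor screen turns them back into visible photons at roughly the same position. The spatial pattern, i.e. the 'information', is preserved while the carrier changes from photon to electron and back.

        # Toy, per-pixel model of an image intensifier (illustrative numbers only):
        # photons -> photoelectrons at the photocathode -> electron multiplication
        # in the microchannel plate -> visible photons at the phosphor screen.
        # Each conversion happens at (roughly) the same (x, y) spot, so the shape
        # of the scene survives even though the "carrier" particle changes.
        import numpy as np

        rng = np.random.default_rng(0)

        # Incoming dim scene: photon counts per pixel (made-up values).
        photons_in = np.array([
            [2,  3, 2],
            [3, 40, 3],   # one brighter feature in the middle
            [2,  3, 2],
        ])

        QE = 0.3        # assumed photocathode quantum efficiency (photons -> electrons)
        MCP_GAIN = 1e4  # assumed electron gain of the microchannel plate
        PHOSPHOR = 0.1  # assumed visible photons emitted per electron at the screen

        photoelectrons = rng.binomial(photons_in, QE)   # photon -> electron, pixel by pixel
        multiplied = photoelectrons * MCP_GAIN          # electron avalanche, still pixel by pixel
        photons_out = multiplied * PHOSPHOR             # electron -> visible photon at the screen

        print("Input photons:\n", photons_in)
        print("Output (visible) photons:\n", photons_out)
        # The bright spot stays in the same place; only its intensity has been amplified.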
     