Cameras detecting Infrared and making it visible

In summary, an infrared camera produces an image visible to the viewer from the photocurrents that are generated when infrared radiation is absorbed in its detector array.
  • #1
DaTario
Hi All,

It is known that ordinary cameras can make the infrared radiation from remote controls visible on their own displays. My question is about the explanation of this fact. Is the electronic process simple to describe? It seems that the infrared can produce photocurrents which are involved in producing the RGB intensities.

Best wishes,

DaTario
 
  • Like
Likes OmCheeto
  • #2
Instead of using an array of silicon photodiodes, an infrared camera will typically use an array of InSb (indium antimonide) photodiodes, because silicon does not respond in the infrared (except in the near IR, where most thermal sources do not radiate appreciably until they become very hot). InSb responds to the mid IR. The electronic process for both types of photodiodes is nearly the same. For longer IR wavelengths, HgCdTe (mercury cadmium telluride) photodiodes are sometimes used. There are a couple of other IR photodiodes that can be used (Ge, InAs, etc.), but the ones above are the most common.
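As a rough illustrative lookup of the material choices described above (the band edges below are approximate, and Ge, InAs, etc. are omitted for brevity):

```python
# Rough lookup sketch of the detector-material choices described above.
# The band edges are approximate and for illustration only.
def typical_detector(wavelength_um):
    if wavelength_um < 1.1:
        return "Si"       # visible and near IR; silicon cuts off near ~1.1 microns
    elif wavelength_um < 5.5:
        return "InSb"     # mid-wave IR
    else:
        return "HgCdTe"   # longer IR wavelengths

print(typical_detector(0.94))  # near-IR remote-control LED  -> Si
print(typical_detector(4.0))   # mid-wave IR                 -> InSb
print(typical_detector(10.0))  # long-wave IR                -> HgCdTe
```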
 
  • #3
Ok. But I would like to know how the infrared radiation is transported into visible light inside the electronic apparatus.
 
  • #4
DaTario said:
Ok. But I would like to know how the infrared radiation is transported into visible light inside the electronic apparatus.
It isn't 'converted' to visible light. The infrared radiation consists of photons of a lower frequency than visible light. Assuming these IR sensors are similar to regular CCD sensors, these photons are absorbed inside each pixel on the sensor and excite electrons from one area of the pixel to another. These electrons are then "read out" and the voltage/current amplified to produce a discrete value for each pixel. This value is what is ultimately used to produce the corresponding image, with lighter areas corresponding to higher electron counts than darker areas. An image is then created on a screen where the brightness of each pixel in the image roughly corresponds to the electron count.

The process is obviously more complicated, but that's the basic idea. At least for CCD and CCD-like sensors. Depending on the frequency range you're interested in, your sensor may operate differently than a CCD sensor.
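A minimal sketch of that readout idea, with made-up electron counts, just to show how per-pixel counts become display brightness (NumPy is assumed for convenience):

```python
import numpy as np

# Minimal sketch of the readout idea above (the counts are made up):
# each pixel accumulates an electron count roughly proportional to the
# absorbed photons, and the counts are scaled to gray levels for display.
electron_counts = np.array([
    [1200,  300,   50],
    [ 800, 4000,  150],
    [  60,  900, 2500],
])

# Scale linearly to 8-bit brightness; more electrons -> lighter pixel.
gray = (255 * electron_counts / electron_counts.max()).astype(np.uint8)
print(gray)
```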
 
  • Like
Likes Charles Link and OmCheeto
  • #5
DaTario said:
Hi All,

It is known that ordinary cameras can make the infrared radiation from remote controls visible on their own displays. My question is about the explanation of this fact. Is the electronic process simple to describe? It seems that the infrared can produce photocurrents which are involved in producing the RGB intensities.

Best wishes,

DaTario

Silicon, the material commonly used to make photodetectors in cameras, absorbs light out to about 1 micron wavelength, beyond the visible portion of the spectrum. Similarly, the Bayer filters used by manufacturers to create color images may allow some long-wavelength light to pass. Remote controls typically use diodes that emit light around 880 nm or 950 nm, so that light can be detected by cameras. Because of this, most camera manufacturers add a 'short pass' filter (or 'IR cut filter') to the sensor chip that reflects wavelengths longer than about 750 nm.

Going further out into the IR, to the mid-wave (3-5 microns) and long-wave (8-12 microns) bands, requires different materials for the photodetectors: InGaAs, HgCdTe and PtSi are some common ones. They also have to be cooled...
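As a rough sketch of why a remote's LED shows up on some cameras but not others (the cutoff values below are the approximate figures quoted in this post, not exact specifications for any particular camera):

```python
# Hedged sketch: does a silicon camera sensor register a given wavelength?
# Cutoffs are the approximate values quoted above, for illustration only.
SILICON_CUTOFF_NM = 1000   # silicon stops responding around ~1 micron
IR_CUT_FILTER_NM = 750     # typical 'short pass' / IR-cut filter edge

def camera_sees(wavelength_nm, has_ir_cut_filter):
    """Return True if a silicon sensor would register this wavelength."""
    if wavelength_nm > SILICON_CUTOFF_NM:
        return False                      # beyond silicon's response
    if has_ir_cut_filter and wavelength_nm > IR_CUT_FILTER_NM:
        return False                      # blocked by the IR-cut filter
    return True

print(camera_sees(940, has_ir_cut_filter=False))  # True: remote LED visible on an unfiltered camera
print(camera_sees(940, has_ir_cut_filter=True))   # False: blocked by a good IR-cut filter
```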
 
  • Like
Likes OmCheeto and Charles Link
  • #6
It may be worth mentioning that the creation of the image by the "focal plane array" of photodiodes in any camera is a simple optical process: a converging lens is placed in front of the array, and an image of the scene being observed comes to focus in the plane in which the array of photodiodes (each photodiode is a pixel) resides. If the scene is far away, it is focused at the focal length of the lens. For a closer object distance ## b ##, it is often necessary to refocus the lens. You may have seen the formula ## \frac{1}{f}=\frac{1}{b}+\frac{1}{m} ##, which relates the object distance ## b ## and the image distance ## m ## at which focusing occurs (instead of being placed at a distance ## f ##, the array of photodiodes needs to be placed a distance ## m ## from the lens). The photodiodes typically respond with an amount of photocurrent that is proportional to the incident energy/power on a given pixel, so it is very easy to create a black-and-white type image that looks just like the scene being observed; in the case of the infrared, that image will show contrast depending on the temperature and emissivity (basically the radiated IR) of the different regions of the scene.

Perhaps it is also worthwhile to mention that, as with any camera, if you can read the pixels quickly enough, you can make video from the images, where the frame rate is typically 60 Hz.
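A quick worked example of that formula, with made-up numbers: for ## f = 50 \text{ mm} ## and an object at ## b = 2000 \text{ mm} ##, the formula gives ## m = \frac{fb}{b-f} = \frac{50 \times 2000}{1950} \approx 51.3 \text{ mm} ##, so the photodiode array needs to sit about 1.3 mm farther from the lens than it would for a very distant scene.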
 
Last edited:
  • #7
I very much appreciate all the contributions. With regard to "It isn't 'converted' to visible light." from Drakkith: I see no reason why the word conversion should be avoided, since we are facing a process where energy enters as infrared and appears as visible light at the end. It is certainly not the "up-conversion" mentioned specifically in nonlinear optics (or quantum optics). However, I must add that I used the vague word "transported" so as to avoid forcing the answer in any direction. Are there three photodiodes in these arrays, one for each channel of the RGB scheme? Is each of them covered with a filter so that it responds basically to its particular range?
Example: one very small photodiode in a smartphone's CCD is the red photodiode, which has a filter that allows only the lower-frequency part of the visible spectrum to pass. Another very small one is the green photodiode, and there is also a blue one.

Thank you all,

Best Regards,

DaTario
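For reference, the arrangement asked about here is essentially the Bayer color-filter mosaic mentioned in post #5. A minimal sketch of one common layout (the RGGB tile below is illustrative only; real sensors vary):

```python
# Rough sketch of a Bayer color-filter mosaic: each photodiode sits under a
# single red, green, or blue filter, and the missing colors at each pixel are
# later interpolated ("demosaiced") from neighboring pixels.
def bayer_filter_color(row, col):
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    else:
        return "G" if col % 2 == 0 else "B"

for r in range(4):
    print(" ".join(bayer_filter_color(r, c) for c in range(4)))
# R G R G
# G B G B
# R G R G
# G B G B
```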
 
  • Like
Likes Charles Link
  • #9
Most applications probably use false color. Just mapping different intensities to different colors.
 
  • Like
Likes Charles Link
  • #10
Khashishi said:
Most applications probably use false color. Just mapping different intensities to different colors.

For a specialised FLIR camera, yes, that is true,
but in the case of the OP using an ordinary camera, this isn't the situation.
 
  • #11
davenn said:
For a specialised FLIR camera, yes, that is true,
but in the case of the OP using an ordinary camera, this isn't the situation.
The simplest IR camera is just going to give black and white, basically gray levels that vary with intensity. Especially with digital-type cameras, these can easily be programmed for "false" color, as @Khashishi pointed out (e.g. high intensities could be given a blue color, and the highest intensities could be a bright blue, while less intense regions could be made red and very low intensities could be made a dull red; this is an example of a "false" color display that could easily be programmed into the data from the pixels). Only in a rather sophisticated IR camera would there be any sorting of colors to correspond to actual IR wavelengths or groups of wavelengths. That type of camera would normally require multiple detector arrays with different IR filters in front of each detector array to determine how much energy is at various groups of wavelengths.
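A minimal sketch of that kind of intensity-to-color mapping (the breakpoints and colors are arbitrary illustration values, just mirroring the example described above):

```python
# Minimal "false color" sketch: a gray-level intensity (0..255) is mapped to an
# (R, G, B) color, with the brightest regions shown blue and the dimmest shown
# dull red.  The breakpoints are arbitrary, not anything a real camera uses.
def false_color(intensity):
    if intensity > 200:
        return (0, 0, 255)      # highest intensities: bright blue
    elif intensity > 128:
        return (80, 80, 200)    # high intensities: blue
    elif intensity > 64:
        return (200, 60, 60)    # lower intensities: red
    else:
        return (90, 20, 20)     # very low intensities: dull red

print(false_color(230))  # (0, 0, 255)
print(false_color(40))   # (90, 20, 20)
```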
 
  • #12
Charles Link said:
The simplest IR camera is just going to give black and white, basically gray levels that vary with intensity. Especially with digital-type cameras, these can easily be programmed for "false" color, as @Khashishi pointed out (e.g. high intensities could be given a blue color, and the highest intensities could be a bright blue, while less intense regions could be made red and very low intensities could be made a dull red; this is an example of a "false" color display that could easily be programmed into the data from the pixels). Only in a rather sophisticated IR camera would there be any sorting of colors to correspond to actual IR wavelengths or groups of wavelengths. That type of camera would normally require multiple detector arrays with different IR filters in front of each detector array to determine how much energy is at various groups of wavelengths.
You are not telling me anything I don't already know.
You seem to be missing the whole point of the original post and, along with others, are making the explanation more complex than what the OP was wanting or needing.

Dave
 
  • Like
Likes nasu
  • #13
  • Like
Likes davenn
  • #14
Hi all, posting this here instead of starting a new thread as it looks like this OP is somewhat similar to my question:

If you push buttons and then look at the business end of a TV remote control, you will see nothing from the LEDs at the end (IR is invisible to our eyes). However, if you look at those same LEDs through a digital camera (such as an older iPhone; not all cameras will work for this trick), you will see the LEDs flicker when buttons are pressed on the remote (the IR light has been made visible to our eyes). Notice I mentioned an older iPhone: this used to work on my older phone, but with my iPhone 6 it does not work. I used to work in the AV industry, and I would use this trick to test IR outputs of control systems to make sure they were actually outputting a signal.

Question:
Is it simply that the camera on the phone was just barely sensitive to IR, or is there another interaction between the LEDs and the optical sensor on the camera that allowed me to see the IR signal using the camera?

Thanks in advance!
 
  • #15
Stickman76 said:
Is it simply that the camera on the phone was just barely sensitive to IR, or is there another interaction between the LEDs and the optical sensor on the camera that allowed me to see the IR signal using the camera?
I don't know the wavelengths involved, but most camera sensors are sensitive to the near-IR (just beyond the visible) and are equipped with IR-blocking filters to prevent that from distorting the colors in photos.
 
  • Like
Likes davenn
  • #16
Interesting. So it's possible that the phone's camera is just barely sensitive to the near-IR, and maybe the IR filters were not as good on the old detectors? That would explain why the old phone would do the trick but not the new ones. Thank you!
 
  • #17
A standard CMOS or CCD sensor can easily detect the IR signal coming from a remote-control LED. Cheap cameras don't block IR wavelengths, so you can use the phone's camera as an IR sensor. More sophisticated cameras put an IR-blocking filter over the sensor to keep the IR from distorting the colors. Perhaps newer iPhone models use such a filter.
 
  • Like
Likes Stickman76
  • #18
Thanks Gordianus! Now it all makes sense. I doubted it could be some funky interaction and this is a straightforward explanation that certainly works.
 

1. How do cameras detect infrared?

Cameras detect light with image sensors, typically charge-coupled devices (CCDs) or CMOS arrays. These sensors are made up of many tiny pixels that convert light into electrical signals. When infrared light within the sensor's response range hits a pixel, it produces an electrical charge (a photocurrent) that is read out and registered as part of the image.

2. Can all cameras detect infrared?

Not all cameras can capture all of the infrared spectrum. Ordinary digital cameras use silicon sensors that respond to near-infrared light (out to roughly 1 micron), which is why many of them can pick up the near-IR from a remote control, although most include an IR-cut filter that blocks the bulk of it. Capturing mid- or long-wave (thermal) infrared requires specialized cameras with detector materials such as InSb or HgCdTe.

3. How are infrared images made visible?

Infrared images are made visible by mapping the detected intensities to gray levels or, with false-color mapping, to a color palette. Each pixel's brightness or color then indicates how much infrared was received there, so the invisible infrared signal is displayed as a visible image.

4. What are some practical applications of cameras detecting infrared?

Cameras detecting infrared have many practical applications, including night vision, thermal imaging, medical imaging, and security and surveillance. They are also used in astronomy to capture images of objects that emit infrared light, such as stars and galaxies.

5. Are there any potential health risks associated with infrared imaging?

In general, infrared imaging is considered safe for humans. However, prolonged exposure to high levels of infrared radiation can cause damage to the eyes and skin. It is important to use protective gear and limit exposure when working with infrared cameras to prevent any potential health risks.
