Bayer Filter Effects: Image Comparisons

In summary: with a Bayer-filter camera, sampling artifacts are easiest to generate using a low-magnification, high numerical aperture lens; a monochrome camera with pixels small enough to properly sample the point spread function avoids them. The Exmor does not appear to produce color artifacts, even with narrowband illumination.
  • #1
Andy Resnick
I mentioned earlier that I dropped the idea of posting the lens-testing lab I am developing for class, because the technical details got out of control- but I never posted the images that led me to that conclusion.

Here are four images of the central region of a Richardson test slide, taken with: 1) a Point Grey Research 'Flea' camera, a monochrome camera with pixels 4.6 microns on a side; 2) a QImaging Rolera-MGi Plus, an EMCCD monochrome camera with pixels 15 microns on a side; and 3) and 4) RAW and JPG images from the Sony Exmor chip, which has a Bayer filter. All in-camera processing (sharpening, contrast, etc.) was turned off.

The test slide is a black-and-white transmission object (chrome on glass) and was illuminated at full condenser aperture. The pattern was carefully aligned to the pixels in order to emphasize the sampling artifacts. The Flea outputs 8-bit BMPs, the QImaging camera outputs 16-bit TIFFs, and the Sony RAW was converted via GIMP 2; no image was adjusted other than cropping.

To review, the Bayer filter is a color filter array printed directly on the chip to enable color imaging using a single sensor:

http://en.wikipedia.org/wiki/Bayer_filter

I was trying to generate artifacts due to the pixels; this is most easily done with a low-magnification, high numerical aperture lens, so I used my 10X NA 0.3 objective. The size of the Airy disk, at the image plane, is about 10 microns. This means the QImaging camera is not appropriate: its pixel size is larger than the PSF, so there will be errors due to undersampling the image.
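A back-of-the-envelope check of this sampling argument (my numbers, not necessarily the exact convention used above- I assume green light at 550 nm and quote the Rayleigh first-zero radius, which comes out near 11 microns at the image plane):

```python
# Rayleigh (first-zero) radius of the Airy pattern, referred to the
# image plane of a microscope objective. Assumed: 550 nm illumination.
def airy_radius_image_um(wavelength_um, na, magnification):
    """First-zero radius r = 0.61 * lambda / NA, scaled by magnification."""
    return 0.61 * wavelength_um / na * magnification

r = airy_radius_image_um(0.55, 0.3, 10)   # ~11 microns for the 10X NA 0.3 lens
for name, pixel_um in [("Flea", 4.6), ("Rolera-MGi", 15.0)]:
    # Rough Nyquist rule of thumb: pixel pitch at most ~half the PSF radius.
    # The 4.6 um Flea pixels pass; the 15 um EMCCD pixels clearly fail.
    print(f"{name}: pixel {pixel_um} um, PSF radius {r:.1f} um, "
          f"adequately sampled: {pixel_um <= r / 2}")
```

The exact number depends on wavelength and on whether one quotes radius, diameter, or FWHM, but the conclusion is the same: the 15-micron pixels undersample this PSF.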

I enlarged each image 8x (no interpolation) to more easily visualize the individual pixels, and uploaded the images to imageshack as TIFs, because I recently realized that imageshack compresses JPGs on its own. Hopefully the images will be displayed here as I see them on my own display...

Ok- here's the Flea camera:

http://img249.imageshack.us/img249/6112/flea1.png

(note- this looks correct, so I think all the images will come across as intended)

And here's the QImaging:

http://img406.imageshack.us/img406/8496/qimaging1.png

The undersampling artifacts are clear: not only are the test bars unresolved, but the top row and right-most column of bars are not of equal intensity. Those bars are 4 microns across, incommensurate with the 15-micron pixels.
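The unequal-intensity effect can be reproduced in one dimension with a few lines of numpy- a sketch, not the actual slide geometry: a square wave of 4-micron bars integrated over 15-micron pixels. Because the bar period is incommensurate with the pixel pitch, nominally identical bars land differently on the pixel grid and record different intensities:

```python
import numpy as np

# 4-um bars / 4-um gaps (8-um period), integrated over 15-um pixels.
fine = 0.01                                      # fine grid step, um
x = np.arange(0, 600, fine)                      # 600 um of test pattern
pattern = (np.floor(x / 4.0) % 2).astype(float)  # the bar pattern

pixel = 15.0                                     # um
samples_per_pixel = int(round(pixel / fine))
n_pixels = len(x) // samples_per_pixel
sensor = pattern[: n_pixels * samples_per_pixel]
sensor = sensor.reshape(n_pixels, samples_per_pixel).mean(axis=1)

# The bars are unresolved (everything near 0.5), and the residual
# pixel-to-pixel variation is the aliasing: identical bars, different values.
print(f"pixel values range from {sensor.min():.2f} to {sensor.max():.2f}")
```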

Even so, both of these images could be analyzed in terms of a contrast transfer function, because both come from a regular array of sensor elements and can be treated with some version of linear shift-invariance:

http://spie.org/x648.html?product_id=853462

Now to the Exmor: here's the JPG (top) and the RAW (bottom):

http://img256.imageshack.us/img256/7332/dsc14331.png

http://img820.imageshack.us/img820/606/dsc14332.png

Again, the JPG was produced in-camera, and I converted it to a TIFF in order to control the imageshack step. The main artifact is *color*. Also note there is essentially no difference between the JPG and the RAW- in fact, I don't think I could tell which image was which if I hadn't carefully kept track of the two.

This is not unknown- color artifacts are often seen at sharp edges with digital SLRs and the like. The origin is simple to understand: there is a sub-pixel, high-contrast feature, so the Bayer filter cannot present the image reconstruction algorithm with complete information, and spurious color is generated by the demosaicing software.
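A minimal sketch of how this happens- illustrative, not the camera's actual algorithm: a grayscale step edge sampled through an RGGB Bayer mosaic and reconstructed with plain bilinear interpolation. The reconstructed channels disagree at the edge, producing color where the scene has none:

```python
import numpy as np

def conv2(img, k):
    """3x3 correlation with zero padding (the kernels here are symmetric)."""
    p = np.pad(img, 1)
    out = np.zeros_like(img)
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

img = np.zeros((8, 8)); img[:, 4:] = 1.0      # grayscale vertical edge

yy, xx = np.mgrid[0:8, 0:8]                   # RGGB layout masks
r_mask = (yy % 2 == 0) & (xx % 2 == 0)
g_mask = (yy % 2) != (xx % 2)
b_mask = (yy % 2 == 1) & (xx % 2 == 1)

# Standard bilinear demosaicing kernels
k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0

R = conv2(img * r_mask, k_rb)
G = conv2(img * g_mask, k_g)
B = conv2(img * b_mask, k_rb)

fringe = np.max(np.abs(R - B)[2:6, 3:5])      # channel disagreement at the edge
flat   = np.max(np.abs(R - B)[2:6, 6])        # zero in a uniform region
print(f"color error at edge: {fringe:.2f}, in flat region: {flat:.2f}")
```

The gray scene comes back with a half-amplitude red/blue disagreement along the edge and no color error in the flat regions- exactly the kind of edge fringing seen in the JPG and RAW above.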

What is interesting is that the Exmor images appear close to the Flea image in terms of spatial resolution. Going by the manufacturer's specifications, each sensor element of the Exmor is approximately 6 microns on a side (each Bayer unit is 12 microns on a side). This gives me confidence in how Sony decodes the CMOS sensor information and repackages it as a pixel array: I could (probably) treat greyscale images from the Exmor as equivalent to those from a monochrome CCD array. Other cameras will probably behave differently- that could be part of the lab...
 
  • #2
From the it's-not-a-bug-it's-a-feature department: I tried the same imaging method, but instead of broadband light (100 W halogen) I used narrowband light to try to isolate the different Bayer color channels. I illuminated with 380 +/- 5 nm for blue, 501 +/- 8 nm for green, and a long-pass filter cutting on at 645 nm for red. The red and blue images are thus reconstructed using only 25% of the sensor elements each, and the green channel using 50%. Here are the images, cropped and scaled as before:

http://img24.imageshack.us/img24/2158/dsc14602.png

http://img267.imageshack.us/img267/5002/dsc14592.png

http://img545.imageshack.us/img545/900/dsc14612.png

The green light excited both the green and blue sensors, so that image should be ignored. What is surprising is the lack of color artifacts in the blue and red images. There may be a hint of red in the blue image, but that could be filter leakage at the sensor: 380 nm is far blue, and the red filter elements may not block all of that light. Clearly, the image reconstruction firmware/software is fairly sophisticated.

Note that the level of resolvable detail for red and blue is about half that for green: the 2-micron bars can be resolved in the green image but not in the blue or red.
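The factor of two is consistent with the geometry of the mosaic- a quick check, assuming the ~6-micron sensor elements and the 10X objective quoted above. Red and blue sensels sit on every other pixel in each direction, so their sampling pitch (and hence the finest resolvable detail) is twice that of the full array; green sensels also sit on every other pixel but on a denser quincunx:

```python
# Object-referred sampling pitch per Bayer channel (assumed numbers:
# 6 um sensor pixels, 10x magnification).
pixel_um = 6.0
magnification = 10.0
mono_pitch_object = pixel_um / magnification       # 0.6 um at the object
red_blue_pitch_object = 2 * mono_pitch_object      # 1.2 um: 2x coarser
print(f"object-referred pitch: full array {mono_pitch_object} um, "
      f"red/blue channel {red_blue_pitch_object} um")
```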
 
  • #3
Here's another one: I was taking pictures of lens flare/glare for the photography class as part of the discussion on aberrations-

http://img265.imageshack.us/img265/7051/flare1.jpg

And noticed something odd in the lower corner- here it is:

http://img5.imageshack.us/img5/280/flare2.jpg

I think that is light reflected off the chip and re-imaged- at least, that's the only thing I can think of with that regular a pattern. In IR imaging systems this is called the "narcissus effect": the camera images itself.
 

1. What is a Bayer filter?

A Bayer filter is a color filter array that is placed on top of an image sensor in a digital camera. It allows the sensor to capture color information by filtering red, green, and blue light separately.

2. How does a Bayer filter affect image quality?

A Bayer filter can affect image quality in several ways: it can cause color fringing and other color artifacts, reduce the overall sharpness of the image, and decrease the amount of light that reaches the sensor.

3. What are the different types of Bayer filter patterns?

There are four main types of Bayer filter patterns: RGGB, GRBG, GBRG, and BGGR. These patterns vary in the placement of red, green, and blue pixels on the sensor, and can affect the color accuracy and sharpness of the resulting image.
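The four orderings named above are just the four ways of reading the same repeating 2x2 tile; written out (reading order: top-left, top-right, bottom-left, bottom-right):

```python
# The four Bayer CFA orderings as their repeating 2x2 tiles. Each tile
# has two greens, one red, and one blue; only the starting corner differs.
patterns = {
    "RGGB": [["R", "G"], ["G", "B"]],
    "GRBG": [["G", "R"], ["B", "G"]],
    "GBRG": [["G", "B"], ["R", "G"]],
    "BGGR": [["B", "G"], ["G", "R"]],
}
for name, tile in patterns.items():
    flat = tile[0] + tile[1]
    assert flat.count("G") == 2 and flat.count("R") == 1 and flat.count("B") == 1
    print(name, flat)
```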

4. Can Bayer filter effects be corrected in post-processing?

To a degree. Demosaicing- the process of reconstructing a full-color image from the raw mosaic data- is where the color artifacts arise, and more sophisticated demosaicing algorithms can substantially reduce them and improve overall image quality.

5. Are there any alternatives to using a Bayer filter?

Yes. There are alternative color filter arrays, such as Fujifilm's X-Trans pattern, and sensor designs that do without a color filter array entirely, such as the Foveon stacked sensor. These alternatives aim to improve color accuracy and reduce the artifacts associated with Bayer filters.
