|Oct31-11, 11:21 AM||#1|
Bayer filter effects
I had mentioned that I dropped the idea of posting here a lens-testing lab I am developing for class, because the technical details got out of control, but I never posted any images showing why I came to that conclusion.
Here are four images of the central region of a Richardson test slide, taken with three cameras: 1) a Point Grey Research 'Flea', a monochrome camera with pixels 4.6 microns on a side; 2) a QImaging Rolera-MGi Plus, a monochrome EMCCD camera with pixels 15 microns on a side; and 3) a Sony Exmor chip, which has a Bayer filter and contributes two images (RAW and JPG). All in-camera processing (sharpening, contrast, etc.) was turned off.
The test slide is a black-and-white transmission object (chrome on glass) and was illuminated at full condenser aperture. The pattern was carefully aligned to the pixels in order to emphasize the sampling artifacts. The Flea outputs 8-bit BMPs, the QImaging camera outputs 16-bit TIFFs, and the Sony RAW was handled via GIMP 2; no image was adjusted other than cropping.
To review, the Bayer filter is a color filter array printed directly on the chip to enable color imaging using a single sensor:
I was trying to generate artifacts due to the pixels. This is most easily done with a low-magnification, high-numerical-aperture lens, so I used my 10X NA 0.3 objective. The radius of the Airy disk at the image plane is about 11 microns (0.61λ/NA, scaled by the 10X magnification, for green light). This means the QImaging camera is not appropriate: its 15-micron pixels are larger than the PSF, so there will be errors due to undersampling the image.
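The sampling argument above can be sketched as a back-of-envelope calculation. The wavelength (~550 nm, green) is my assumption, not stated in the post; the NA, magnification, and pixel sizes are from the setup described:

```python
# Sketch: sampling check for the 10X / NA 0.3 setup described above.
# Assumption: illumination wavelength ~550 nm (not specified in the post).

def airy_radius_um(wavelength_um, na, magnification):
    """Airy-disk radius (0.61*lambda/NA) at the image plane, in microns."""
    return 0.61 * wavelength_um / na * magnification

r = airy_radius_um(0.55, 0.3, 10)   # ~11 um at the sensor
nyquist_pixel = r / 2               # want >= 2 pixels across the Airy radius

print(f"Airy radius at sensor: {r:.1f} um")
print(f"Largest pixel for Nyquist sampling: {nyquist_pixel:.1f} um")
# The Flea's 4.6 um pixels are below this limit; the Rolera's 15 um
# pixels are well above it, i.e. the image is undersampled.
```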
I enlarged each image 8x (no interpolation) to make the individual pixels easier to see, and uploaded the images to ImageShack as TIFFs, because I recently realized that ImageShack recompresses JPGs on its own. Hopefully the images will be displayed here as I see them on my own display...
Ok- here's the Flea camera:
(note- this looks correct, so I think all the images will come across as intended)
And here's the QImaging:
The undersampling artifacts are clear: not only are the test bars unresolved, but the bars in the top row and right-most column are not of equal intensity. Those bars are 4 microns across, incommensurate with the 15-micron pixel.
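The unequal-intensity effect can be reproduced numerically: box-average 4-micron bars onto 15-micron pixels and the pixel values wander with phase, because the pixel pitch is incommensurate with the bar period. This is a toy sketch, not a model of the actual camera:

```python
# Sketch of incommensurate sampling: 4 um bars box-averaged by 15 um
# pixels give unequal pixel values depending on where each pixel falls.
import numpy as np

def sample_bars(bar_um=4.0, pixel_um=15.0, n_pixels=8, sub=100):
    """Box-average an alternating bar pattern onto coarse pixels."""
    dx = pixel_um / sub                          # fine-grid step
    x = np.arange(n_pixels * sub) * dx
    pattern = ((x // bar_um) % 2).astype(float)  # alternating 4-um bars
    return pattern.reshape(n_pixels, sub).mean(axis=1)

vals = sample_bars()
print(np.round(vals, 3))
# The pixel values vary even though the underlying bars are identical --
# the same aliasing seen as unequal-intensity bars in the Rolera image.
```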
Even so, both of these images could be analyzed in terms of a contrast transfer function, because both come from a regular array of identical sensor elements and can be treated with some version of linear shift-invariance:
Now to the Exmor: here's the JPG (top) and the RAW (bottom):
Again, the JPG was produced in-camera, and I converted it to a TIFF in order to control the ImageShack step. The main artifact is *color*. Also note that there is essentially no difference between the JPG and the RAW; in fact, I don't think I could tell which image was which if I hadn't carefully kept track of the two files.
This is not unknown: color artifacts are often seen at sharp edges with digital SLRs and the like. The origin is simple to understand. There is a high-contrast feature at the sub-pixel scale, the Bayer filter cannot present the image-reconstruction algorithm with complete color information at every sensel, and so spurious color is generated by the software.
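The mechanism can be demonstrated with a deliberately crude demosaic: sample a grayscale step edge through an RGGB mosaic and reconstruct each channel by nearest-neighbor fill. The nearest-neighbor demosaic is my stand-in for illustration only; real camera firmware is far more sophisticated:

```python
# Sketch: a grayscale edge sampled through an RGGB Bayer mosaic and
# reconstructed naively produces false color at the edge.
import numpy as np

def demosaic_nearest(img):
    """Nearest-neighbor demosaic of an RGGB-sampled grayscale scene."""
    h, w = img.shape
    out = np.zeros((h, w, 3))
    r = img[0::2, 0::2]                            # red sensels
    b = img[1::2, 1::2]                            # blue sensels
    g = (img[0::2, 1::2] + img[1::2, 0::2]) / 2    # average the two greens
    out[..., 0] = np.repeat(np.repeat(r, 2, 0), 2, 1)
    out[..., 1] = np.repeat(np.repeat(g, 2, 0), 2, 1)
    out[..., 2] = np.repeat(np.repeat(b, 2, 0), 2, 1)
    return out

scene = np.zeros((4, 8))
scene[:, 5:] = 1.0                   # gray edge at an odd column boundary
rgb = demosaic_nearest(scene)
chroma = rgb.max(axis=2) - rgb.min(axis=2)
print(chroma.max())                  # > 0: false color appears at the edge
```

Away from the edge the three channels agree and the output stays gray; only at the edge, where the channels sample different sides of the transition, does color appear.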
What is interesting is that the Exmor images come close to the Flea image in terms of spatial resolution. Going by the manufacturer's specifications, each sensor element of the Exmor is approximately 6 microns on a side (each Bayer unit is 12 microns on a side); this gives me some confidence in how Sony decodes the CMOS sensor information and re-packages it as a pixel array. It means I could (probably) treat greyscale images from the Exmor as equivalent to those from a monochrome CCD array, but other cameras will probably behave differently- that could be part of the lab...
|Nov1-11, 08:05 AM||#2|
From the it's-not-a-bug-it's-a-feature department: I tried the same imaging method, but instead of broadband light (100 W halogen) I used narrowband light to try to isolate the different Bayer color channels. I illuminated with 380 +/- 5 nm for the blue, 501 +/- 8 nm for the green, and a long-pass filter that cuts on at 645 nm for the red. This means the red and blue images are reconstructed using only 25% of the sensels, and the green channel using 50%. Here are the images, cropped and scaled as before:
The green light excited both the green and blue sensors, so that image should be ignored. What is surprising is the lack of color artifacts in the blue and red images. There may be a hint of red in the blue image, but that could be filter leakage at the sensor: 380 nm is far-blue light, and the red filter elements may not block all of it. Clearly, the image-reconstruction firmware/software is fairly sophisticated.
Note that the level of resolvable detail in the red and blue is about half that of the green: the 2-micron bars can be resolved in the green but not in the blue or red.
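The 25%/50% sensel counts and the factor-of-two resolution loss both fall out of the RGGB geometry. A quick count, assuming a standard RGGB layout and the ~6-micron sensel pitch quoted earlier (the exact filter layout of this Exmor is my assumption):

```python
# Sketch: per-channel sensel fractions in one RGGB Bayer unit, and the
# effective 1-D sampling pitch for vertical-bar targets. Assumes a
# standard RGGB layout and a 6 um sensel pitch (12 um Bayer unit).
import numpy as np

unit = np.array([["R", "G"],
                 ["G", "B"]])   # one RGGB Bayer unit

pitch_um = 6.0                  # sensel pitch, from the specs quoted above
for ch in "RGB":
    frac = (unit == ch).mean()          # fraction of sensels in this channel
    # Projected onto one axis: green falls in every column (6 um pitch),
    # red and blue in every other column (12 um pitch).
    bar_pitch = pitch_um / (2 * frac)
    print(f"{ch}: {frac:.0%} of sensels, 1-D sampling pitch {bar_pitch:g} um")
# Red/blue sample bar patterns at twice the green pitch -- consistent
# with the factor-of-two resolution loss seen in the narrowband images.
```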
|Nov8-11, 08:34 AM||#3|
Here's another one: I was taking pictures of lens flare/glare for the photography class as part of the discussion on aberrations-
And noticed something odd in the lower corner- here it is:
I think that is light reflected off the chip and re-imaged. At least, that's the only thing I can think of that has that regular pattern. In IR imaging systems, this is called the "narcissus" effect- the camera images itself.