Undergrad Determine emission spectrum of an LED

Summary:
The discussion revolves around determining the emission spectrum of 660nm LEDs, which appear orange rather than deep red. Participants suggest checking the LED datasheet, using a prism, or employing a digital camera to analyze the spectrum by capturing RGB values. Some experiments with a CD as a diffraction grating yield inconclusive results, leading to discussions about the limitations of inexpensive cameras and the potential effects of image compression on data accuracy. Overall, the conversation highlights the challenges of accurately measuring LED spectra and the need for better equipment or methods for reliable results.
  • #31
OmCheeto said:
I think my Bayer filter is broken. :oldgrumpy:

Daz said:
I too have seen this occur. It happens with CCD detectors where each pixel is essentially a potential well with a finite density of states. If the incident light intensity is high, the individual colour pixels saturate and charge starts spilling over into adjacent pixels. It’s called white-out and reducing the intensity should resolve it.

Yeah, I was wondering about blooming. The way it appears and is controlled in CMOS sensors is different than in CCDs: on CCDs, blooming leads to vertical or horizontal streaking, which I never see with a CMOS sensor.

The more I think about Bayer filters, the more confused I get. First, they are thin-film (reflective) filters as opposed to absorptive filters, because absorptive filters would degrade over time, leading to inconsistent and nonuniform bandpass changes. On the other hand, thin-film filters only work over a restricted range of incident angles, so high-angle rays associated with either fast lenses or wide-angle lenses would not be correctly filtered. I have never noticed such a thing, nor have I heard of anyone else noticing it.

But they have to be reflective- here's an image of light reflecting off the filter, then reflected again by the lens and captured by the sensor:

[Image: flare2_zpshqbkimtb.jpg]


This is clearly a reflection, but... if this was light initially rejected by the filter, then the colors should be inverted: blue reflects yellow, green reflects magenta, and red reflects cyan. Although maybe the colors are inverted- I can't tell because I'm color blind :(

Imaging the filter directly is a challenge; this is the best one I've been able to make (so far):

[Image: _DSC4567_zpsmunl2pw9.jpg]


This image was taken using brightfield reflection microscopy, but the light must be reflecting off the underlying pixels, not the filter, because the colors are correctly rendered. I don't understand the 'half pixel' appearance... I guess mine is broken as well :)
 
  • #32
Andy Resnick said:
I don't understand the 'half pixel' appearance... I guess mine is broken as well :)
This is a pretty common Bayer pattern.

Notice that there are as many green pixels as there are red and blue pixels combined (put another way, twice as many green as red, and twice as many green as blue). That's because the human eye perceives more detail in the green part of the spectrum. In the post-processed image, luminance favors the green channel. By favoring green in pixel count, the silly human will perceive a more detailed image.
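Just to put a rough number on that, here is a minimal sketch (assuming the standard Rec. 709 luma weights; a real camera pipeline will differ) of how heavily luminance leans on the green channel:

Code:
# One 2x2 tile of the common RGGB Bayer mosaic: two green sites
# for every one red and one blue site.
bayer_tile = [["R", "G"],
              ["G", "B"]]

# Rec. 709 luma weights (an assumption -- actual in-camera processing varies):
# green carries roughly 70% of the luminance signal, which is one reason
# the mosaic samples green twice as densely as red or blue.
def luma(r, g, b):
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

print(bayer_tile)
print(luma(0.0, 1.0, 0.0))  # a pure green patch alone contributes ~0.72 of full luma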
 
  • #33
Andy Resnick said:
This image was taken using brightfield reflection microscopy, but the light must be reflecting off the underlying pixels, not the filter, because the colors are correctly rendered.

Maybe the filter material (of a given pixel) reflects and transmits the same color, thus absorbing the other colors (neither reflecting nor transmitting the other colors*)? 'Like stained glass.

*meaning the other colors are not reflected, but they don't make it to the pixel element either.
 
  • #34
Yes, I've noticed that CMOS sensors don't saturate in the same way as CCDs, where you tend to get vertical white streaks in the image as the charge spills over along the read-out line. But I guess they do something similar, nonetheless.

Andy Resnick said:
On the other hand, thin-film filters only work over a restricted range of incident angles, so high-angle rays associated with either fast lenses or wide-angle lenses would not be correctly filtered. I have never noticed such a thing, nor have I heard of anyone else noticing it.

This is one of the reasons why mega-pixel cameras require the objective lens to be image-space telecentric. Virtually all modern lenses for high-resolution imaging are designed for telecentric output. With such a lens, each pixel is illuminated more-or-less normally. Certainly, if you were to use an entocentric objective with a multi-megapixel image sensor, you do get pixel shadowing and a shift in colour balance towards the corners of the image.
 
  • #35
Tom.G said:
If the camera has "automatic white balance", turn it off!

Even my new camera has no "Off" mode for "automatic white balance".
:oldcry:
As far as I can tell anyways.
It comes with 131 pages of instructions.

I'm pretty sure my old camera had one page of instructions.

ps. It took me 3 hours of googling yesterday just to figure out how to transfer the pictures from my new camera to my laptop.
:oldmad:
Things were a lot simpler in the Canon A-1 days of old.
 
  • #36
OmCheeto said:
Even my new camera has no "Off" mode for "automatic white balance".
:oldcry:
As far as I can tell anyways.
It comes with 131 pages of instructions.
1 Press MENU/OK to display the shooting menu.
2 Press the selector up or down to highlight the White Balance (WB) menu item.
3 Press the selector right to display options for the highlighted item.
4 Press the selector up or down to highlight the desired option. I suggest changing the option from AUTO to Sunlight for experiments involved in this thread.
5 Press MENU/OK to select the highlighted option (e.g., Sunlight).
6 Press DISP/BACK to exit from the menu.

When finished with experiments, repeat the process to switch back to AUTO white balance.

For more details, please see pages 68, 69, and 73 of the FUJIFILM FinePix S8600 Series Digital Camera Owner's Manual.
 
  • #37
Daz said:
This is one of the reasons why mega-pixel cameras require the objective lens to be image-space telecentric.

I don't think that's true. In any case, my Distagon 15/2.8 does not generate color anomalies, nor did I see any with my Planar-design 85/1.4.
 
  • #38
collinsmark said:
Maybe the filter material (of a given pixel) reflects and transmits the same color, thus absorbing the other colors (neither reflecting nor transmitting the other colors*)? 'Like stained glass.

*meaning the other colors are not reflected, but they don't make it to the pixel element either.

Well, like I said, using absorptive filters would seem to be a bad idea for various reasons- the absorbed energy degrades the dye over time, leading to all kinds of problems. For example, every time you took a picture that contains the sun, those particular filters would have a whole lot of absorbed energy to dissipate; you would probably get permanent bleaching of the filters.
 
  • #39
Andy Resnick said:
Well, like I said, using absorptive filters would seem to be a bad idea for various reasons- the absorbed energy degrades the dye over time, leading to all kinds of problems. For example, every time you took a picture that contains the sun, those particular filters would have a whole lot of absorbed energy to dissipate; you would probably get permanent bleaching of the filters.
Yes, there's that. I'm guessing, though, that the manufacturers don't really expect that the camera will still be in use after a decade and a half or so. Remember cameras from a decade and a half ago? They were very low resolution and sucked batteries dry in a few hours of use. Does anybody still use a pre-2001 digital camera?

Light loss from absorptive pigments or dyes is another issue. If the filter absorbs light, then there is that much less light reaching the detector.

That said, I think most filters in color filter arrays (CFAs) are still either pigments or dyes. (Disclaimer: CFAs are not my field of expertise. I don't know a whole lot about them).
https://en.wikipedia.org/wiki/Color_filter_array

That's not the end of the story though. Here's an example from Panasonic, who seem to be working on a different approach (back from 2013, not sure how far this has progressed):
http://www.imaging-resource.com/new...-new-sensor-tech-ends-color-filter-light-loss
 
  • #40
collinsmark said:
<snip>That's not the end of the story though. Here's an example from Panasonic, who seem to be working on a different approach (back from 2013, not sure how far this has progressed):
http://www.imaging-resource.com/new...-new-sensor-tech-ends-color-filter-light-loss

Hmmmm.. it seems that color filter arrays are indeed absorptive:
http://cat.inist.fr/?aModele=afficheN&cpsidt=17387091

There does seem to be considerable innovation here- the Sony IMX189AEG chip uses a new approach as well:
http://www.sonyalpharumors.com/sr4-...sony-apcs-active-pixel-color-sampling-sensor/
 
  • #41
Tom.G said:
Thanks Andy, that's a good site. They have spectroscopes for under $10.
pfft! Andy is obviously a "science enabler"

[Image: science.order.2016.02.28.png]
I will never recover...
:oldcry:
 
  • #42
Tom.G said:
Thanks Andy, that's a good site. They have spectroscopes for under $10.

OmCheeto said:
pfft! Andy is obviously a "science enabler"
:oldcry:

Those spectroscopes are really fun to play with- I've given them as gifts to my nieces and nephews. You may be able to find them on Amazon for even less.

I accept the label 'science enabler'! :) If you feel like splurging, here's an excellent gift idea (yes, you can give gifts to yourself!):

http://www.haverhills.com/cgi-bin/store.cgi?&shop=city&L=eng&P=1062
 
  • #43
Just to wrap up a loose end- here is a decent image of the Bayer filter:

[Image: DSC_3668s_zpski9mmshv.jpg]


The image was taken at the edge of the 'active area', which is why the image brightness is non-uniform. The darker region is where the circuitry is; here's an image of the traces:

[Image: DSC_3667s_zpschbppgvp.jpg]


The traces, IIRC, are 1/2 micron across.
 
  • #44
I really thought I had it nailed. But then, I compared my data to the internets.

[Image: thought.I.was.getting.a.hang.of.this.but.then.png]


I'm guessing it's now a 2D vs 3D problem.

ps. To the OP, @Lotic7, my red LED turned out to have a very broad spectrum.
[Image: getting.the.hang.of.this.spectro.scheisse.png]
 
  • #45
OmCheeto said:
I really thought I had it nailed. But then, I compared my data to the internets.

Your images look great! Why do you think (or, what is the evidence that) there's a problem?
 
  • #46
Andy Resnick said:
Your images look great! Why do you think (or, what is the evidence that) there's a problem?
I guess it's because no matter how hard I try, I can't get the lines to match up in all three images.
Here's my second attempt:

[Image: attempt.no.2.at.lining.up.the.Hg.spectra.png]


By lining up the yellow (578-ish nm) and bluish (502.5 nm) lines, my green line (546.1 nm) is a bit off to the left.

In any event, I went ahead and digitized my Omic spectrum using @lpetrich 's most awesome tool, and found an equation: y ≈ mλD/d
where
m is the order
λ is the wavelength
D is the distance to the screen
d is the spacing in the grating

I determined that "D" was something that would be boogered, as there are lenses and stuff in cameras that would throw it off. So I decided that since "D" and "d" were both constants, I could more or less throw them out, which means that at m = 1, y is proportional to λ.
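For illustration only, here is a rough numerical sketch of that relation with assumed numbers (a CD track pitch of about 1.6 µm and a 10 cm grating-to-screen distance, not my actual geometry), just to show how y tracks λ and how rough the small-angle form is:

Code:
import math

d = 1.6e-6   # assumed grating spacing in metres (typical CD track pitch)
D = 0.10     # assumed grating-to-screen distance in metres
m = 1        # first diffraction order

for lam_nm in (502.5, 546.1, 579.0, 660.0):   # Hg lines plus the red LED
    lam = lam_nm * 1e-9
    y_approx = m * lam * D / d        # the small-angle form, y = m*lam*D/d
    theta = math.asin(m * lam / d)    # exact grating equation for the first-order angle
    y_exact = D * math.tan(theta)
    print(f"{lam_nm} nm: y = {y_approx*100:.2f} cm (exact {y_exact*100:.2f} cm)")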

Given that I had 4 reference points for the Hg lamp, I determined, from the blue and right yellow bars' positions and reference wavelengths,

Code:
X       R     G     B   color         nm reference   Omic nm derived
57     39    60    66   blue          502.5          502.5
248    38   143    74   green         546.1          549.3
361   210   229   132   left yellow   577.0          577.0
369   231   224   125   right yellow  579.0          579.0

an equation that yielded wavelength, based on the x-coordinate: λ = 0.245 * x-coord + 488.5
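In code form, that two-point calibration looks something like this (pixel x-coordinates and reference wavelengths taken from the table above):

Code:
# Two reference Hg lines from the table above: pixel x-coordinate and wavelength.
x_blue, lam_blue = 57, 502.5       # blue line
x_yellow, lam_yellow = 369, 579.0  # right yellow line

slope = (lam_yellow - lam_blue) / (x_yellow - x_blue)  # ~0.245 nm per pixel
intercept = lam_blue - slope * x_blue                  # ~488.5 nm

def wavelength(x):
    """Convert a pixel x-coordinate in the spectrum image to nanometres."""
    return slope * x + intercept

print(wavelength(248))  # green line: ~549.3 nm vs. the 546.1 nm reference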

Given that my green bar was only off by 3.2 nanometers, I decided to plot a graph for the red LED, which is what the OP originally was asking about:

[Image: red.led.intensity.vs.wavelength.png]


Not quite right, from what I've seen, for the intended purpose.

[Image: Chlorophyll.jpg]


But anyways, fun project. :smile:
 
  • #47
OmCheeto said:
I guess it's because no matter how hard I try, I can't get the lines to match up in all three images.

Your strategy (comparing relative line spacings) is good, but if you are using a lens to image the spectra and the lens has distortion (nearly all do), it will warp the line spacings. So you could actually be measuring the lens distortion- my guess from your image is that your lens has a few percent of barrel distortion.
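Purely as an illustration (hypothetical numbers, nothing measured from your lens), a simple one-dimensional radial-distortion model shows how a few percent of barrel distortion shifts positions more and more strongly toward the edge of the field:

Code:
# Toy radial-distortion model: x' = x * (1 + k1 * r^2), where r is the field
# position normalized to the half-width of the frame. k1 < 0 gives barrel
# distortion (points pulled toward the centre). All numbers are hypothetical.
def distort(x, k1=-0.03, half_width=400.0):
    r = x / half_width
    return x * (1.0 + k1 * r * r)

for x in (100, 200, 300, 400):      # example distances from the image centre, in pixels
    print(x, round(distort(x), 1))  # the displacement grows roughly as x**3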

OmCheeto said:
View attachment 97148

Not quite right, from what I've seen, for the intended purpose.

View attachment 97149

I'm not sure what you are showing here- the LED spectrum looks very reasonable (assuming the camera sensor has uniform sensitivity across that wavelength range), but I'm not sure why you are comparing the LED emission spectrum to the absorption spectrum of chlorophyll.

OmCheeto said:
But anyways, fun project. :smile:

Most definitely!
 
  • #48
Andy Resnick said:
Your strategy (comparing relative line spacings) is good, but if you are using a lens to image the spectra and the lens has distortion (nearly all do), it will warp the line spacings. So you could actually be measuring the lens distortion- my guess from your image is that your lens has a few percent of barrel distortion.
It might have more to do with the fact that I'm using a $1.25 piece of equipment and expecting grand scientific results. o0)
[Image: 2016.03.11.dont.expect.JPL.data.with.buck.25.equipment.jpg]

Reflections of a square light fixture, from opposite sides of one of the slides.
Andy Resnick said:
I'm not sure what you are showing here- the LED spectrum looks very reasonable (assuming the camera sensor has uniform sensitivity across that wavelength range), but I'm not sure why you are comparing the LED emission spectrum to the absorption spectrum of chlorophyll.
That was the inspiration for all of this experiment'en!
Lotic7 said:
Eventually I wanted to try to use the 660nm LEDs to grow some plants.
OmCheeto said:
Me too!

Not sure what the growing season is in Cleveland, but out here, it's pretty short.
And I love fresh basil.
My plan is to install a winter garden in my kitchen.
Andy Resnick said:
Most definitely!

I have no plans of stopping in the near future.
 
  • #49
OmCheeto said:
It might have more to do with the fact that I'm using a $1.25 piece of equipment and expecting grand scientific results. o0)

True- but it's also true that using a piece of equipment costing only $1.25, you were able to get grand scientific results :)

OmCheeto said:
Not sure what the growing season is in Cleveland, but out here, it's pretty short.

Depends what you're growing... :)
We have a small plot full of garlic, tomatoes, and hot peppers every year. The basics.
 
  • #50
Andy Resnick said:
True- but it's also true that using a piece of equipment costing only $1.25, you were able to get grand scientific results :)
I decided last night, while lying in bed, that I could just flip the diffraction grating around and take two photos.
I believe this would tell me whether it is barrel distortion or grating warpage that is giving anomalous readings.

But I do agree that this is really "grand".
I'm not sure if it was appropriate for me to freak out when I saw the two yellow bands emerge and measured a difference of 2 nanometers.
2 nanometers, for me, is incomprehensibly small.

Andy Resnick said:
Depends what you're growing... :)
We have a small plot full of garlic, tomatoes, and hot peppers every year. The basics.

Well, I live in a suburban forest, and any advantage I can find in growing things really helps.
I've had garlic growing like gangbusters for the last month in my "appropriate for mushrooms" garden in the back yard, and my gutter garden in the front yard was a resounding success.

My idea is to build a mini-me version of a gutter garden in my kitchen, for use during the winter months.
By eliminating the 520-630 nm wavelengths, it looks to me like you can grow plants with a lot less power.

[Image: terra.firma.vs.what.plants.want.png]

[ref 1: wiki: Sunlight at Earth's surface]
[ref 2: hyperphysics: Light Absorption for Photosynthesis]
 
  • #51
OmCheeto said:
I decided last night, while lying in bed, that I could just flip the diffraction grating around and take two photos.
I believe this would tell me whether it is barrel distortion or grating warpage that is giving anomalous readings.

Seems reasonable enough- curious to know what happens...
 
