Why Does My 1450 nm IR LED Not Illuminate Despite Correct Specifications?

  • Thread starter: ebunnyboy
  • Tags: IR LED, work

Discussion Overview

The discussion centers on troubleshooting an infrared (IR) LED with a wavelength of 1450 nm that is not illuminating despite the poster following the specified electrical characteristics. Participants explore potential issues related to current-limiting resistance, power supply voltage, and methods for detecting IR light.

Discussion Character

  • Technical explanation
  • Debate/contested
  • Experimental/applied

Main Points Raised

  • One participant reports that the IR LED does not illuminate even after replacing it with another identical LED, while a different LED at 940 nm works under similar conditions.
  • Another participant suggests that the maximum forward current rating of 100 mA should not be exceeded, as it could damage the LED, and emphasizes the importance of calculating the correct current limiting resistance based on the power supply voltage.
  • A participant mentions using a 5V power supply and observes a voltage drop of about 0.8V across the LED when using various resistances, leading to further testing with lower resistance values.
  • One participant calculates the resistance needed for a forward current of 50 mA and questions how the original poster is verifying the LED's IR output, noting that the human eye cannot see this wavelength.
  • The original poster clarifies that they are using a digital camera and a photodiode to detect the IR light and plans to test with a 50-ohm resistor after observing a voltage drop increase with lower resistance.
  • Another participant advises against using a resistance lower than 40 ohms, citing the maximum forward current and typical forward voltage drop.

Areas of Agreement / Disagreement

Participants express differing views on the appropriate resistance values and methods for detecting the IR output. There is no consensus on the exact cause of the LED's failure to illuminate, and multiple hypotheses are presented.

Contextual Notes

Participants reference various datasheets with potentially differing specifications for the LED, which may affect the calculations and assumptions made during the discussion. The power supply voltage and the method of detecting IR light are also noted as critical factors in the troubleshooting process.

ebunnyboy
Hi everybody, I really need some help. I have an IR LED from Thorlabs with a wavelength of 1450 nm, and I followed the specs:
Forward voltage at 20 mA = 1.2 V typical, 1.5 V max
Max DC forward current = 100 mA
Power dissipation = 120 mW

I tried the following resistances: 200, 220, 190, 180, 100, and 63 ohms. The LED doesn't work. I thought it was damaged, so I replaced it with another one, but the same thing happens. Yet when I replace it with a 940 nm LED (which has almost the same specs), it works.

So what's the problem?
Here is the datasheet:
http://www.thorlabs.com/thorProduct.cfm?partNumber=LED1450E

I noticed something strange: with 180, 190, 200, and 220 ohms, the LED shows the same voltage drop of 0.85 V. I kept lowering the resistance until I reached 63 ohms, which gives 0.95 V, and the LED still doesn't show up under the camera. What should I do? Thanks.
 
Your LED has a maximum forward current rating of 100 mA. Larger forward currents will cause the junction to fail, hence no IR light. (How are you checking whether there's an IR output?)

Anyway, to calculate the correct current-limiting resistance, you need to pick the forward current you will operate it at; say 20 mA, for example. Then you need to know your power supply voltage; say it's 5 V. Note that at 20 mA = 0.02 A, the LED's forward voltage will be 1.2 V. Now use this equation to determine the resistance to use:

R = (V_supply - V_LED) / I_LED

http://en.wikipedia.org/wiki/LED_circuit#Series_resistor

For our above example, R = (5V-1.2V)/0.020A = 190Ω

Not knowing what power supply voltage you were using, I can't say for sure, but you may have exceeded the forward current rating and blown the junction. Your LED can only dissipate 120 mW before becoming damaged.
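
For reference, here is a quick sketch of that calculation in Python (illustrative, not part of the original thread; the function name and values simply mirror the numbers above):

# Series resistor for an LED: R = (V_supply - V_LED) / I_LED
def series_resistor(v_supply, v_led, i_led):
    return (v_supply - v_led) / i_led

# Numbers from the example above: 5 V supply, 1.2 V drop at 20 mA.
r = series_resistor(5.0, 1.2, 0.020)
p_led = 1.2 * 0.020  # power dissipated in the LED itself

print(f"R = {r:.0f} ohms")               # R = 190 ohms
print(f"P_LED = {p_led * 1000:.0f} mW")  # 24 mW, well under the 120 mW limit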
 
I am using a 5 V power supply (I have three of them). When testing with 190, 200, or 220 ohms, the voltage drop across the LED is about 0.8 V. I tried replacing the LED with another new one, and the same thing happens; that's what led me to test it with lower resistances.
 
ebunnyboy said:
I am using a 5 V power supply (I have three of them). When testing with 190, 200, or 220 ohms, the voltage drop across the LED is about 0.8 V. I tried replacing the LED with another new one, and the same thing happens; that's what led me to test it with lower resistances.
Okay, I downloaded the only datasheet I could find (http://www.datasheetdir.com/LED1450-03+LEDs), and for a forward current of 50 mA the forward voltage has a maximum value of 1.5 V, but typically, for any one part, it's 1.0 V. So if you apply the formula with the LED current = 0.050 A, the LED forward voltage drop = 1.0 V, and a supply voltage = 5 V,

R = (5V-1V)/0.05A = 80Ω

So my question to you is: how do you know it's NOT emitting IR? Your eye can't see this wavelength. What method are you using to see the IR?
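
As an illustrative check (again not from the thread), the same formula at the 50 mA operating point, assuming the 5 V supply:

# 50 mA operating point with a 5 V supply and 1.0 V typical drop.
v_supply, v_led, i_led = 5.0, 1.0, 0.050

r = (v_supply - v_led) / i_led  # series resistance
p_led = v_led * i_led           # LED dissipation

print(f"R = {r:.0f} ohms")               # R = 80 ohms
print(f"P_LED = {p_led * 1000:.0f} mW")  # 50 mW, still under the 120 mW limit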
 
Thanks for your reply.
I'm checking it with a digital camera, and also by introducing it to a photodiode that covers the spectral range.

The datasheet is also here:
http://www.thorlabs.com/thorProduct.cfm?partNumber=LED1450E

It's from Thorlabs.

When I tried 63 ohms, the voltage drop increased to 0.9 V, so I will go buy a 50 ohm resistor and test it.
 
I wouldn't go much lower. At the maximum forward current rating of 100 mA and with a typical forward voltage drop of 1 V, a current-limiting resistance of 40 Ω is the minimum value you should use.
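
For reference, that 40 Ω figure follows from the same formula (assuming the 5 V supply mentioned earlier):

R_min = (5 V - 1 V) / 0.100 A = 40 Ω

Anything lower would push the LED past its 100 mA maximum forward current.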
 
