cissey said:
In the circuit the LED's rated current is 20 milliamperes. With a voltage drop across the LED of 1.5 volts, calculate the value of the series resistance R1 needed if the supply voltage is 12 volts DC.
Nice job describing your question and showing what you've worked out.
The next small learning steps: (1) start this as a new thread (it's a different question from the LC one), and (2) notice that we had your LC question moved over to the Homework > Introductory Physics subforum. If I were asking your LED question, I would probably post it under Homework > Engineering.
Your method is correct: apply Kirchhoff's Voltage Law around the circuit loop.
$$V_s - V_{LED} - I\,R_{lim} = 0$$
$$12 - 1.5 - (0.020)\,R_{lim} = 0$$
$$R_{lim} = \frac{12 - 1.5}{0.020} = 525\ \Omega$$
Depending on the extent of your assignment, you may be able to stop there. But if you were asked to choose an actual resistor, they don't make 525 Ω resistors. To choose a practical one, you can find standard values in a table like this one: http://ece-www.colorado.edu/~mcclurel/resistorsandcaps.pdf. You don't want to exceed the current rating spec, so your limiting resistor needs to be ≥ 525 Ω. Using the table of standard values, the closest would be 560 Ω.
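If it helps, here's a quick Python sketch (mine, not part of the assignment) that does the same arithmetic and rounds up to the next E12 standard value. The function names are just placeholders:

```python
import math

def limiting_resistor(v_supply, v_led, i_led):
    """Series resistance needed to limit the LED current to i_led amperes."""
    return (v_supply - v_led) / i_led

# E12 standard resistor multipliers (one decade)
E12 = (1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2)

def next_standard_value(r, series=E12):
    """Smallest standard value >= r, so the rated current isn't exceeded."""
    decade = 10 ** math.floor(math.log10(r))
    for mult in series:
        if mult * decade >= r:
            return mult * decade
    return series[0] * decade * 10  # roll over to the next decade

r = limiting_resistor(12, 1.5, 0.020)   # 525.0 ohms
print(r, next_standard_value(r))        # 525.0 560.0
```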
For practical applications, you wouldn't want to run the LED continuously at 20 mA, as it can shorten the life of your LED, causing the junction to break down. One rule of thumb (see http://www.etcs.ipfw.edu/~linm/2005Spring/cpet190/suppl/LED_Rs/LED_Rs.html) for choosing a minimum limiting resistor value is to operate at 80% of the rated current: 20 mA × 80% = 16 mA, so R = (12 − 1.5)/0.016 ≈ 656 Ω. Again, choose a standard resistor greater than or equal to the calculated value from the table. Other practical considerations are the tolerance and wattage rating of your resistor; if you haven't covered those concepts yet, you probably will soon.
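And a similar sketch for the 80% derating plus a rough wattage check. The 680 Ω choice is my pick for the next E12 value above 656 Ω, and the numbers are just this problem's values:

```python
# Sketch (my numbers, not part of the assignment): 80% derating and a
# rough power check on the chosen standard resistor.
v_supply, v_led = 12.0, 1.5
i_derated = 0.020 * 0.80                  # 80% of the 20 mA rating -> 16 mA
r_min = (v_supply - v_led) / i_derated    # 656.25 ohms minimum
r_std = 680.0                             # next E12 value >= 656.25 (assumed pick)
i_actual = (v_supply - v_led) / r_std     # ~15.4 mA through the LED
p_resistor = i_actual ** 2 * r_std        # ~0.16 W dissipated, so a 1/4 W part is fine
print(r_min, r_std, round(i_actual, 4), round(p_resistor, 3))
```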