This should be a simple question, yet I am struggling to see the benefit of one approach over the others. I was building an LED and LDR based sensor head for a line-following robot, and I needed to choose a way of connecting the 3 LEDs inside it. I figured the 3 options are:

1) All in series, with one resistor
2) All in parallel, with the parallel group in series with one resistor
3) All in parallel, each with its own resistor

Four important points:

- The LEDs are rated at 3 V, 20 mA, 60 mW. This is what I am aiming for.
- The LEDs should all be as bright as possible, and more importantly they all need to be the same brightness (so that the sensor head is accurate).
- The 5 V supply is fixed and cannot be changed; however, the resistors can be any value at all.
- I'm guessing each LED is effectively 150 Ω, since 3 V / 20 mA = 150 Ω, but the resistances may vary slightly from LED to LED due to manufacturing tolerances.

The voltage and current through the resistors can be controlled equally well in each case, because the resistor value in each case can be chosen freely. I was told that option 3 is the best option and option 1 is completely useless, but I don't see why. Is it something to do with the fact that a diode does not have a linear I-V characteristic, i.e. the relationship between current and voltage is not linear as it is with a resistor? I have always assumed that this doesn't matter for an LED, because the main reason there is a voltage across it is that energy has been converted to light (much as a resistor converts energy to heat). This is unlike an ordinary diode, whose "resistance" changes as you change the voltage across it. Please help! Thanks!
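
For reference, here is a quick sketch of the resistor arithmetic I have been doing for each option. It assumes each LED simply drops a fixed 3 V while passing 20 mA, which may well be exactly the assumption that is wrong:

```python
# Quick sketch of the resistor sums, assuming each LED simply drops a
# fixed 3 V at 20 mA (this may be exactly the assumption that is wrong).

V_SUPPLY = 5.0   # V, fixed supply
V_LED = 3.0      # V, rated LED forward voltage (treated as constant here)
I_LED = 0.020    # A, rated LED current

# My "150 ohm" guess for the LED itself, treating it like a plain resistor
print(f"LED treated as a resistor:  {V_LED / I_LED:.0f} ohm")

# Option 3: each LED gets its own resistor, each carrying 20 mA
print(f"Option 3, resistor per LED: {(V_SUPPLY - V_LED) / I_LED:.0f} ohm")

# Option 2: one shared resistor carrying all three LED currents (60 mA)
print(f"Option 2, shared resistor:  {(V_SUPPLY - V_LED) / (3 * I_LED):.1f} ohm")
```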