
## Main Question or Discussion Point

I have a source with an output impedance of 800 kilohms, and I want to power an LED from it. I presume the LED would have a resistance of roughly 9.5 to 10 ohms, if I'm correct. When I connect the LED to the source, the voltage drops from 12 V DC to around 1 V DC. If my thinking is correct, this is happening because of an impedance mismatch: a high-impedance source is driving a low-impedance load, so the voltage drops to a much lower value.
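Here is the voltage-divider arithmetic I'm basing this on, treating the LED as a plain 10 ohm resistor (an assumption on my part; a real LED is non-linear and sits near its forward voltage):

```python
# Ideal series voltage divider: 12 V source, 800 kOhm source impedance,
# LED idealized as a 10 Ohm resistive load (an assumption).

V_source = 12.0    # open-circuit source voltage, volts
R_source = 800e3   # source (internal) resistance, ohms
R_led = 10.0       # assumed LED "resistance", ohms

# Voltage appearing across the load in the divider
V_load = V_source * R_led / (R_source + R_led)
print(f"{V_load * 1e3:.3f} mV")  # far below 1 V for these values
```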

To correct this, I want to put a resistive L-pad between the source and the load, hoping to minimize the voltage drop. The values I calculated are an 800 kilohm resistor in series with the source and a 10 ohm resistor in parallel with the load.

Is this correct? And if not, where am I wrong?
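For comparison, this is how I would compute the load voltage with the proposed L-pad in place, again idealizing the LED as a 10 ohm resistor (my assumption):

```python
# Proposed L-pad: 800 kOhm series arm after the source, 10 Ohm shunt arm
# across the load. LED again idealized as a 10 Ohm resistor (assumption).

V_source = 12.0    # open-circuit source voltage, volts
R_source = 800e3   # source impedance, ohms
R_series = 800e3   # series arm of the L-pad, ohms
R_shunt = 10.0     # shunt arm of the L-pad, ohms
R_load = 10.0      # assumed LED resistance, ohms

def parallel(a, b):
    """Equivalent resistance of two resistors in parallel."""
    return a * b / (a + b)

# Shunt arm in parallel with the load, then a series divider
R_out = parallel(R_shunt, R_load)
V_load = V_source * R_out / (R_source + R_series + R_out)
print(f"{V_load * 1e6:.2f} uV")  # load voltage with the pad in place
```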
