chris arnold


From the book:

"As charge moves through the external circuit, it encounters a loss of 1.5 volts of electric potential. This loss in electric potential is referred to as a voltage drop. It occurs as the electrical energy of the charge is transformed to other forms of energy (thermal, light, mechanical, etc.) within the resistors or loads. If an electric circuit powered by a 1.5-volt cell is equipped with more than one resistor, then the cumulative loss of electric potential is 1.5 volts. There is a voltage drop for each resistor, but the sum of these voltage drops is 1.5 volts - the same as the voltage rating of the power supply"

I'm trying to figure out how to regulate power. Say you have a 9 V DC, 200 mA supply, and you are trying to run that through a circuit of resistor(s) to achieve a power supply of 9 V DC @ 20 mA. I see the simple solution of adding resistance; I would need a 1.8-watt, 1.8-ohm resistor (is that right?). What I can't figure out is the voltage drop, which in this case would be 4.5 volts after the current has passed the resistor, I think.
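To double-check my arithmetic, I ran the numbers through a quick Python snippet (assuming the resistor has to drop the full 9 V while passing 20 mA), and I get different values than above, which is part of my confusion:

```python
V = 9.0       # supply voltage in volts
I = 0.020     # target current in amps (20 mA)

R = V / I     # Ohm's law: resistance that passes 20 mA when dropping 9 V
P = V * I     # power that resistor would dissipate, in watts

print(R)      # ohms
print(P)      # watts
```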

So, am I even right in that calculation, or is it 0 V?

Say I'm right about the 4.5 V; that means I have a power source of 4.5 V DC @ 20 mA when I need 9 V DC. What would I do to solve this problem?

I imagine the volts are important: if you have something calling for 9 V @ 20 mA, you wouldn't want to give it half the voltage. The true amount of power is ohms, correct? Ohms is the product of current in amps and voltage, so half the voltage at the same current is half the ohms/power. Again, I ask if I'm right (I just started learning about Ohm's law last week).
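Writing out my reasoning as code (I'm using watts for the power result here; part of my question is whether I'm mixing up ohms and watts):

```python
V_full = 9.0    # volts the device calls for
V_half = 4.5    # volts left after the drop I calculated above
I = 0.020       # amps (20 mA) in both cases

P_full = V_full * I   # power: P = V * I
P_half = V_half * I

print(P_full)   # power at full voltage
print(P_half)   # half the voltage at the same current -> half the power
```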