
Formula contradiction?

  1. Jan 24, 2004 #1
    This has been racking my brain for the last week. It's about current. I keep trying to think of a circuit as a water hose, with voltage being the pressure, current being the flow, resistance being a kink in the hose, etc...

    Ohm's Law
    Current = Voltage / Resistance

    This is common sense...
    The higher the voltage, the higher the current
    Extremely low voltage ---> extremely small current

    that i can live with...

    Now, the WATTS Law
    Watts = Voltage * Current
    Current = Watts / Voltage
    In this case, current is inversely proportional to the voltage.

    WHY is it that when a power load is added to a circuit, a higher voltage will cause LESS current to be drawn?? To me that doesn't make sense when I try to picture it in my mind. I took this to the extreme..

    Current = 100W / 0.01V
    Current = 10,000 AMPS

    Wow, 10,000 amps is quite a bit of current considering I hardly have any voltage.

    We could take this more and more extreme: hooking up a small 2 V cell to a 100,000 W power load. There's 50,000 AMPS... from almost nothing.

    I thought about this for a while, and my only reasoning is that AMPS isn't necessarily the amount of current, but the SPEED of it. If I remember correctly, amperage is the amount of electrons to pass a point in a certain amount of time. Is this right? Even so, you're telling me a 10000000000000000000000000000000000 WATT power load will suck electrons from a .00000000000000000000000000001 volt source at 999999999999 times the speed of light? I just don't see that happening. LOL.

    Even Ohm's Law seems a little weird. Once you have 0.000001 ohms of resistance, your amperage is in the millions.

    There has to be a simple explanation to these outrageous numbers!
  3. Jan 24, 2004 #2

    I think a more useful definition of Power is in order.
    Power = Energy/time.

    Power is simply the rate at which energy is being converted.

    The relationship Current = Power/Voltage is correct, but it is just a relationship, not a definition. You said that current is inversely proportional to the voltage. This is not what the formula is saying. What it says is that when voltage decreases, then in order to maintain a constant power, the current would have to increase linearly. That would necessitate a decrease in resistance. If you just decrease the voltage without changing any other factor, both current and power will decrease.
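    This distinction can be checked numerically. The sketch below (Python, with hypothetical values: a 10 ohm load and a 100 W target) shows that at a fixed resistance, lowering the voltage lowers both the current and the power; current only rises as voltage falls when the power is held constant, which forces the resistance down:

```python
def at_fixed_resistance(V, R):
    """Ohm's law with R held fixed: current and power both fall as V falls."""
    I = V / R
    P = V * I  # same as V**2 / R
    return I, P

def at_fixed_power(V, P):
    """Power held fixed: current rises as V falls, so resistance must shrink."""
    I = P / V
    R = V / I  # the resistance implied by this operating point
    return I, R

for V in (10.0, 5.0, 1.0):
    I, P = at_fixed_resistance(V, R=10.0)
    print(f"R fixed at 10 ohm: V={V:4.1f} V -> I={I:5.2f} A, P={P:6.2f} W")
    I, R = at_fixed_power(V, P=100.0)
    print(f"P fixed at 100 W:  V={V:4.1f} V -> I={I:6.1f} A, R={R:5.2f} ohm")
```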

    Here are other relationships involving power: P=I^2R and P= V^2/R.

    Power equals current^2 times resistance.

    Power equals voltage^2 divided by resistance.

    In your original example, you asked 'WHY is it that when a power load is added to a circuit, a higher voltage will cause LESS current to be drawn?' then gave the equation: Current = 100W / 0.01V

    When you add something to a circuit which increases the rate at which energy is converted (increased power), the current is increased.

    Let's say we have a circuit which is dissipating 100 watts of power and the voltage is 10 volts. We know that the current being drawn is 10 amps. If you add a component to the circuit which increases the power dissipation to 200 watts, the current will double to 20 amps.

    On the other hand, if we change the voltage to 20 volts on our original circuit which was dissipating 100 watts, the power will increase by a factor of four. See the above formula P=V^2/R. By increasing the voltage, we haven't changed the resistance of our load. By Ohm's law, if you double the voltage, given the same resistance, the current will double. Since both current and voltage have doubled, power has increased by a factor of 4. P=VI.
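    The arithmetic in that paragraph can be worked through step by step (Python, using the same 100 W at 10 V example):

```python
# Original operating point: 100 W dissipated at 10 V.
P1, V1 = 100.0, 10.0
I1 = P1 / V1        # 10 A drawn
R = V1 / I1         # 1 ohm; the load's resistance, which we don't change

# Double the voltage across the same resistance.
V2 = 2 * V1
I2 = V2 / R         # Ohm's law: current doubles to 20 A
P2 = V2 * I2        # P = VI: power quadruples to 400 W

print(f"I: {I1:.0f} A -> {I2:.0f} A,  P: {P1:.0f} W -> {P2:.0f} W")
```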

    There is a simple explanation: your numbers are very unrealistic and outrageous. Don't take that as an insult. I understand you are just trying to gain an understanding of these things. A 2 volt cell would never be able to supply the demands required by a 100,000 watt load. There just isn't enough energy in such a cell to meet that power demand for any useful length of time. At 2 volts, the resistance of such a load would be 40 micro-ohms. That's a very small resistance for any practical device. A one meter length of 12 gauge copper wire has approximately 100 times that resistance.

    When appliances are rated at a certain power, it is understood that this is for a particular voltage. Say a toaster is rated at 900 watts @ 110 V. The resistance of the toaster would be 13.44 ohms (R=E^2/P). If you hooked a 2 V source to it, the power dissipated is no longer 900 watts. Since P=E^2/R, the power dissipated by the same toaster @ 2 V is only 0.3 watts.
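    The toaster numbers can be reproduced directly (Python; the 900 W and 110 V figures come from the example above):

```python
V_rated, P_rated = 110.0, 900.0

# R = E^2/P: the resistance implied by the nameplate rating.
R = V_rated**2 / P_rated     # about 13.44 ohms

# P = E^2/R: the same element across a 2 V source dissipates far less.
P_at_2V = 2.0**2 / R         # about 0.30 W

print(f"R = {R:.2f} ohm, power at 2 V = {P_at_2V:.2f} W")
```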

    You wrote: If I remember correctly, amperage is the amount of electrons (charge, positive or negative) to pass a point in a certain amount of time.

    That is essentially correct. This has nothing to do with speed. 1 amp is 1 coulomb of charge passing a given point in one second. If you have 2 amps, you have double the charge passing a given point in one second. Nothing is going faster.
    Last edited: Jan 29, 2004
  4. Jan 24, 2004 #3
    Here is a mathematical explanation of the other relationships involving power:


    [tex] V =I R [/tex]

    [tex] V=\frac{P}{I}[/tex]


    [tex] I R = \frac{P}{I}[/tex]

    arrange to solve for P...

    [tex]P = I^2 R[/tex]


    [tex] I = \frac{V}{R}[/tex]


    [tex] I = \frac{P}{V}[/tex]


    [tex] \frac{V}{R} = \frac{P}{V}[/tex]

    arrange to solve for P...

    [tex]P = \frac{V^2}{R}[/tex]
    Last edited: Jan 25, 2004
  5. Jan 30, 2004 #4
    Let's make this simple. Voltage is joules of energy per coulomb of charge, and amperes are coulombs per second. So if I have a lot of amps with little voltage, I get a lot of electrons moving, but each carries very little energy. On the other hand, if I have high voltage, then I get more bang for my buck, and I only need a few coulombs per second (a few amps) for the same amount of power. I believe where you're getting confused is this: as voltage over a constant resistance increases, so does the current, and along with it the power. The only time you increase voltage and decrease current is if you're trying to keep the same power.
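    That trade-off can be tabulated. In this sketch (Python, with an arbitrary 100 W load for illustration), every row delivers the same power, but the energy per coulomb (volts) and the coulombs per second (amps) shift in opposite directions:

```python
P = 100.0  # watts, i.e. joules delivered per second (arbitrary example load)

for V in (1.0, 10.0, 100.0):
    I = P / V  # coulombs per second needed at this voltage
    # Each coulomb carries V joules, and I coulombs arrive per second,
    # so the energy delivered per second is V * I joules.
    print(f"{V:6.1f} J/C  x  {I:6.1f} C/s  =  {V * I:.0f} J/s")
```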
  6. Feb 7, 2004 #5
    It is easier for me to understand the relationships in even simpler terms:
    Try wrapping your mind around the fact that a rating in watts is nothing more than a measure of electrical energy required to operate a device properly according to its design (in the case of a light bulb or heater), and/or a measure of electrical energy required by a given circuit, and/or a maximum sustainable rating limit in the case of a design rating for a resistor (e.g. 1000 ohms, ¼ watt: do the math and see what current vs. voltage it can sustain without burning up, within practical boundaries).

    Consider a 50 watt vs. a 100 watt light bulb:
    We know the voltage to be constant (i.e. 120 V), therefore the only variables are current and resistance. From this we can see that if twice the light is required (think amount of work), then a double-wattage light bulb is also required. You do the rest of the math.

    Try relating electrical theory to what you already know and understand. Your kinked hose was an excellent example of this!!

    Relate watts to making two trips of equal distance with a pail of water, one trip with a full pail, one with a half-full pail. The full-pail trip will require more energy.

    ******* think of watts as a measure of electrical work required or performed, for simplicity and ease of understanding.*******

    Keep your calculations within the limits of the voltage, current, and resistance that you are studying in class. Don't confuse yourself with numbers that seem impossible (because they probably are). The reasons for this will come sooner than you think. Be patient!!!!
    "Necessity is the mother of invention," and no one needs a 2 V, 100,000 watt DC power supply, but if they did, it would be the size of your home town. However, don't feel bad, because there is no such thing as a stupid question; in fact, I am sure I asked this very one, probably at the same stage of training.
    Hope this helps.
    Last edited: Feb 7, 2004
  7. Feb 10, 2004 #6
    Actually, don't think of watts as a measure of electrical work. It is not a measure of work or energy. It is a measure of power. Power is the rate at which work is done (energy converted). One unit of measure for energy is the Joule. One watt is equivalent to one Joule/second. A 100 watt light bulb converts electrical energy at the rate of 100 Joules/Second. Over a period of 60 seconds, the work performed is 100 watts * 60 seconds or 6000 Joules. Energy = power * time.
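    As a quick check of the light-bulb arithmetic (Python; the 100 W and 60 s figures are from the post above):

```python
P = 100.0   # watts, i.e. joules converted per second
t = 60.0    # seconds the bulb runs
E = P * t   # energy = power * time, in joules

print(f"{E:.0f} joules converted over {t:.0f} seconds")
```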