
What is more important? Power or voltage?

  1. Jan 22, 2012 #1
    Dear Forum,

    Let's consider a certain load. The load has its own internal structure (resistors, capacitors, etc.).

    I would think that to give the load the most power, we need to provide the load terminals with the largest voltage. The current in the load will then be a consequence. Power is always P=VI.

    From the outside we always have control of the voltage: for instance, if we connect a certain device to a solar panel, we want the solar panel to supply the largest voltage. That way the load will receive the most power.

    In a household, two different incandescent light bulbs are connected to the same voltage, 120 V. That is fixed, but the power ratings of the bulbs are different because the resistances of the bulbs are different: the bulb with the smallest resistance will dissipate the most power.

    Moral of the story: if we want to send the most power to the load and we have control over the input voltage, then the higher the voltage, the higher the power, correct?
    Any caveats?

    High voltage would then mean high power. But there is also the maximum power transfer theorem to consider.
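    The light-bulb comparison above can be sketched numerically. The resistance values here are illustrative, not real bulb data; the point is only that at a fixed voltage, the lower resistance draws more current and therefore dissipates more power:

    ```python
    # Two bulbs on the same fixed 120 V household supply.
    V = 120.0  # supply voltage in volts

    # Assumed operating resistances, chosen for illustration only.
    R_low, R_high = 120.0, 240.0  # ohms

    for R in (R_low, R_high):
        I = V / R   # Ohm's law: current drawn by the bulb
        P = V * I   # power dissipated, P = V * I = V**2 / R
        print(f"R = {R:6.1f} ohm -> I = {I:.2f} A, P = {P:.1f} W")
    # The 120-ohm bulb dissipates twice the power of the 240-ohm bulb.
    ```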

  3. Jan 22, 2012 #2


    Homework Helper

    Power is the key factor because power equals the amount of work done per unit time. Higher voltage allows the same amount of power to be delivered at a lower amount of current. This is why high powered appliances in USA households use 220 volts instead of 110 volts, to reduce the amount of current involved.
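    The current-reduction point in post #2 is easy to check with P = VI rearranged to I = P/V. The power figure below is just an illustrative value for a large appliance:

    ```python
    # Same delivered power at two supply voltages.
    P = 2400.0  # watts (illustrative appliance rating)

    for V in (110.0, 220.0):
        I = P / V  # current needed to deliver P at voltage V
        print(f"V = {V:.0f} V -> I = {I:.1f} A")
    # Doubling the voltage halves the current for the same power,
    # which reduces resistive losses and allows thinner wiring.
    ```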
  4. Jan 22, 2012 #3

    Philip Wood

    Gold Member

    The maximum power theorem deals with a case where you HAVEN'T got complete control of the supply voltage.

    The supply voltage falls when you put a load across it, because of the internal resistance, r, of the supply. A light load (high resistance load) will take only a small current, so there won't be much voltage drop across r, so the supply voltage V will be near its maximum (the supply emf, e). But the power, VI, won't be that large, because the current is small.

    On the other hand, if we make the load resistance low, the current will be high (its maximum is e/r), but the p.d. across the load will be low, because there's a big voltage drop across r, leaving little for the load. So again, the load power won't be large.

    The maximum power theorem (easy to prove) says that for maximum load power, the load resistance needs to be equal to r. Not at either extreme.
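    The argument in this post can be verified numerically: sweep the load resistance R across a supply with emf e and internal resistance r (example values assumed here) and watch where the load power peaks. It should peak at R = r, where the maximum load power is e²/(4r):

    ```python
    # Numerical check of the maximum power theorem.
    e, r = 12.0, 2.0  # supply emf (volts) and internal resistance (ohms), assumed

    best_R, best_P = None, -1.0
    for k in range(1, 1001):
        R = k * 0.01        # load resistance swept from 0.01 to 10 ohms
        I = e / (r + R)     # series-circuit current
        P = I * I * R       # power dissipated in the load
        if P > best_P:
            best_R, best_P = R, P

    print(f"Load power peaks at R = {best_R:.2f} ohm (r = {r} ohm), P = {best_P:.2f} W")
    # Peak lands at R = r = 2 ohm, with P = e**2 / (4 * r) = 18 W.
    ```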
  5. Jan 23, 2012 #4
    Not necessarily.

    Post #2 above reflects a power source with a virtually fixed output voltage; post #3
    reflects a supply and load in a typical design situation.

    And of course the phase between the voltage and the current counts; the power factor is important.
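    The power-factor remark can be sketched as follows. With AC, the real power is P = V_rms · I_rms · cos(φ), where φ is the phase angle between voltage and current; the values below are illustrative:

    ```python
    import math

    V_rms, I_rms = 120.0, 5.0  # volts and amps (assumed example values)

    for phi_deg in (0.0, 30.0, 60.0):
        pf = math.cos(math.radians(phi_deg))  # power factor
        P = V_rms * I_rms * pf                # real (average) power
        print(f"phase = {phi_deg:4.1f} deg -> power factor = {pf:.3f}, P = {P:.1f} W")
    # The same V_rms and I_rms deliver less real power as the phase angle grows.
    ```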
  6. Mar 27, 2012 #5

    Here is my situation: I am using a small solar cell that produces a certain voltage across its two terminals when sunlight hits it.

    Any electronic device needs a certain amount of power to function properly. Now, according to the maximum power transfer theorem, impedance matching is important for the load to receive the maximum power. But we don't know the impedance of that specific electronic device. We only know that it takes a certain amount of power to work. How do we set the solar cell up so it transfers the maximum power to the device? Does the maximum power get transferred when the electronic device's impedance matches the internal resistance of the solar cell?
