How long can a 1000W UPS power a 100W device?

  • Thread starter: KFC
  • Tags: Time
AI Thread Summary
A 1000W UPS can theoretically power a 100W device for a limited time, but actual runtime depends on the battery's energy capacity, typically measured in watt-hours. Sales claims of 4 to 5 hours are often misleading, as most home UPS units contain small batteries (3 to 7 Ah) that provide less than an hour of power at that load. The battery type significantly impacts performance; AGM batteries, while more expensive, offer faster charging and longer life compared to traditional lead-acid batteries. It's crucial to check the specific battery specifications, as many UPS units do not clearly state their capacity. Understanding these factors is essential for accurately estimating how long a UPS can sustain power for a given device.
KFC
Hello, today I am going to buy a 1000 W UPS (Uninterruptible Power Supply). If I connect a 100 W device as the load, how long can the UPS supply power to that device?

I was told by the salesperson that a UPS can usually run for 4 to 5 hours, but does it really last that long?
 
You could make your own...

A battery charger plugs into the wall and charges a 12.6 volt deep-cycle battery. An inverter is connected to the battery and powers your devices. As long as the battery charger is getting power from the wall, the battery stays at full charge and the power flows directly from the charger to the inverter. When the charger loses power, the battery supplies power until it is discharged. This way you can increase your run time simply by adding extra batteries in parallel, without even powering down the system. In a pinch you could run some jumper cables in through your window and power it from your car for as long as you have gas.

This, of course, is a rather crude way of doing it that sacrifices efficiency for simplicity. If your charger and your inverter are both 80% efficient, then only 0.8 × 0.8 = 64% of the energy gets through, so you lose 36% of the electrical energy that passes through the setup. I don't know if that concerns you or not.
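For a rough sense of what that efficiency penalty means for runtime once this DIY setup is actually running on battery, here is a minimal sketch; the 100 Ah deep-cycle battery, 12 V nominal voltage and 100 W load are assumptions chosen only for illustration:

```python
# Sketch: how the inverter loss affects runtime once the DIY setup above is running
# on battery. The 100 Ah deep-cycle battery, 12 V nominal voltage and 100 W load
# are assumptions for illustration only.
battery_wh = 100.0 * 12.0     # assumed 100 Ah deep-cycle battery at 12 V = 1200 Wh
load_w = 100.0                # assumed load
inverter_eff = 0.80           # assumed, matching the 80% figure above

ideal_hours = battery_wh / load_w                  # 12.0 h if nothing were lost
real_hours = battery_wh * inverter_eff / load_w    # 9.6 h; only the inverter is in the path
print(f"Ideal runtime: {ideal_hours:.1f} h, with an 80% inverter: {real_hours:.1f} h")
```

Note that during an outage only the inverter sits between the battery and the load; the charger's loss matters for recharging, not for runtime.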
 
KFC said:
I was told by the salesperson that a UPS can usually run for 4 to 5 hours, but does it really last that long?
Never believe salesmen! They might manage 4-5 hours if not supplying any load at all.
Just check what battery your UPS contains.
Five hours at 100 W is roughly the capacity of a typical car battery (48 Ah at 12 V, about 580 Wh), but not of the small ones usually built into UPSes.
Most home UPSes use 3, 4.5 or 7 Ah sealed batteries, which gives less than an hour at 100 W.
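To put numbers on that, here is a minimal sketch of the runtime estimate; the 12 V nominal voltage and 85% inverter efficiency are assumptions, and exact figures depend on the unit:

```python
# Rough runtime estimate for the sealed batteries commonly found in home UPSes,
# compared with a 48 Ah car battery. The 12 V nominal voltage and 85% inverter
# efficiency are assumptions; exact figures depend on the unit.
load_w = 100.0
voltage = 12.0
inverter_eff = 0.85

for capacity_ah in (3.0, 4.5, 7.0, 48.0):
    energy_wh = capacity_ah * voltage               # nominal stored energy
    runtime_h = energy_wh * inverter_eff / load_w   # hours at the given load
    print(f"{capacity_ah:4.1f} Ah -> {energy_wh:5.0f} Wh -> about {runtime_h:.2f} h at {load_w:.0f} W")
```

The small 3-7 Ah batteries come out at roughly 0.3-0.7 hours at 100 W, while the 48 Ah car battery lands near the advertised 5 hours.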
 
I agree with the above comments...
KFC said:
I was told by the salesperson that a UPS can usually run for 4 to 5 hours, but does it really last that long?

Of course, how long it lasts depends on the energy stored in the battery and the power consumed by the load. For a typical 12 volt battery you'll see various ratings: CCA (cold cranking amps, the starting current available at 0 °F) and, the one of interest here, the 20-hour amp-hour rating, which is the capacity measured when the battery is discharged steadily over 20 hours. Also, it's best not to discharge a wet-cell lead-acid battery much past 50%, as that shortens its life.
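As an illustration of that 50%-discharge rule of thumb, here is a rough sketch; the 100 Ah capacity, 12 V nominal voltage and 85% inverter efficiency are assumptions chosen only to show the calculation:

```python
# Sketch of the 50% depth-of-discharge rule of thumb for wet-cell lead-acid batteries.
# The 100 Ah (20-hour rate) capacity, 12 V nominal voltage and 85% inverter efficiency
# are assumptions chosen only to illustrate the calculation.
capacity_ah = 100.0      # 20-hour-rate capacity of a hypothetical deep-cycle battery
voltage = 12.0
max_dod = 0.50           # stay above 50% charge to avoid shortening battery life
inverter_eff = 0.85
load_w = 100.0

usable_wh = capacity_ah * voltage * max_dod          # 600 Wh available without deep discharge
runtime_h = usable_wh * inverter_eff / load_w        # about 5.1 h at 100 W
print(f"Usable energy: {usable_wh:.0f} Wh -> about {runtime_h:.1f} h at {load_w:.0f} W")
```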

Depending on what you are doing, also consider an AGM battery. It will be a lot more expensive initially, but it has the lowest life-cycle cost per amp-hour; in other words, it costs more up front but delivers far more amp-hours over its life, and it charges four to five times faster than a typical wet-cell lead-acid battery.

Also, wet-cell lead-acid batteries give off hydrogen gas and some acidic fumes, so if the battery will be heavily used that may be a consideration in some environments. An AGM battery is sealed and gives off no gases or fumes.
 
On a side note, the USA doesn't require battery makers to state capacity in battery descriptions (most European countries do require this), so capacity is not listed for many batteries sold in the USA (except those used for radio-control models). In that case you'll need to find the specs from the UPS maker, such as the APC chart linked in post #2.
 