Hi! I live in a place where the electricity goes out frequently, so I got a UPS (rated 300 W / 600 VA) to power my computer, whose total power draw is under 200 W. The transfer time of the UPS (the time it takes to switch from AC mode to battery mode) is around 4 ms, but in that window the capacitors in the computer's PSU discharge and the computer shuts off. This doesn't happen when the computer is drawing less power; in that case the PSU capacitors can hold it up for 15-20 ms.

The workaround I found is to connect a capacitor (rated 440 V AC) in parallel with the output of the UPS (i.e., the input of the computer). That bridges the UPS's transfer gap by about half an AC cycle, or 10 ms for a 50 Hz supply.

What I'm wondering is whether doing this has a downside, namely a higher electricity bill: can the capacitor reduce the power factor in any way? I ask because whenever the UPS goes into battery mode, it regularly shows an overload warning for a few seconds while the capacitor is connected. Or is it simply that the power-measurement circuit of the UPS reads apparent power (which would mean the capacitor really is lowering the power factor)? I've also heard that the energy meters here bill for real power, but if that's not the case, I'm out of luck.

The output of the UPS in battery mode is a 'pseudo sine wave'; an engineer told me it's effectively a square wave.
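For context on why the VA reading might jump, here is a rough back-of-envelope calculation of the reactive power an ideal parallel capacitor draws. The 230 V, 50 Hz supply and the 10 uF capacitance are assumptions I'm making for illustration (I haven't stated my actual capacitor value); the point is only that the reactive volt-amps add to the apparent power the UPS sees:

```python
import math

# Assumed values for illustration -- not my measured numbers.
V = 230.0   # UPS output RMS voltage (V), assumed
f = 50.0    # mains frequency (Hz)
C = 10e-6   # parallel capacitor (F), hypothetical example value

Xc = 1 / (2 * math.pi * f * C)  # capacitive reactance (ohms)
I = V / Xc                      # reactive current drawn by the capacitor (A)
Q = V * I                       # reactive power (var), adds to the UPS's VA load

print(f"Xc = {Xc:.1f} ohm, I = {I:.2f} A, Q = {Q:.0f} var")
```

With these assumed numbers the capacitor alone draws roughly 166 var, which sits on top of the computer's load in the UPS's apparent-power budget, even though an ideal capacitor dissipates no real power (so a meter billing real watts wouldn't charge for it).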