Power and transformers 
#1
Apr 25, 2014, 02:28 PM

P: 31

Hello!
I'm still not clear on a few things. For example: let's say a manufacturer creates a device which runs at 120 W (say, a lightbulb). Does this mean I can use whichever combination of voltage and current I want, as long as V*I = 120 W? Another example: let's say the maximum voltage across a capacitor is 300 V. Can I run it with as high a current as I want, as long as I don't exceed 300 V? My second question: instead of adding resistors to decrease current, can't we just add a transformer instead, which will decrease the voltage and produce the same current? Thank you for answering!


#2
Apr 25, 2014, 02:42 PM

P: 599

Capacitors also have internal resistance and possibly dielectric losses that cause heating, so they have an AC current/power rating too. Your second question is not really clear about which current you mean (primary or secondary). A transformer is not a magic box that makes current from voltage. Think of how the ratio of voltage to current (100 V * 1 A vs. 1 V * 100 A) might vary over a large range yet still equal the same amount of power (100 W), and what resistance at each voltage (as we adjust the transformer turns ratio) would be needed to draw the current required for 100 W.
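A quick numeric sketch of that point (Python; the specific voltages are just illustrative values picked here): for a fixed 100 W load, as the voltage changes, the required current and load resistance change too, but V * I stays at 100 W.

```python
# For a fixed 100 W of power, different voltages demand different
# currents, and hence different load resistances.
P = 100.0  # watts, fixed power

for V in (100.0, 10.0, 1.0):
    I = P / V   # current needed at this voltage
    R = V / I   # resistance that would draw that current
    print(f"V = {V:6.1f} V   I = {I:6.1f} A   R = {R:8.2f} ohm   P = {V * I:.0f} W")
```

The 100 V case needs 1 A into 100 Ω; the 1 V case needs 100 A into 0.01 Ω. Same power, very different circuits.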


#3
Apr 25, 2014, 03:14 PM

Mentor
P: 11,605

Let's say that we have a transformer with a primary RMS voltage of 100 volts. The secondary steps this down to 50 volts. If the load on the secondary side is consuming 100 watts of power, then the current through the secondary circuit is 2 amps, since 50 volts * 2 amps = 100 W. However, the primary is at 100 volts, so the current flowing through the primary is only 1 amp. If we change the load so that it consumes 200 watts, the current jumps to 4 amps on the secondary side and 2 amps on the primary side.

Note that the voltage on the secondary side is set by the transformer itself and doesn't change. The current changes depending on the resistance (or impedance) of the secondary circuit, which includes the load. If you halved the voltage going to your lightbulb by using a transformer, the current through the lightbulb would also be halved, and the power would drop to 1/4 of what it used to be.

For a 100 watt bulb supplied with 120 volts, the resistance of the bulb would need to be 144 ohms in order to get the required 830 milliamps of current so that V * I = 100 watts. If you used a transformer to drop the voltage to 60 volts, the current through that same lightbulb would be only 417 milliamps and the power would fall to 25 watts. (Because we didn't change the resistance of the bulb, which is still 144 ohms, I = 60 volts / 144 ohms = 0.417 amps.)

Remember! We don't directly manipulate current itself! We have to change either the voltage or the resistance of something in order to get the right amount of current flowing through it.
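The lightbulb arithmetic above can be checked in a few lines (Python; values taken from the post, and the filament resistance is treated as constant, which real bulbs don't quite obey):

```python
# A 100 W bulb rated for 120 V, modeled as a fixed resistance.
V_rated = 120.0               # volts
P_rated = 100.0               # watts

R = V_rated**2 / P_rated      # bulb resistance: 144 ohms
I_rated = V_rated / R         # current at rated voltage: ~0.833 A

# Halve the voltage with a transformer; R stays the same.
V_half = 60.0
I_half = V_half / R           # ~0.417 A, half the current
P_half = V_half * I_half      # 25 W, one quarter of the rated power

print(R, I_rated, I_half, P_half)
```

Halving the voltage into a fixed resistance halves the current, so the power (V * I) drops by a factor of four, exactly as the post says.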


#4
Apr 25, 2014, 03:16 PM

Sci Advisor
PF Gold
P: 11,952

If the device is specified to work at, say, 12 V, then a transformer will be needed to give it the 12 V it needs if you want to connect it to the 240 V mains. Say the device uses 12 W; the current it draws will be 1 A. The power supplied to the transformer from the mains (ignoring losses) will still be 12 W, so the primary current will be 12/240 = 1/20 A. It's a matter of cause and effect, if you want to appreciate what happens. The mains will 'think' it is working into a resistance of R = V/I = 240/(1/20) = 4800 Ω, even though the actual device has a resistance of R = 12/1 = 12 Ω. Transformers transform resistance as well as volts and current. And yes, a transformer is a much better way of supplying power to a low voltage device: you don't lose the sort of power that a series resistor would dissipate.
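The impedance-transformation arithmetic above can be sketched like this (Python; 12 V / 12 W device on 240 V mains, losses ignored as in the post):

```python
# A 12 V, 12 W device fed from 240 V mains through an ideal transformer.
V_mains, V_device, P = 240.0, 12.0, 12.0

n = V_mains / V_device        # turns ratio: 20
I_secondary = P / V_device    # 1 A through the device
I_primary = P / V_mains       # 1/20 A drawn from the mains

R_device = V_device / I_secondary       # 12 ohm actual load
R_seen_by_mains = V_mains / I_primary   # 4800 ohm apparent load

# The reflected resistance scales with the square of the turns ratio.
assert abs(R_seen_by_mains - R_device * n**2) < 1e-9

print(R_device, R_seen_by_mains)
```

The mains sees the 12 Ω load "magnified" by n² = 400 into 4800 Ω, which is the general rule for an ideal transformer.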


#5
Apr 27, 2014, 10:41 AM

P: 31

So by using a transformer we lose half of the power in the primary? Under which conditions do we use resistors rather than transformers, and vice versa? I understand how they both work, but I really want to know which of them is more efficient. Thank you!



#6
Apr 27, 2014, 10:43 AM

Sci Advisor
PF Gold
P: 11,952

We lose no power through a transformer (except for a bit of inefficiency). If you understand how they work, then isn't it obvious that no power is lost through a transformer?
Did I get my sums wrong in that post? 


#7
Apr 27, 2014, 10:47 AM

Sci Advisor
PF Gold
P: 11,952

If you have an AC supply, a transformer is always better, unless you need to vary the supply. Variable transformers (Variacs) are available but tend to be more expensive than a low power rheostat. Alternatively, there are thyristor controls ("dimmers") that let current from the mains through in the form of a series of short pulses. These will deliver, effectively, low volts without much dissipation.


