Question about transformers

  • Thread starter harblargh
  • Start date
  • #1
Okay, now this may sound ridiculously stupid, but I know next to nothing about transformers and so I have a question:

Does the input current on a transformer have to match the rated output in order to get that output?

(i.e. A trans rated 5 amps putting out the full rated current on only 2.5 amps of input current.)
 

Answers and Replies

  • #2
negitron
Science Advisor
No, but depending on the regulation (essentially a measure of how much the output voltage changes between no load and full load), the output voltage may be dependent upon load. Put another way, in order to get the stated output voltage given the correct input voltage, the transformer must be run at or very near its rated load current. For transformers with good regulation this is less important, but for those with relatively poor regulation, running at much less than the full rated load can result in a significantly higher output voltage. This may or may not matter, depending on the application's tolerance for voltage deviation.

However, to answer the question I think you're asking: the input current will be equal to the output current divided by the turns ratio (the voltages follow the inverse of this). For example, if your transformer has a load of 10 amps on the output and it's a 120-to-12 volt stepdown transformer (a 10:1 turns ratio), the input will be 10 A / (10/1) = 1 A, assuming an ideal transformer.
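The ideal-transformer relation in the post above can be sketched in a few lines of Python (a minimal illustration of the arithmetic, not anything from the original thread; the function name is made up):

```python
def primary_current(load_current, v_primary, v_secondary):
    """Ideal transformer: I_primary = I_secondary / turns ratio,
    where the turns ratio N_p/N_s equals V_primary/V_secondary."""
    turns_ratio = v_primary / v_secondary
    return load_current / turns_ratio

# 120-to-12 V step-down transformer with a 10 A load on the secondary:
print(primary_current(10, 120, 12))  # 1.0 (amps drawn on the primary)
```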
 
  • #3
Oh. Thanks for the help.

The problem I just can't wrap my brain around is the actual flow of current. That's the issue.

For this hypothetical transformer, the rating is 50 A with a matching load run through a VFD with a max of 75 V. Ergo, at maximum, the output would have to be 50 A at 75 V.

What amperage would the 100 V input have to be to reach the maximum output rating?
 
  • #4
negitron
Science Advisor
Assuming decent regulation, the loaded voltage ratio will roughly approximate the turns ratio (for a much more accurate figure, take the no-load voltage ratio). So you can plug those figures into the equation in my previous post [input current = output current / (primary turns / secondary turns)] and that will be your answer.
 
  • #5
vk6kro
Science Advisor
Maybe it would help to look at the power involved.

Assuming a resistive load....

power in load = 75 volts times 50 amps = 3750 watts

Assuming the transformer is 100% efficient....

3750 watts at 100 volts must mean a current in the primary of
(3750 watts / 100 volts) = 37.5 amps
since power = voltage times current.

This is the same answer you get if you divide the output current by the turns ratio as above:
i.e. 50 amps / (100/75) = 37.5 amps.
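The power-conservation reasoning above can also be written out as a short Python sketch (my own illustration of the arithmetic in the post, not part of the thread; the `efficiency` parameter is an assumption added for completeness):

```python
def primary_current_from_power(v_out, i_out, v_in, efficiency=1.0):
    """Conservation of power: P_out = efficiency * P_in,
    so I_in = (V_out * I_out) / (efficiency * V_in)."""
    p_load = v_out * i_out          # e.g. 75 V * 50 A = 3750 W
    return p_load / (efficiency * v_in)

# 3750 W delivered at 100 V on the primary:
print(primary_current_from_power(75, 50, 100))  # 37.5 (amps)
```

Note this gives the same 37.5 A as dividing the output current by the voltage ratio, since the two formulas are algebraically identical for an ideal transformer.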
 
  • #6
Ah, I get it now. Thanks a lot, I really appreciate it. :biggrin:
 
