

- Thread starter harblargh
- Start date

- #1


Does the input current on a transformer have to match the rated output in order to get that output?

(i.e. A trans rated 5 amps putting out the full rated current on only 2.5 amps of input current.)

- #2

negitron

Science Advisor


However, to answer the question I think you're asking: the input current will be equal to the output current divided by the turns ratio (the voltages follow the inverse of this). For example, if your transformer has a load of 10 amps on the output and it's a 120-to-12 volt stepdown transformer (a 10:1 turns ratio), the input current will be 10 A / (10/1) = 1 A, assuming an ideal transformer.
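That turns-ratio relationship can be sketched in a few lines of Python (the function name is my own; it just restates the ideal-transformer formula above):

```python
def primary_current(i_secondary, v_primary, v_secondary):
    """Ideal transformer: I_primary = I_secondary / (V_primary / V_secondary)."""
    turns_ratio = v_primary / v_secondary
    return i_secondary / turns_ratio

# The example above: 120-to-12 V stepdown with a 10 A load on the secondary
print(primary_current(10, 120, 12))  # 1.0
```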

- #3


The problem I just can't wrap my brain around is the actual flow of current. That's the issue.

For this hypothetical transformer, the rating is 50 A with a matching load, run through a VFD with a max of 75 V. Ergo, at maximum, the output would have to be 50 A at 75 V.

What amperage would the 100 V input have to be to reach the maximum output rating?

- #4

negitron

Science Advisor


- #5

vk6kro

Science Advisor


Assuming a resistive load....

power in load = 75 volts times 50 amps = 3750 watts

The transformer is 100% efficient so....

3750 watts at 100 volts must mean a current in the primary of

(3750 watts / 100 volts) = 37.5 amps

since power = voltage times current.

This is the same answer you get if you divide the output current by the turns ratio, as above:

i.e. 50 amps / (100/75) = 37.5 amps.
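The power-balance arithmetic above can be written out as a short sketch (variable names are mine; the numbers are the ones from this thread):

```python
# Power balance for an ideal (lossless) transformer:
# V_secondary * I_secondary = V_primary * I_primary
v_secondary, i_secondary = 75, 50   # output: 75 V at 50 A
v_primary = 100                     # input voltage

p_load = v_secondary * i_secondary  # power in the load, in watts
i_primary = p_load / v_primary      # primary current needed to supply it
print(p_load, i_primary)            # 3750 37.5
```

Both routes agree: conserving power and dividing by the turns ratio give the same 37.5 A primary current.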

- #6


Ah, I get it now. Thanks a lot, I really appreciate it.
