We all know Ohm's law and power conservation, but in a basic transformer they seem to contradict each other. For example, say we have a 50 V source driving a 10 Ω wire on the primary side; then the current into the transformer should be 5 A, which comes out to 250 W. Now say the transformer steps the voltage down to 5 V. According to power conservation, the secondary current should be 50 A so that the power still comes out to 250 W. Here's where my problem comes in -- Ohm's law tells us I = V/R, so if we assume the wire on the output of the transformer also has a resistance of 10 Ω, the current would only be 0.5 A. This doesn't really make sense to me; any help is appreciated.
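
To lay the numbers out explicitly (treating the transformer as ideal and the 10 Ω wire as the only resistance on each side, which is the assumption behind my reasoning above), here is the apparent conflict:

$$
\begin{aligned}
\text{Primary (Ohm's law): } & I_p = \frac{V_p}{R} = \frac{50\ \text{V}}{10\ \Omega} = 5\ \text{A}, \qquad P_p = V_p I_p = 250\ \text{W} \\
\text{Secondary (power conservation): } & I_s = \frac{P_p}{V_s} = \frac{250\ \text{W}}{5\ \text{V}} = 50\ \text{A} \\
\text{Secondary (Ohm's law): } & I_s = \frac{V_s}{R} = \frac{5\ \text{V}}{10\ \Omega} = 0.5\ \text{A}
\end{aligned}
$$

The two calculations for the secondary current differ by a factor of 100, and that's exactly what I can't reconcile.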