Why do transformers use AC currents in the primary coil?

AI Thread Summary
Transformers use alternating current (AC) in the primary coil because AC creates a changing magnetic flux, which is essential for inducing a voltage in the secondary coil. The question assumes that Faraday's law lets a constant voltage produce a changing magnetic flux, but a constant voltage drives a steady current, which does not generate the changing magnetic field a transformer needs. The discussion notes that direct current (DC) would produce a constant magnetic field after an initial transient, failing to induce a continuous voltage in the secondary coil. Moreover, an ideal primary driven by uninterrupted DC would draw an ever-growing current that no real source or transformer could supply or withstand. AC is therefore necessary for a transformer to work.
sarahwill
Hello.
If we assume an ideal transformer, why do we use a.c. current in a transformer? I've searched, and the answers are that we need a changing electric current to produce a change in magnetic flux, by Ampere's law.
However, Faraday's law states that a voltage produces a changing magnetic flux as well. Isn't there a voltage across the primary coil for current to flow, and wouldn't that cause a changing magnetic flux? I have tried the maths and deduced the transformer equation.

If the primary voltage ##V## is the negative of the rate of change of magnetic flux ##\Phi## multiplied by the number of turns ##N## in the primary coil, then ##V = -N \frac{d\Phi}{dt}##. This changing magnetic flux would also induce a voltage in the secondary coil: if ##v## is the secondary voltage and ##n## the number of secondary turns, then ##v = -n \frac{d\Phi}{dt}##. Taking the ratio gives ##\frac{V}{v} = \frac{N}{n}##. Since it is a voltage, and not the rate of change of a voltage, that creates the magnetic field, we do not need to vary the voltage to produce a changing flux, so even if the voltage were constant, a current would be induced.
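As a quick numeric check of the turns-ratio relation (the values of N, n, and V here are illustrative, not from any particular transformer):

```python
# Quick check of the ideal-transformer ratio V/v = N/n derived above.
# N, n, and V are made-up illustrative values.
N = 500        # primary turns (assumed)
n = 100        # secondary turns (assumed)
V = 120.0      # primary voltage, volts (assumed)

v = V * n / N  # secondary voltage from V/v = N/n
print(f"Secondary voltage: {v:.1f} V")  # -> 24.0 V
```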

So unless I am missing something, a.c. is not required to generate a changing magnetic flux. Why, then, does a transformer use a.c.? Thanks
 
The problem is that with DC the current will continue to rise without bound. With an open-circuit secondary you get ##I = (V/L)\,t## in the primary. Real-world voltage sources can't produce infinite current, and real-world transformers can't handle it either.
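A minimal numeric sketch of this point, assuming an ideal (resistance-free) primary inductance and illustrative values for L, V, and frequency:

```python
# Minimal sketch: primary current of an ideal (resistance-free) inductor.
# L, V0, and f are illustrative values, not from the thread.
import numpy as np

L = 0.5      # primary inductance, henries (assumed)
V0 = 120.0   # source amplitude, volts (assumed)
f = 50.0     # AC frequency, hertz (assumed)
w = 2 * np.pi * f

t = np.linspace(0, 0.2, 1000)  # 0.2 s of simulated time

# DC source: V = L di/dt  =>  i(t) = (V0 / L) * t, grows without bound
i_dc = (V0 / L) * t

# AC source V0*cos(wt): integrating gives i(t) = (V0 / (w*L)) * sin(wt),
# bounded by V0 / (w*L)
i_ac = (V0 / (w * L)) * np.sin(w * t)

print(f"DC current after 0.2 s: {i_dc[-1]:.1f} A (still rising)")
print(f"AC current amplitude:   {V0 / (w * L):.2f} A (bounded)")
```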
 
sarahwill said:
However, Faraday's law states that a voltage produces a changing magnetic flux as well.
No, it does not. So all that follows is based on a flawed assumption.
 
nasu said:
No, it does not. So all that follows is based on a flawed assumption.
That's the whole point of the question. The law equates an induced voltage to the rate of change of magnetic flux. So if there is an induced voltage, even a constant one, there should be a rate of change of magnetic flux through the coil, right? What differentiates a constant voltage set up across the coil, allowing a steady current to flow, from an induced voltage?
 
A variable flux may induce a constant emf (if the rate of variation is constant). But a constant voltage applied to a circuit does not produce a variable field, except during the short transient regime in which the current rises to its constant value.

To use a DC source as the input to a transformer, you would have to switch it on and off with some device, which is exactly what is done in an old device called an induction coil.
You can even do it by hand and get sparks between the secondary terminals.
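A minimal sketch of that transient, with illustrative values for the source voltage, winding resistance, inductance, and mutual coupling (none of these come from the thread):

```python
# Sketch: with a DC source and coil resistance R, the primary current
# settles to V/R, and the induced secondary voltage (proportional to
# di/dt) exists only during the transient.
# V, R, L, and M are illustrative assumptions.
import numpy as np

V = 12.0   # DC source voltage, volts (assumed)
R = 4.0    # primary winding resistance, ohms (assumed)
L = 0.1    # primary inductance, henries (assumed)
M = 0.05   # mutual inductance to the secondary, henries (assumed)

t = np.linspace(0, 0.25, 6)  # a few sample times, seconds

i = (V / R) * (1 - np.exp(-R * t / L))    # RL step response
v_sec = M * (V / L) * np.exp(-R * t / L)  # secondary EMF = M di/dt

for tk, ik, vk in zip(t, i, v_sec):
    print(f"t = {tk:.2f} s: i = {ik:.3f} A, induced v = {vk:.3f} V")
# The current approaches V/R = 3 A and the induced voltage decays to
# zero, so a steady DC input induces nothing once the transient is over.
```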
 
nasu said:
constant voltage applied to a circuit does not produce a variable field, except during the short transient regime in which the current rises to its constant value.

Why does it reach a constant value? See #2.
 
That is for a coil without resistance. I was talking about a real transformer, where the current is limited by the winding's resistance.
 