How Can I Boost the Voltage Without Increasing the Current Rating?

In summary, the conversation discusses the possibility of converting AC 50V, 1.8A into 100V, 1.8A while keeping the current rating the same. It is concluded that this is not possible with a transformer, and alternative methods, such as a voltage amplifier or a boost circuit, are suggested. However, these methods also have limitations, and it may be more practical to use a power supply or a car battery.
  • #1
mcmxc77
I want to convert AC 50V, 1.8A into 100V, 1.8A. The voltage is doubling, while the current rating is required to be the same. How can I achieve this? A transformer steps up the voltage, but does it lower the current rating?

Thanks
 
  • #2
mcmxc77 said:
I want to convert AC 50V, 1.8A into 100V, 1.8A. The voltage is doubling, while the current rating is required to be the same. How can I achieve this? A transformer steps up the voltage, but does it lower the current rating?

Thanks

Yes, if your AC source is only rated for 50V, 1.8A, then that is all the power it can supply. When you step up the voltage, you will overcurrent the source. What happens next depends on the design of the source.
 
  • #3
100 volts out at 1.8 amps into a resistive load would be 180 watts. Power = 100 volts * 1.8 amps.

So, to supply 180 watts out there must be at least 180 watts input to the transformer.

If the transformer were 100% efficient, the input current would be 180 watts / 50 volts, or 3.6 amps.

Since the transformer is not going to be 100% efficient, the current drawn from the 50 volt supply would have to be even greater than this.

As you can see, it is not possible to do what you requested.
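For anyone who wants to play with the numbers, here is a minimal sketch of the same calculation; the 90% efficiency is just an assumed example figure, not a measured one:

```python
# Required input current for a step-up transformer, from conservation of power.
# The 0.9 efficiency below is an assumed example value.
v_out = 100.0      # output voltage (V)
i_out = 1.8        # desired output current (A)
v_in = 50.0        # source voltage (V)
efficiency = 0.9   # assumed transformer efficiency

p_out = v_out * i_out       # 180 W delivered to the load
p_in = p_out / efficiency   # 200 W must be drawn from the source
i_in = p_in / v_in          # 4.0 A input current

print(f"Output power: {p_out:.0f} W")
print(f"Input power needed: {p_in:.0f} W")
print(f"Input current needed: {i_in:.1f} A (source is rated for only 1.8 A)")
```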
 
  • #4
How is it possible to increase the voltage without increasing the current rating?

One method in my mind is to use a voltage amplifier. Would that help? If yes, then which configuration of amplifier should I use?

thanks
 
  • #5
It isn't possible to increase the power from a transformer without getting power from somewhere else.

A voltage or power amplifier needs a DC power supply, and it is that supply which provides the extra output power.
 
  • #6
mcmxc77 said:
How is it possible to increase the voltage without increasing the current rating?

One method in my mind is to use a voltage amplifier. Would that help? If yes, then which configuration of amplifier should I use?

thanks

There is no way to passively increase the voltage without decreasing the current using a transformer. If you need a passive approach and cannot use an active amplifier circuit for some reason, you may want to build a charging circuit that takes the incoming energy and stores it over time until it can output a larger voltage at the same current. Of course, it will only work for so long before the stored energy is depleted, and then you will have to wait for the unit to charge back up before you can use it again at the higher voltage and the same current. Do a Google search on "boost circuit" and see what you can find.

This isn't a way to cheat and get energy from nowhere; rather, it is a way to store and accumulate incoming energy until it reaches the level you need for useful electrical work. The trade-off is time: you have to wait for the incoming energy to build to the value you want before it can do work for you. Most likely the element that stores this energy will be a capacitor, which must be large enough to hold an appreciable amount of charge or the stored energy will only be useful for a brief time; also make sure you do not exceed the voltage rating of the capacitor.
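To put a rough number on that time trade-off, here is a minimal sketch; the capacitance, voltages, and load power are assumed example values, not a recommendation:

```python
# Rough estimate of how long a charged capacitor can supply a load.
# All component values below are assumed examples for illustration.
capacitance = 0.01    # 10,000 uF capacitor (F)
v_charged = 100.0     # voltage the capacitor is charged to (V)
v_min = 80.0          # minimum useful voltage for the load (V)
p_load = 180.0        # load power (W), i.e. 100 V * 1.8 A

# Usable stored energy between the charged voltage and the minimum useful voltage
energy_usable = 0.5 * capacitance * (v_charged**2 - v_min**2)  # joules

# Very rough run time, assuming the load draws constant power
run_time = energy_usable / p_load

print(f"Usable energy: {energy_usable:.1f} J")
print(f"Approximate run time: {run_time * 1000:.0f} ms")
```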

The basic idea is to connect the source to an inductor through a switch. Between the inductor and the switch, put a diode pointing into a capacitor. When power is applied, a current flows through the inductor and builds up a magnetic field. Then open the switch: di/dt is very large, and since the inductor voltage is proportional to the rate of change of current, a large voltage is induced across the inductor. This forward-biases the diode and pushes charge into the capacitor, raising its voltage. When the inductor's field has collapsed, close the switch so the inductor builds another field, then open it again; each cycle forward-biases the diode and shoves more charge into the capacitor, raising its voltage further. You can keep doing this until the capacitor voltage reaches whatever value you want (again, as long as you do not exceed the capacitor's rated voltage), at which point you can connect it to your load. At first the capacitor will source roughly the current you want, but the current and voltage both decay toward zero as the capacitor discharges through the load; if the load impedance and the capacitance are both large, the decay will be slow.
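Here is a very idealized numerical sketch of that switching cycle. It ignores losses, the diode drop, and any source current during the off phase, and all component values and timings are assumed examples rather than a design:

```python
import math

# Idealized sketch of the switch/inductor/diode/capacitor cycle described above.
# Each cycle, the inductor stores 0.5*L*I^2 of energy while the switch is closed,
# then dumps it through the diode into the capacitor when the switch opens.
v_in = 50.0        # source voltage (V)
L = 1e-3           # inductor (H), assumed example value
C = 1000e-6        # output capacitor (F), assumed example value
t_on = 0.1e-3      # time the switch is closed each cycle (s), assumed
v_target = 100.0   # desired capacitor voltage (V)

v_cap = 0.0
cycles = 0
while v_cap < v_target:
    # Current built up in the inductor while the switch is closed
    i_peak = v_in * t_on / L
    # Energy stored in the inductor's magnetic field
    e_inductor = 0.5 * L * i_peak**2
    # When the switch opens, that energy is pushed into the capacitor
    e_cap = 0.5 * C * v_cap**2 + e_inductor
    v_cap = math.sqrt(2 * e_cap / C)
    cycles += 1

print(f"Reached {v_cap:.1f} V after {cycles} switching cycles")
```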

Honestly, you'd probably be better off just buying a power supply that can output the voltage and current you need and plugging it into a wall outlet. If you do not have access to an outlet, try a car battery (I'm assuming you need to bring up a DC voltage and keep the current what it was) connected to a power supply. The supply can kick the voltage up to what you want, and the car battery has plenty of stored energy, so it will source a fair current for far longer than the energy stored in even a large capacitor.

Many Smiles,
Craig :smile:
 

What is meant by transformer current rating?

The transformer current rating refers to the maximum amount of electrical current that a transformer can safely handle without overheating or causing damage to the transformer or the connected circuit.

How is transformer current rating determined?

The transformer current rating is determined by the size and design of the transformer, as well as the type of material used for its construction. It is also affected by the ambient temperature, cooling methods, and the type of load connected to the transformer.

What factors can affect the transformer current rating?

The transformer current rating can be affected by various factors such as the type of load connected to the transformer, the ambient temperature, the cooling methods used, and the design and construction of the transformer.

Why is it important to consider the transformer current rating?

It is important to consider the transformer current rating because exceeding the rated current can cause overheating and damage to the transformer, leading to potential safety hazards and costly repairs or replacements.

Can a transformer's current rating be increased?

No, a transformer's current rating cannot be increased as it is determined by its design and construction. Attempting to increase the current rating can result in damage to the transformer and potential safety hazards.
