Is a series resistor necessary when charging a capacitor?

In summary, the use of a resistor between the inverter and the capacitor is not necessary but may have benefits such as smoothing the DC output and controlling the current drawn from the source. A diode is also not needed in this circuit. The time it takes to charge the capacitor and the residual voltage remaining after discharging depend on factors such as the power output of the source, efficiency of the DC-DC converter, inductance of the coil, and resistance of the coil.
  • #1
Slem
I'm using a 250 V 2200 µF capacitor to discharge a high instantaneous current into an inductor, as shown in the attached image.

I'm using a 12 V battery fed into an inverter that outputs 220 VAC, which is then fed into a rectifier that outputs 220 VDC.

My question: can I connect the 220 VDC directly to the capacitor terminals (switched by a push button) without a series resistor? I just want fast charging.

And when I activate the discharge circuit, will the current through it be reduced because the capacitor is still connected to the charging circuit (even though the switch is open)?

I need as much current as possible to flow through the inductor. Would it help to connect a diode between the charging circuit and the capacitor, to block any small current flowing back and being wasted as heat? I'm not sure about this; I'm only guessing.
Thank you!
 

Attachments

  • Big circuit_launcher_only.gif (8.4 KB)
  • #2
In general, you do not need a resistor between the inverter and the capacitor, but there are good reasons to have one. If your DC output comes straight from the rectifier, you would normally add a resistor-capacitor or inductor-capacitor filter to smooth it (which hardly matters here, since you only want to charge the capacitor for a current pulse). Another reason is to control the current drawn from your source. You are stepping the voltage up by a factor of about 18, so you need roughly 18 times the output current from your source to get X amps out of the inverter. The battery's internal resistance may limit this anyway, but it is something to keep in mind.
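The step-up arithmetic above can be sketched quickly; note that the 85% converter efficiency below is an assumed figure, not something stated in the thread:

```python
# Rough battery-current estimate for a 12 V -> 220 V step-up.
# The 85% efficiency is an assumption for illustration.
V_IN = 12.0        # battery voltage (V)
V_OUT = 220.0      # inverter/rectifier output (V)
EFFICIENCY = 0.85  # assumed overall conversion efficiency

def battery_current(i_out):
    """Battery current needed to deliver i_out amps at V_OUT."""
    return i_out * V_OUT / (V_IN * EFFICIENCY)

# Drawing just 1 A at 220 V already needs ~21.6 A from the battery.
print(round(battery_current(1.0), 1))
```

Even modest output currents therefore translate into tens of amps on the 12 V side, which is why the battery's internal resistance ends up mattering.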

No point in connecting a diode between the inverter and the capacitor. Without a return path, no current is going to flow back that way. Not to mention you already have diodes there from the inverter's rectifier stage.

The main things controlling the current through the inductor are its inductance, the resistance of the coil, and the voltage across your capacitor. The time it takes to charge the capacitor is set by the power output of your source and the efficiency of your DC-DC converter.
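For concreteness, here is a back-of-the-envelope sketch of the charging-time point, using the thread's capacitor values; the 50 W of usable source power is an assumption:

```python
# Energy stored in the capacitor and an idealized charge time,
# assuming the source delivers a constant 50 W (assumed figure).
C = 2200e-6      # capacitance from the thread (F)
V = 220.0        # charge voltage (V)
P_SOURCE = 50.0  # usable DC power from the converter (W), assumed

energy = 0.5 * C * V**2       # stored energy: ~53 J
t_charge = energy / P_SOURCE  # constant-power charge time: ~1.1 s

print(round(energy, 1), round(t_charge, 2))
```

Real charge times will be longer once converter limits and losses are included, but this shows the scale involved.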
 
  • #3
Born2bwire said:
In general, you do not need a resistor between the inverter and the capacitor, but there are good reasons for having one. If your inverter is just taking the output from the rectifier then you would probably want a resistor/capacitor or inductor/capacitor to act as a filter to give you a smoother DC output

So my conclusion is that I don't really need a resistor for my application.
I just remembered another minor problem I had: whenever I discharged the capacitor, about 10 V remained on it. Is there a way to make it discharge down to a lower voltage?
EDIT: I know I could perhaps hold the firing button longer (which isn't good for my application). I suppose my issue here is to decrease the time constant... but how?
 
  • #4
Slem said:
So my conclusion is that I don't really need a resistor for my application.
I just remembered another minor problem I had: whenever I discharged the capacitor, about 10 V remained on it. Is there a way to make it discharge down to a lower voltage?
EDIT: I know I could perhaps hold the firing button longer (which isn't good for my application). I suppose my issue here is to decrease the time constant... but how?


That's simply going to depend on the time you allow for the discharge, the inductance of the coil, and your resistance (most likely dominated by the coil). There will always be some residual voltage because of the diode drop. But you are talking about a minuscule amount of energy here: since stored energy scales with the square of the voltage, a 10 V residual represents about 0.2% of the energy you originally stored in the capacitor. Moreover, that 10 V can only drive a small fraction of the current available at the pulse maximum.
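The 0.2% figure follows directly from energy scaling with voltage squared; a one-line check:

```python
# Fraction of stored energy left when a cap charged to 220 V
# still reads 10 V: energy goes as V**2, so the ratio is (10/220)**2.
V_FULL = 220.0
V_RESIDUAL = 10.0

fraction = (V_RESIDUAL / V_FULL) ** 2
print(round(100 * fraction, 2))  # about 0.21 percent
```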
 
  • #5


I would say a series resistor is not strictly necessary when charging a capacitor, but it is recommended for safety and to control the charging rate. Charging a capacitor directly from a high-voltage source produces a large inrush current, which can be dangerous and can damage the capacitor. A series resistor limits the current into the capacitor, giving a safer, more controlled charge, and also helps prevent overheating of the capacitor and other components in the circuit.

In the specific scenario described, with a 250 V 2200 µF capacitor and a 220 VDC source, it is important to choose a resistor rated for the voltage and the peak power it must dissipate. A diode can also be used as a safety measure to prevent reverse current flowing from the capacitor back into the charging circuit.

As for the discharge circuit, make sure the capacitor is fully discharged before handling it or activating it again, using a bleed resistor or a discharge switch. An incompletely discharged capacitor poses an electric-shock risk and can damage components.

In conclusion, while a series resistor is not strictly necessary when charging a capacitor, it is recommended for safety and control. Carefully consider the voltage and current requirements of the circuit and choose appropriate components to ensure safe and efficient operation.
 

1. Is a series resistor necessary when charging a capacitor?

Strictly speaking, no, but one is strongly recommended: it limits the inrush current into the capacitor and protects the capacitor and other components in the circuit from damage.

2. Can a capacitor be charged without a series resistor?

Technically, yes, a capacitor can be charged without a series resistor, but it is not recommended: the resulting current surge can damage the capacitor and other components in the circuit.

3. What is the purpose of a series resistor in charging a capacitor?

The purpose of a series resistor is to limit the charging current, protecting the capacitor and the rest of the circuit from the initial inrush.

4. How do I calculate the value of the series resistor for charging a capacitor?

The value of the series resistor can be calculated using the formula R = (V - Vc) / I, where R is the resistance in ohms, V is the supply voltage, Vc is the voltage across the capacitor, and I is the desired current limit in amperes. The worst case is at switch-on, when Vc = 0, so choose R ≥ V / I and make sure the resistor's power rating can handle the initial dissipation.
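A minimal sketch of that sizing, using the thread's capacitor value; the 2 A current limit is an assumed figure for illustration:

```python
# Sizing a charging resistor. The worst case is at switch-on,
# when Vc = 0, so R >= V / I_limit. The 2 A limit is assumed.
V_SUPPLY = 220.0  # supply voltage (V)
I_LIMIT = 2.0     # desired maximum charging current (A), assumed
C = 2200e-6       # capacitance from the thread (F)

R = V_SUPPLY / I_LIMIT   # minimum resistance: 110 ohms
P_PEAK = I_LIMIT**2 * R  # dissipation at switch-on: 440 W (brief)
tau = R * C              # time constant ~0.24 s; ~99% charged by 5*tau

print(R, P_PEAK, round(tau, 3))
```

The peak dissipation only lasts a fraction of a time constant, but the resistor still needs a pulse rating well above its continuous rating to survive repeated charges.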

5. Can I use any value of series resistor for charging a capacitor?

No, you cannot use any value of series resistor for charging a capacitor. The value of the resistor should be carefully chosen based on the supply voltage, capacitance of the capacitor, and desired current limit to ensure the capacitor is charged safely and efficiently.
