The time t taken by a capacitor of capacitance C, in a charging circuit with a resistance R in series with it, to accumulate charge q is given by the equation t = τ ln(Q/(Q−q)), where τ = RC is the time constant and Q is the maximum charge the capacitor can hold when fully charged in that circuit.
In order to find the time taken by the capacitor to get fully charged, we have to put q = Q into the right-hand side of the above equation, which gives
t = τ ln(Q/(Q−Q))
or t = τ ln(Q/0)
I know Q/0 does not have a precise meaning, but even if we take it as ∞ for the sake of further reduction, then
t = τ ln ∞
or t → ∞
This gives me the feeling that a capacitor never gets fully charged. Am I right? If not, why not?
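The exponential approach behind this can be checked numerically. Below is a minimal sketch (with hypothetical component values R = 1 kΩ, C = 1 mF, and Q normalized to 1) of the charging law q(t) = Q(1 − e^(−t/τ)), showing that q/Q creeps ever closer to 1 with each time constant but never actually reaches it:

```python
import math

# Hypothetical illustrative values (not from the question):
R = 1e3        # resistance in ohms (1 kΩ)
C = 1e-3       # capacitance in farads (1 mF)
tau = R * C    # time constant tau = RC, here 1 second
Q = 1.0        # maximum charge, normalized to 1

# Charging law for an RC circuit: q(t) = Q * (1 - exp(-t/tau)).
# Print the fraction of full charge after n time constants.
for n in range(1, 6):
    t = n * tau
    q = Q * (1 - math.exp(-t / tau))
    print(f"t = {n}*tau: q/Q = {q:.6f}")
```

After 5τ the capacitor is about 99.3% charged, which is why 5τ is conventionally treated as "fully charged" in practice, even though q = Q is only reached in the limit t → ∞.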