ResonantW
Homework Statement
For an overdamped series RLC circuit, determine the coefficients \beta_1 and \beta_2 in the equation V = Ae^{-\beta_1 t} + Be^{-\beta_2 t} for the case where R = 600\ \Omega, L = 100\ \mu H, and C = 0.01\ \mu F. Also determine the ratio of B to A.
Homework Equations
For a series RLC circuit, the general differential equation is \large \frac{d^2V}{dt^2} + \frac{R}{L} \frac{dV}{dt} + \frac{1}{LC} V = 0.
The Attempt at a Solution
I think I can just assume that V=Ae^{-\beta t}, take the necessary derivatives, and then solve the quadratic equation that results for \beta_1 , \beta_2.
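Writing the substitution out explicitly, just to show my work (in case I dropped a sign somewhere): with V = Ae^{-\beta t}, the derivatives are \frac{dV}{dt} = -\beta A e^{-\beta t} and \frac{d^2V}{dt^2} = \beta^2 A e^{-\beta t}, so the differential equation becomes \beta^2 A e^{-\beta t} - \frac{R}{L} \beta A e^{-\beta t} + \frac{1}{LC} A e^{-\beta t} = 0.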
After canceling the exponential terms, I get \beta^2 - \beta \frac{R}{L} + \frac{1}{LC} = 0.
The quadratic formula then gives \large \beta = \frac{\frac{R}{L} \pm \sqrt{\frac{R^2}{L^2} - \frac{4}{LC}}}{2}.
My numbers end up being \beta_1 = 5.828 \times 10^6\ \text{s}^{-1}, \beta_2 = 1.715 \times 10^5\ \text{s}^{-1}. Is this the correct answer? I don't know why they would be a whole order of magnitude apart...
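To double-check the arithmetic I ran a quick Python sketch (the variable names are just my own, not anything from the book):

import math

# component values from the problem statement
R = 600.0      # ohms
L = 100e-6     # henries
C = 0.01e-6    # farads

a = R / L                                  # the R/L coefficient in the characteristic equation
disc = math.sqrt(a**2 - 4.0 / (L * C))     # real, since the circuit is overdamped
beta1 = (a + disc) / 2.0                   # larger root, in 1/s
beta2 = (a - disc) / 2.0                   # smaller root, in 1/s
print(beta1, beta2)                        # roughly 5.83e6 and 1.72e5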
Was it OK to assume a single exponential V = Ae^{-\beta t} rather than the full sum of exponentials, and then take both roots of the quadratic as \beta_1 and \beta_2? I don't have much experience with differential equations.
Then for the ratio of B to A, I know that \frac{dV}{dt} = 0 at t = 0, so -\beta_1 A - \beta_2 B = 0 and I end up getting \frac{B}{A} = -\frac{\beta_1}{\beta_2}... is that right as well? I end up getting a huge ratio, like -33.98.
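And the same kind of quick check for the ratio, assuming the initial condition really is \frac{dV}{dt} = 0 at t = 0 (that assumption is mine, from how I read the problem):

import math

R, L, C = 600.0, 100e-6, 0.01e-6
a = R / L
disc = math.sqrt(a**2 - 4.0 / (L * C))
beta1, beta2 = (a + disc) / 2.0, (a - disc) / 2.0

# dV/dt = 0 at t = 0 gives -beta1*A - beta2*B = 0, so B/A = -beta1/beta2
print(-beta1 / beta2)   # roughly -33.97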
Thanks for the help!