lulzury
Homework Statement
In an oscillating series RLC circuit, with resistance R and inductance L, find the time required for the maximum energy in the capacitor during an oscillation to fall to 1/2 its initial value. Assume q = Q at t = 0.
Homework Equations
## U(t) = \frac{Q^2}{2C}e^{\frac{-Rt}{L}}\cos^2(\omega't+\phi) ##
## U_0 = \frac{Q^2}{2C} ##
## \omega' = \sqrt{\frac{1}{LC} - \left(\frac{R}{2L}\right)^2} ##
The Attempt at a Solution
## 0.5 \frac{Q^2}{2C} = \frac{Q^2}{2C}e^{\frac{-Rt}{L}}\cos^2(\omega't+\phi) ##
## 0.5 = e^{\frac{-Rt}{L}}\cos^2(\omega't+\phi) ##
Since q = Q at t = 0, I dropped the phase angle:
## 0.5 = e^{\frac{-Rt}{L}}\cos^2(\omega't) ##
I'm not even sure how to begin solving this without getting the t out of the cosine squared. To do that I would either need a matching ## \sin^2(\omega't) ## term, so I could use ## \cos^2(\omega't) + \sin^2(\omega't) = 1 ##, or I would need some expression for ## \cos(\omega't) ## as a ratio of sides, ## \frac{adj}{hyp} ##.
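As a sanity check I tried solving the full equation numerically. The component values below (R, L, C) are made up just for the check, not from the problem; the script brackets the first time the full expression crosses 1/2 and bisects, and separately evaluates the envelope-only condition ## e^{-Rt/L} = 1/2 ## (i.e., assuming "maximum energy" means the peaks where ## \cos^2 = 1 ##):

```python
import math

# Hypothetical sample values, NOT from the problem statement:
R, L, C = 2.0, 0.1, 1e-4  # ohms, henries, farads

wp = math.sqrt(1.0 / (L * C) - (R / (2.0 * L)) ** 2)  # omega'

def u_ratio(t):
    """U(t)/U0 = e^(-Rt/L) * cos^2(omega' * t), with phi = 0."""
    return math.exp(-R * t / L) * math.cos(wp * t) ** 2

# Step forward until the ratio first drops below 1/2, then bisect
# on that bracket to refine the crossing time.
target = 0.5
dt = 1e-6
t_hi = dt
while u_ratio(t_hi) > target:
    t_hi += dt
t_lo = t_hi - dt
for _ in range(60):
    mid = 0.5 * (t_lo + t_hi)
    if u_ratio(mid) > target:
        t_lo = mid
    else:
        t_hi = mid
t_first = 0.5 * (t_lo + t_hi)

# Envelope-only condition: e^(-Rt/L) = 1/2  =>  t = (L/R) ln 2
t_envelope = (L / R) * math.log(2.0)

print(t_first, t_envelope)
```

The two times differ a lot, which is what makes me unsure which reading of "maximum energy during an oscillation" is intended.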
Any hints, tips? Thanks.