1. The problem statement, all variables and given/known data

How and why does the quality factor change in a series RLC circuit when the resistance in the circuit increases?

I've built a small experimental circuit like the one in the picture. The source frequency is 1 kHz and the voltmeter should show 2 V throughout the whole experiment. Inductance L = 100 mH and capacitance C = 47 nF.

In the first part of the experiment, I use R = 470 Ω. I measure a maximum current Imax = 4.13 mA at the resonant frequency fr = 2436 Hz. Then I find the boundary frequencies (left and right of the resonant frequency, where the current falls to Ig = Imax/sqrt(2)): fg1 = 2076 Hz and fg2 = 2920 Hz, which gives a frequency bandwidth Δf = fg2 - fg1 = 844 Hz. The quality factor is then Q1 = fr/Δf = 2.88.

In the second part of the experiment, I use R = 1 kΩ. I measure Imax = 1.97 mA at the resonant frequency fr = 2420 Hz (probably the same as the resonant frequency in the first case, but the instruments aren't that precise). The boundary frequencies are fg1 = 1706 Hz and fg2 = 3234 Hz, which gives a frequency bandwidth Δf = 1528 Hz. The quality factor is then Q2 = fr/Δf = 1.58.

My question is: why does the quality factor depend on the resistance? Why does the resistance affect the frequency bandwidth and the quality factor (lower R gives a higher quality factor, and vice versa)?

2. Relevant equations

Just a theoretical explanation is needed.

3. The attempt at a solution

The experiment described above.
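To double-check my numbers, here is a small Python sketch (variable names are my own) that computes Q = fr/Δf from the measured frequencies in both parts, and, for comparison, the theoretical resonant frequency of an ideal series RLC circuit, fr = 1/(2π·sqrt(LC)):

```python
import math

# Component values from the experiment
L = 100e-3   # inductance, H
C = 47e-9    # capacitance, F

def quality_factor(fr, fg1, fg2):
    """Q from the resonant frequency and the two half-power (boundary) frequencies."""
    return fr / (fg2 - fg1)

# Part 1: R = 470 ohm
Q1 = quality_factor(2436, 2076, 2920)

# Part 2: R = 1 kohm
Q2 = quality_factor(2420, 1706, 3234)

# Theoretical resonant frequency of an ideal series RLC circuit
fr_theory = 1 / (2 * math.pi * math.sqrt(L * C))

print(f"Q1 = {Q1:.2f}")            # about 2.89
print(f"Q2 = {Q2:.2f}")            # about 1.58
print(f"fr = {fr_theory:.0f} Hz")  # about 2321 Hz
```

The computed Q values match the ones I quoted above, and the ideal resonant frequency (≈2321 Hz) is reasonably close to the measured 2420-2436 Hz, so the measurements seem consistent.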