# Back EMF on oscilloscope

Hello all,

I'm not quite sure whether this is a physics question or an EE question, because it came up in an electronics design course for physicists with mostly engineering motivation. Anyway, here goes:

I have an RLC band-pass filter driven by a sinusoidal AC voltage (an Agilent function generator). I am measuring the input voltage with an oscilloscope, but it reads about 768 mV rather than 1 V, so the input is attenuated by roughly a quarter. Is this being caused by a back EMF produced by the inductor?
I'll try and make a diagram below to show what kind of filter it is:

Code:
Vin ----R----+---- Vout
             |
          +--+--+
          |     |
          L     C
          |     |
          +--+--+
             |
            GND
The capacitor and inductor are in parallel with each other, and the parallel combination goes to ground.

The scope is measuring at Vin,
R = 100 ohms,
L = 100 microHenries,
C = 1 microFarad,

What I am thinking is that the apparent droop I see on the scope is caused by a back EMF from the inductor. Is that what is really happening?
I am interested in knowing the true cause!


## Answers and Replies

Agilent function generators typically have a nominal output impedance of 50 Ohms. I suggest you calculate the expected "input voltage" as the voltage across the RLC portion, when this RLC portion is in series with an ideal 1 V sinusoidal AC source and a 50 Ohm resistor.
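As a quick sketch of that calculation (assuming the 50-ohm source impedance and ideal component values from the thread):

```python
import cmath
import math

# Values from the thread; the 50-ohm source impedance is the
# typical Agilent nominal value, assumed here, not measured.
RS = 50.0    # function generator output impedance (ohms)
R = 100.0    # series resistor (ohms)
L = 100e-6   # inductance (henries)
C = 1e-6     # capacitance (farads)
VS = 1.0     # ideal source amplitude (volts)

def v_in(f):
    """Voltage at the top of the RLC network (where the scope's Vin probe sits)."""
    w = 2 * math.pi * f
    z_lc = (1j * w * L) / (1 - w**2 * L * C)  # parallel L-C impedance
    z_rlc = R + z_lc
    return abs(VS * z_rlc / (z_rlc + RS))

# Well away from resonance the parallel L-C looks like a near-short,
# so Vin -> VS * R / (R + RS) = 2/3 V.
print(v_in(100.0))     # low frequency: close to 0.667 V
print(v_in(15_900.0))  # near resonance: Vin rises toward 1 V
```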

vk6kro

That circuit will be resonant at about 15.9 kHz. At frequencies well away from resonance, the 100 ohm resistor is the main load on the signal generator.

So, if the signal generator has an output impedance of 50 ohms, then the 100 ohm load will drop the output to about 2/3 of a volt, as you found.

However, the main point of the circuit is the behaviour at 15.9 KHz. Did you see this resonance?
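The 15.9 kHz figure follows from the standard resonance formula for a parallel L-C, f0 = 1/(2π√(LC)). A quick check with the thread's values:

```python
import math

L = 100e-6  # 100 uH
C = 1e-6    # 1 uF

f0 = 1 / (2 * math.pi * math.sqrt(L * C))
print(f"{f0:.1f} Hz")  # about 15915 Hz, i.e. ~15.9 kHz
```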

I did see a resonance (I also calculated it to be 15.9 kHz), but it actually occurred around 6 kHz, and the gain only reached about -4 dB; it never got to 0 dB :(
Are my instruments not sensitive enough?

I understand the reason for the voltage droop now (low input impedance plus high output impedance), but I'm still not sure what is going on with my poor gain readings.

vk6kro
That sounds like one of the components was not the correct value.

This would probably be the inductor, but it would have to actually be about 700 µH to drop the frequency that far.
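The 700 µH estimate comes from inverting the resonance formula for L at the observed peak (a rough check, assuming the 1 µF capacitor is accurate and the peak really is at 6 kHz):

```python
import math

C = 1e-6     # 1 uF, as marked in the circuit
f_obs = 6e3  # observed resonance frequency (Hz)

# Invert f0 = 1 / (2*pi*sqrt(L*C)) for L:
L_needed = 1 / ((2 * math.pi * f_obs) ** 2 * C)
print(f"{L_needed * 1e6:.0f} uH")  # about 704 uH
```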

There will not be any gain. At resonance, the output will rise towards 1 volt and, at frequencies away from resonance, the output will approach zero.

Like this:

[Attached image: frequency-sweep plot of the R plus parallel L-C circuit; original link broken]

Are you using a X10 probe on your oscilloscope?
A X1 probe can cause all kinds of weird problems.

I have the scope and the generator hooked up to a breadboard using coax cables, and the circuit is jumpered together and grounded on the breadboard.

Also, I did see a resonance peak, but the frequency was too low and it never reached 0 dB; it peaked at about -4 dB.

vk6kro
Just noticed this in your first post:

> The scope is measuring at Vin,

The scope should be measuring across the output, i.e. across the parallel L-C components, to observe resonance. Be sure to get the ground lead of the scope probe connected to the same side of the L-C components as the function generator ground.

If you try it again, note that the 100 ohm resistor is a bit too small for a good resonance; 1000 ohms would be better, and should give a nice big peak at resonance.

An ordinary 100 uH inductor will have a Q of less than 30. That means that at resonance the inductor-capacitor combination will have a maximum impedance of about 300 ohms. Your output voltage will then be 300/(100 + 300), or 0.75 times the input voltage. Your measurement of 768 mV is very close to that value.
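Those numbers can be reproduced directly (a sketch assuming Q = 30, as stated above; a real inductor's Q would need to be measured):

```python
import math

R = 100.0   # series resistor (ohms)
L = 100e-6  # 100 uH
C = 1e-6    # 1 uF
Q = 30      # assumed inductor Q at resonance

f0 = 1 / (2 * math.pi * math.sqrt(L * C))
x_l = 2 * math.pi * f0 * L      # inductive reactance at resonance: 10 ohms
z_tank = Q * x_l                # dynamic resistance of the lossy tank: 300 ohms
ratio = z_tank / (R + z_tank)   # voltage divider of R into the tank: 0.75
print(x_l, z_tank, ratio)
```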

Hi guys, thank you all for your responses,
I talked to my professor today, and he put the 100 ohm resistor in there intentionally to show the effects of low input impedance and relatively higher output impedance. Your input was very helpful; thank you.

> Are you using a X10 probe on your oscilloscope?
> A X1 probe can cause all kinds of weird problems.
...because it introduces an additional 50 ohms in parallel with the source.

vk6kro
> ...because it introduces an additional 50 ohms in parallel with the source.
Do you mean an extra 50 pF?

The short length of cable used in an oscilloscope probe adds about 50 pF to the oscilloscope's input impedance of 1 megohm in parallel with 50 pF.
So, the result is about 100 pF in parallel with 1 megohm.

A times 10 probe increases the input impedance to 10 megohms and reduces the capacitance to about 10 pF. It also reduces the signal by a factor of 10.

However, the tuned circuit in this question is so heavily loaded (about 150 ohms across it, and a 1 uF capacitor) that the effect of even 1 megohm and 100 pF would be negligible.
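A rough check on that loading claim (assuming the 1 MΩ ∥ 100 pF scope input described above, at the 15.9 kHz resonance):

```python
import math

f0 = 15.9e3        # resonance frequency (Hz)
r_scope = 1e6      # scope input resistance (ohms)
c_scope = 100e-12  # scope input plus cable capacitance (farads)
r_load = 150.0     # approx. series R plus generator impedance across the tank

# Reactance of the scope capacitance at resonance:
x_c = 1 / (2 * math.pi * f0 * c_scope)
print(x_c)  # roughly 100 kohms

# Both 1 Mohm and ~100 kohm dwarf the ~150-ohm loading already
# present, so the scope barely disturbs the tank at all.
```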