# DC biasing using voltage divider

1. Jan 30, 2006

### david90

How do I use a voltage divider to bias a signal that goes from -5 V to 5 V so that it goes from 0 to 2.5 V?

2. Jan 30, 2006

### chroot

Staff Emeritus
It's not possible. A voltage divider can only change a signal's amplitude; it cannot change its mean value.

You're hoping to put in a signal of 0 V (the midpoint of -5 to 5 V) and get out a signal of 1.25 V. That'd be quite a feat for nothing but a couple of resistors.

You'll need to use an op-amp, or at least capacitively couple your input to a bias network (perhaps made out of a voltage divider).

- Warren

3. Jan 30, 2006

### Staff: Mentor

No, I think it can be done in this case. But since this sounds like a homework problem, I'll start by only giving a couple of hints to David.

Make a 3-resistor network with two resistors in series from 10 V to -10 V. Couple your input voltage into the middle of these two resistors through another resistor. Make the top resistor 10K, the bottom resistor 20K, and the input resistor 10K. Now vary the input voltage between -5 V and 5 V, and calculate what happens at the output (the node where the three resistors meet). Do you see a way to get the voltage offset and scaling that is asked for in the problem? Keep in mind that you can set different voltages on the top and bottom of the voltage divider if that helps. Just write some simultaneous equations....
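A quick sketch of the calculation the hint asks for, using superposition (Millman's theorem) with the example values above. Note the 10 V / -10 V rails and 10K/20K/10K resistors are just the starting point from the hint, not the final answer; with these values the output lands at 0 V, 2 V, and 4 V for inputs of -5 V, 0 V, and 5 V, so the ratios still need adjusting to hit the 0-to-2.5 V target:

```python
# Node voltage of the 3-resistor network via Millman's theorem:
#   V_out = sum(V_i / R_i) / sum(1 / R_i)
# over the three legs (top rail, bottom rail, input source).

def node_voltage(v_in, v_top=10.0, v_bot=-10.0,
                 r_top=10e3, r_bot=20e3, r_in=10e3):
    """Output node voltage for the hint's example values (ideal source/load)."""
    num = v_top / r_top + v_bot / r_bot + v_in / r_in
    den = 1 / r_top + 1 / r_bot + 1 / r_in
    return num / den

for v_in in (-5.0, 0.0, 5.0):
    print(f"V_in = {v_in:+.1f} V  ->  V_out = {node_voltage(v_in):.2f} V")
# V_in = -5 V -> 0.00 V,  V_in = 0 V -> 2.00 V,  V_in = +5 V -> 4.00 V
```

Working the same formula backward (set up two equations from the two endpoint conditions) gives the resistor ratios and rail voltages the problem actually wants.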

Last edited: Jan 30, 2006
4. Jan 30, 2006

### Staff: Mentor

BTW, in a real-world application, you would need to consider the source impedance and the input impedance of whatever circuit comes after the voltage biasing network. In the case of this homework problem, you can probably assume that the output impedance of the source voltage is negligibly low, and the input impedance of whatever is using the output voltage is very high.
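To see why that assumption matters, here is a hedged sketch (the 1K source impedance is a made-up illustrative value, not from the problem): a nonzero source impedance simply adds in series with the input resistor, which changes both the gain and the offset of the network.

```python
# Effect of a nonzero source impedance R_s on the 3-resistor bias network:
# R_s folds into the input leg, so the input resistor effectively grows.

def node_voltage_with_source(v_in, r_s=0.0, v_top=10.0, v_bot=-10.0,
                             r_top=10e3, r_bot=20e3, r_in=10e3):
    r_in_eff = r_in + r_s  # source impedance in series with the input resistor
    num = v_top / r_top + v_bot / r_bot + v_in / r_in_eff
    den = 1 / r_top + 1 / r_bot + 1 / r_in_eff
    return num / den

# Compare an ideal (0-ohm) source against a hypothetical 1K source at V_in = +5 V:
ideal = node_voltage_with_source(5.0, r_s=0.0)
loaded = node_voltage_with_source(5.0, r_s=1e3)
print(f"ideal source: {ideal:.3f} V, 1K source: {loaded:.3f} V")
```

A high-impedance load can be modeled the same way, as a fourth resistor from the output node to ground; if it is large compared to the divider resistors, its effect is negligible, which is exactly the assumption being made here.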