david90
How do I use a voltage divider to bias a signal that goes from -5V to 5V so that it goes from 0V to 2.5V?
The discussion revolves around using a voltage divider to DC-bias a signal that swings from -5V to 5V, shifting and scaling it into the 0V to 2.5V range. The conversation covers both theoretical considerations and practical implications of using voltage dividers for this purpose.
Participants disagree about whether a voltage divider alone can achieve the desired biasing. Some argue it is impossible without additional circuitry, while others propose a method that may accomplish it.
Participants also note that real-world source and input impedances can affect the validity of the proposed solutions; for this specific homework problem, the source's output impedance is assumed negligible and the load's input impedance high.
chroot said:
> It's not possible. A voltage divider can only change a signal's amplitude; it cannot change its mean value.
> You're hoping to put in a signal of 0V (the midpoint of -5 to 5V) and get out a signal of 1.25V. That'd be quite a feat for nothing but a couple of resistors.
> You'll need to use an op-amp, or at least capacitively couple your input to a bias network (perhaps made out of a voltage divider).
> - Warren

No, I think it can be done in this case. But since this sounds like a homework problem, I'll start by only giving a couple of hints to David.
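The thread stops at hints, but the claim that it can be done rests on a resistive network summing the signal against a fixed supply rail, which both attenuates and offsets. As one illustrative realization (not spelled out in the thread, and assuming a +5V supply is available, the source impedance is negligible, and the load impedance is high), R1 = 4kΩ from the signal, R2 = 4kΩ from +5V, and R3 = 2kΩ to ground, all joined at the output node, give the required mapping. A quick superposition check in Python:

```python
# Nodal-analysis check of a three-resistor bias network (illustrative
# values, not from the thread):
#   Vin --R1--+--R2-- +5V,  with R3 from the output node (+) to ground.
# The output is the voltage at the junction node.

def divider_out(vin, vs=5.0, r1=4e3, r2=4e3, r3=2e3):
    """Node voltage: sum of currents into the node equals zero."""
    g1, g2, g3 = 1 / r1, 1 / r2, 1 / r3
    return (vin * g1 + vs * g2) / (g1 + g2 + g3)

for vin in (-5.0, 0.0, 5.0):
    print(f"Vin = {vin:+.1f} V  ->  Vout = {divider_out(vin):.3f} V")
# Maps -5 V -> 0 V, 0 V -> 1.25 V, +5 V -> 2.5 V
```

Note this is consistent with chroot's objection: the DC shift is supplied by the +5V rail through R2, not by resistors acting on the signal alone.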