Voltage divider problem

  1. Mar 6, 2006 #1
    I designed a circuit that has an analog signal conditioning portion, and a digital circuit to make some measurements on the signal.
    The analog circuit is powered by 15V supplies, so its output is a 0-15V square wave. The digital circuit needs a 0-5V input. I tried using a voltage divider on the output of the analog circuit to bring it down to a 0-5V square wave, which worked fine on its own. But when I connected the output of the voltage divider to the digital device, the signal was offset. The offset was high enough that the digital device read a constant high instead of a changing signal.
    I used separate power and ground connections for the analog circuits.
    Any ideas on how to fix this?
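    For reference, a quick sketch of the unloaded divider design (the resistor values here are hypothetical, not from the thread; any pair with a 2:1 ratio gives the same unloaded output):

    ```python
    def divider_out(vin, r1, r2):
        """Unloaded voltage-divider output: Vout = Vin * R2 / (R1 + R2)."""
        return vin * r2 / (r1 + r2)

    # 20k over 10k divides 15 V down to 5 V (no load attached).
    print(divider_out(15.0, 20e3, 10e3))  # 5.0
    ```

    The catch, as the replies below get at, is that this 5 V only holds when nothing draws or injects current at the tap.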
  3. Mar 6, 2006 #2


    Staff: Mentor

    Have you figured out the source of the offsets yet? Look at the input current specifications for the digital logic, and the output drive specs for the previous analog buffer stage.

    Instead of a voltage divider, you could instead just use a clipper circuit. A resistor in and a diode clamp to 5V will get you a maximum of 5.6V or so. Just be sure that the digital logic input isn't inclined to latch up with the extra 0.6V input for the high signal.
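    The clipper idea above can be sketched numerically; the 0.6 V figure is one silicon-diode forward drop above the 5 V rail (the function below is just an idealized model, not a circuit simulation):

    ```python
    def clipped(vin, v_rail=5.0, v_f=0.6):
        """Series resistor + diode clamp to the rail: the node cannot rise
        more than one forward drop (v_f) above the rail."""
        return min(vin, v_rail + v_f)

    print(clipped(15.0))  # 5.6 -> the "5.6V or so" maximum mentioned above
    print(clipped(3.0))   # 3.0 -> signals below the clamp pass unchanged
    ```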

    You could also use something like an open-collector stage to convert from the 15V domain to the 5V domain. You'll get an inversion out of an open-collector stage, though, so keep that in mind.
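    The inversion from an open-collector stage can be seen in a simple model (idealized: transistor fully on pulls the output to ground, transistor off lets the pull-up resistor take it to the 5 V rail):

    ```python
    def open_collector_out(input_high, v_pullup=5.0):
        """Open-collector level shifter: a high input turns the transistor on,
        pulling the output low; a low input leaves the pull-up at the rail."""
        return 0.0 if input_high else v_pullup

    print(open_collector_out(True))   # 0.0 -> 15 V in gives 0 V out
    print(open_collector_out(False))  # 5.0 -> 0 V in gives 5 V out
    ```

    So the 0-15V square wave comes out as an inverted 0-5V square wave, which the digital side can undo in software or with another inverting stage.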
  4. Mar 6, 2006 #3


    Staff Emeritus
    Science Advisor
    Gold Member

    You might need to buffer the output of the voltage divider, or use larger-value resistors.

    - Warren
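    Why the divider misbehaves under load can be made concrete with the divider's Thevenin equivalent: the digital input sees a source resistance of R1 || R2, and any current the input sources or sinks develops an offset across it. The values here are illustrative assumptions, not measurements from the original circuit:

    ```python
    def thevenin(vin, r1, r2):
        """Thevenin equivalent of a divider: open-circuit voltage and
        source resistance R1 || R2 seen by whatever is connected to the tap."""
        v_th = vin * r2 / (r1 + r2)
        r_th = r1 * r2 / (r1 + r2)
        return v_th, r_th

    def node_voltage(vin, r1, r2, i_in):
        """Divider output when the connected input sources current i_in
        into the node (i_in < 0 means the input sinks current)."""
        v_th, r_th = thevenin(vin, r1, r2)
        return v_th + i_in * r_th

    # A high-value 200k/100k divider with an input sourcing a hypothetical
    # 10 uA: the ~67k source resistance turns that into a ~0.67 V offset.
    print(node_voltage(15.0, 200e3, 100e3, 10e-6))
    ```

    A buffer (e.g. an op-amp follower) presents a near-zero source resistance to the digital input, so the same input current produces a negligible offset; that is one way to read the suggestion above.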