Jdo300
Hello All,
I'm working with a microcontroller and want to take analog measurements using its built-in ADC. However, I want to be able to measure voltages much higher than just 0-5V. My *seemingly* simple solution was to grab a 500k 25-turn potentiometer and wire it up as a simple, adjustable voltage divider feeding the controller's input. I then put a 5.1V Zener diode across the divider's output to ensure that the input voltage would never exceed the IC's 5V input limit. I also added a series resistor of about 1k on the HV side to limit the current through the Zener in case the pot was ever adjusted to the top of the scale and the Zener broke down (schematic attached below).
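For reference, here's the ideal behavior I was expecting from the circuit, as a quick sketch (the 375k/125k split is just my assumed wiper position for a 4:1 ratio on a 500k pot; the 1k series resistor is negligible next to those values, so I left it out):

```python
def divider_out(v_in, r_top, r_bottom, v_clamp=5.1):
    """Ideal model of the pot-as-divider with a Zener clamp:
    the wiper voltage, limited to the Zener's breakdown voltage."""
    v_div = v_in * r_bottom / (r_top + r_bottom)
    return min(v_div, v_clamp)

# Pot dialed so 20V in gives 5V out (4:1 ratio on a 500k pot)
print(divider_out(20.0, 375e3, 125e3))  # ideally 5.0
print(divider_out(10.0, 375e3, 125e3))  # ideally 2.5 -- not what I measure
```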
At first, this solution seemed to work well, until it came time to calibrate the divider circuit. I first wanted to set it up to handle 20V on the input, so I set a digital power supply to 20V and adjusted the pot for 5V on the low-voltage side. But when I dropped the supply to 10V, the output voltage did not scale linearly.
My initial thought was that the Zener was causing the problem, since I was operating it too close to its breakdown voltage. So I set the power supply to 10V and recalibrated the output to read 2.5V. This worked slightly better, but the setup would still go nonlinear if I went more than 5V above or below that value. I'm wondering whether the Zener diode is the root cause, or whether the pot itself is nonlinear over a range of input voltages?
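To put a rough number on the Zener suspicion: the divider's source impedance is large enough that even microamp-scale leakage pulled from the wiper would shift the reading noticeably, and leakage that grows as the voltage nears breakdown would look exactly like this kind of nonlinearity. A back-of-envelope sketch, with all values assumed:

```python
def wiper_thevenin(r_pot, frac):
    """Thevenin source resistance seen at the wiper of a pot whose
    segments are r_pot*frac (bottom) and r_pot*(1-frac) (top)."""
    r_top = r_pot * (1 - frac)
    r_bot = r_pot * frac
    return (r_top * r_bot) / (r_top + r_bot)

R_POT = 500e3   # 500k pot
FRAC = 0.25     # wiper set for a 4:1 division (20V -> 5V)
r_th = wiper_thevenin(R_POT, FRAC)   # ~94k source resistance

# Voltage error from a given leakage current drawn off the wiper
for i_leak in (0.1e-6, 1e-6, 5e-6):  # assumed Zener/ADC leakage levels
    print(f"{i_leak * 1e6:.1f} uA leakage -> {i_leak * r_th:.3f} V error")
```

So at this impedance, a few microamps of pre-breakdown leakage is already a few tenths of a volt of error, which matches the scale of what I'm seeing.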
Assuming the diode is at fault, I could just remove it, but I've already had other *accidents* when I didn't have some sort of protection circuit. Ultimately I would like to be able to safely measure up to about 50V, so are there any more elegant solutions to this problem than my current setup?
Thanks,
Jason O