# Drawing voltage of a circuit in parallel

## Main Question or Discussion Point

I have created two circuits that run off the voltage produced by a proximity probe (each circuit has its own power supply). One takes the voltage directly and shows it on a digital display; the other needs the voltage stepped down 5:1 so it can be fed into a programmable chip that gives different outputs for different voltages. Each circuit works fine on its own, but I cannot combine them so that both run at the same time. How should I set this up? My parallel arrangement does not seem to work, as it lowers the voltages.

thanks

Redbelly98
Staff Emeritus
Homework Helper
How much of a voltage drop did you get? It sounds like your voltage source does not have enough current capacity to run both circuits.

As a test, can you measure (or estimate) the current used by each circuit separately? Then test the voltage source with an appropriately sized resistor (one that would draw the combined current of the two circuits) and see how much voltage drop you get.

Make sure the resistor's power rating exceeds the power it will actually dissipate.

Another question: how are you stepping down the voltage? Is this AC or DC? Hopefully you are not using a voltage divider to step down a DC voltage, but if you are then that is the problem.
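To see why a resistive divider is a poor way to step down a voltage that feeds another circuit: the load appears in parallel with the bottom resistor, so the output sags. A quick sketch with assumed values (a nominal 5:1 divider, 10 V in):

```python
def divider_out(v_in, r_top, r_bot, r_load=float('inf')):
    """Output of a voltage divider, optionally loaded across r_bot."""
    if r_load == float('inf'):
        r_eff = r_bot
    else:
        # load resistance in parallel with the bottom resistor
        r_eff = (r_bot * r_load) / (r_bot + r_load)
    return v_in * r_eff / (r_top + r_eff)

v_ideal  = divider_out(10.0, 4000.0, 1000.0)           # unloaded 5:1 -> 2.0 V
v_loaded = divider_out(10.0, 4000.0, 1000.0, 1000.0)   # 1 k load drags it down
```

With a 1 k load the effective bottom resistance halves and the "5:1" divider no longer delivers one fifth of the input, which matches the symptom described in the question.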

Redbelly98
Staff Emeritus
Homework Helper

Is your proximity probe source actually powering the two circuits, or do the circuits have separate power supplies and merely measure the voltage out of the probe?

What is the voltage, both open-circuit and under load? AC or DC?

Thanks for the help. I solved the problem by massively increasing the values of the resistors I was taking the voltage off, so I am no longer getting a voltage drop.
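The fix reported above can be read in Thevenin terms: the probe behaves like an ideal source behind some output resistance, and a larger load resistance draws less current, so less voltage is lost internally. A sketch with assumed values (5 V open-circuit, 1 k output resistance):

```python
def loaded_voltage(v_oc, r_src, r_load):
    """Terminal voltage of a source v_oc with output resistance r_src
    driving a resistive load r_load (a simple voltage-divider relation)."""
    return v_oc * r_load / (r_src + r_load)

low  = loaded_voltage(5.0, 1000.0, 1_000.0)    # 2.5 V: 50% of the voltage lost
high = loaded_voltage(5.0, 1000.0, 100_000.0)  # ~4.95 V: under 1% droop
```

Raising the load resistance from 1 k to 100 k shrinks the droop from half the source voltage to under one percent, which is why increasing the resistor values restored the expected readings.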
