# Attenuating a Signal

## Main Question or Discussion Point

I've got a signal coming in that can theoretically reach a maximum of 400 V at around 5 A. This is not an AC signal, it's DC.

However, I need to attenuate this signal down to a maximum of ±10 V. What steps do I need to take? How do I go about this? And how do I minimize loss of signal fidelity?


**berkeman** (Mentor)
> I've got a signal coming in that can theoretically reach a maximum of 400 V at around 5 A. This is not an AC signal, it's DC.
>
> However, I need to attenuate this signal down to a maximum of ±10 V. What steps do I need to take? How do I go about this? And how do I minimize loss of signal fidelity?
Can you say what the source of this "signal" is? That's an awful lot of power for a "signal".

If you want to just monitor the voltage, use a voltage divider as gnurf suggests. The 5A does not go through the voltage divider, though. Only a small current will go through the divider.
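
As a sketch of the monitoring case, a 40:1 resistive divider maps the 400 V maximum onto 10 V while drawing almost no current. The resistor values below are illustrative assumptions, not values from the thread:

```python
# Illustrative 40:1 resistive divider: Vout = Vin * R2 / (R1 + R2)
R1 = 390_000  # ohms, top resistor (example value)
R2 = 10_000   # ohms, bottom resistor (example value)

def divider_out(vin, r1=R1, r2=R2):
    """Voltage at the divider tap for a given input voltage."""
    return vin * r2 / (r1 + r2)

v_max = divider_out(400)        # 10.0 V at the 400 V maximum
i_divider = 400 / (R1 + R2)     # only 0.001 A (1 mA) flows in the divider
```

The high total resistance is the point: the divider loads the source with only a milliamp, so the 5 A of the source never flows through it.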

If you need to know both the voltage and the current, then you would use a voltage divider to measure the voltage, and a current sensor for the current (for DC you need to use a Hall-effect probe, or a very low-value "shunt" series resistor across which you measure the differential voltage).
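
To size a hypothetical shunt for the 5 A case (the shunt value is an illustrative assumption, not from the thread):

```python
# Sizing a low-side shunt for a 5 A DC measurement (illustrative values)
R_shunt = 0.01   # ohms; a 10 milliohm shunt keeps the voltage drop small
I_max = 5.0      # amperes, the stated maximum current

V_sense = I_max * R_shunt      # 0.05 V differential across the shunt
P_shunt = I_max**2 * R_shunt   # 0.25 W dissipated at full current
```

The trade-off is the usual one: a smaller shunt wastes less power and disturbs the circuit less, but leaves a smaller differential voltage to amplify.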

**vk6kro**
(Broken image link: a simulated level-shift schematic, originally at http://dl.dropbox.com/u/4222062/level%20shift.PNG)

This is a simulation, but it may give you a starting point. You may have to devise a voltage divider to give you +13 volts. I'd make it variable so that you can adjust the output level.

Also, R4 may need to be varied a little to get a suitable amplitude in the output signal.

I have split up the input resistor into R3, R2, R1 and R7 because of the large and dangerous input voltage.
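
The reasoning behind splitting the input resistor can be made concrete. Since the schematic image is broken, the values below are illustrative assumptions: several resistors in series share the voltage stress and the dissipation, so no single part sees the full 400 V.

```python
# Why split one input resistor into several in series (illustrative values):
# each part sees only a fraction of the voltage and of the power.
V_in = 400.0
R_total = 400_000.0   # example total input resistance
n = 4                 # split into four parts, as R3, R2, R1, R7 in the schematic

P_total = V_in**2 / R_total   # 0.4 W total in the input string
V_per_part = V_in / n         # 100 V across each resistor
P_per_part = P_total / n      # 0.1 W in each resistor
```

Ordinary resistors are often rated for only 200-250 V across a single part, so the series split is a safety measure as much as a power one.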

> I've got a signal coming in that can theoretically reach a maximum of 400 V at around 5 A. This is not an AC signal, it's DC.
>
> However, I need to attenuate this signal down to a maximum of ±10 V. What steps do I need to take? How do I go about this? And how do I minimize loss of signal fidelity?
If you want to step down the voltage without stepping down the power capability, that could be hard. A DC-to-DC switching converter is the common approach, but at that power it is going to be physically big; I've never seen one off the shelf at that kind of power. You are talking about 200 A output at 10 V.

If you just want to divide the voltage down, then it's easy, like everybody said.

**vk6kro**
Not all that easy. The post asks for a conversion of 0 to 400 volts to -10 to +10 volts.

So, you couldn't do that with a voltage divider.
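
To make this concrete: mapping 0-400 V onto -10 to +10 V needs both a 1/20 scale factor and a -10 V offset. A plain divider provides only the scale; the offset needs an active stage or a reference, as the schematic above attempts. A sketch of the required transfer function, assuming it is linear:

```python
# Mapping 0..400 V onto -10..+10 V: scale by 1/20, then shift down by 10 V.
# A plain resistive divider only scales; the offset needs an active stage.
def level_shift(vin):
    return vin / 20.0 - 10.0

level_shift(0)    # -10.0 V
level_shift(200)  # 0.0 V
level_shift(400)  # +10.0 V
```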

> I need to attenuate this signal down to a maximum of ±10 V.
I read that as "a maximum of approx. 10VDC", but who knows what he meant. Of course, it would help if the OP could make the effort to respond to questions that arise in his thread. The hivemind needs information to function!

> Not all that easy. The post asks for a conversion of 0 to 400 volts to -10 to +10 volts.
>
> So, you couldn't do that with a voltage divider.
If no high current is required, I would consider it easy. Getting hundreds of amperes, that is hard. I don't think you can get one off the shelf.

Couldn't you set up a follower circuit with a precision voltage reference for an offset, then a second op-amp stage to get the range you want?

I'm too lazy to go any further than that, but in addition, I'd maybe consider some kind of voltage clamp on the power rails and signal output as fubar protection. Plus tweaker pots and whatever else you need to get the desired accuracy.

**berkeman** (Mentor)
> The hivemind needs information to function!
Interesting term!

**sophiecentaur** (Gold Member)
> Couldn't you set up a follower circuit with a precision voltage reference for an offset, then a second op-amp stage to get the range you want?
>
> I'm too lazy to go any further than that, but in addition, I'd maybe consider some kind of voltage clamp on the power rails and signal output as fubar protection. Plus tweaker pots and whatever else you need to get the desired accuracy.
Your "follower circuit" would need to be pretty hefty to drop 390 V at 5 A. I don't think a simple DC-based converter would be suitable. You need a switch-mode circuit to limit the power loss.
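
The power argument can be checked with the thread's own numbers: a linear (series-pass) element dropping 390 V at 5 A turns almost 2 kW into heat.

```python
# Why a linear series-pass element is impractical at full load:
# the dropped voltage times the load current all becomes waste heat.
V_drop = 390.0   # volts across the pass device (400 V in, 10 V out)
I_load = 5.0     # amperes of load current

P_dissipated = V_drop * I_load   # 1950 W of heat in the pass element
```

This is why a switch-mode converter, which transfers rather than burns the excess power, is the only realistic option if the full current must be delivered.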

> Your "follower circuit" would need to be pretty hefty to drop 390 V at 5 A. I don't think a simple DC-based converter would be suitable. You need a switch-mode circuit to limit the power loss.
No, I mean after the voltage divider, to re-reference it, so that 0 V would now map to -10 V and your maximum to +10 V.

As long as you are not trying to get high power out of it, it shouldn't be too hard. The OP needs to give more specific requirements. It can be as easy as using a voltage divider and a positive-to-negative level shifter (not efficient using a voltage divider, because the resistors have to dissipate the power).

You need clear requirements, or else the discussion is all over the place. As I said, if you want high power output, you are going to have to design a custom DC-to-DC converter, and that is hard. Anything here is just speculation.