# OPAMP as integrator

1. Jan 11, 2010

### Mr confusion

hi friends,
I cannot understand the derivation of the op-amp as an integrator. My book suddenly replaced a term 1/s (where s = jω) with an integral over time, without any explanation.
Another thing: if an op-amp can act as an integrator, then why is it not THE BEST AMPLIFIER ON EARTH? Because as you let time pass, the output voltage goes on increasing without limit...??
thanks.

2. Jan 11, 2010

### vk6kro

> another thing is that if OPAMP can act as integrator, then why is it not THE BEST AMPLIFIER ON EARTH? because as you let time pass by, the output voltage goes on increasing without limit.......??

Op-amps can only produce output voltages that lie between their supply voltages.

They can't generate extra voltage.

So if an op-amp has supply voltages of +15 volts and -15 volts, it can produce +5 volts or -9 volts, but not +22 volts or -16 volts, even though the formulas might suggest it can.
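vk6kro's point can be sketched in a few lines of code; the ±15 V rails below are assumed example values, not anything specific to a particular op-amp:

```python
# Minimal sketch: a real op-amp output is clamped to its supply rails,
# whatever the ideal gain formula predicts. Rails are assumed +/-15 V.
V_POS, V_NEG = 15.0, -15.0

def clamp_to_rails(v_ideal):
    """Return the actual output: the ideal value limited to the rails."""
    return max(V_NEG, min(V_POS, v_ideal))

print(clamp_to_rails(5.0))    # within range -> 5.0
print(clamp_to_rails(22.0))   # formula says +22 V, output saturates at +15.0
print(clamp_to_rails(-16.0))  # saturates at -15.0
```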

3. Jan 11, 2010

### dlgoff

I think your book is probably using a [Laplace transform](http://en.wikipedia.org/wiki/Laplace_transform#Properties_and_theorems) to analyze your integrator.
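The step the book has likely skipped is the inverse Laplace transform: dividing by s in the s-domain corresponds to integrating in the time domain. For the standard inverting RC integrator (this is the usual textbook form, assuming an ideal op-amp and zero initial capacitor charge):

```latex
V_{out}(s) = -\frac{1}{sRC}\,V_{in}(s)
\quad\Longleftrightarrow\quad
v_{out}(t) = -\frac{1}{RC}\int_0^t v_{in}(\tau)\,d\tau
```

So the 1/s factor (with s = jω in sinusoidal steady state) is exactly the frequency-domain counterpart of the time-domain integral.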

4. Jan 12, 2010

### davidrit

As vk6kro said, the output will at most go to the power supply rails of the opamp.

In practical integrators you either:

- include a resistor in parallel with the feedback capacitor, making it a "leaky integrator", or
- place a switch across the feedback capacitor and periodically reset the integrator to its reference level.
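A quick numerical sketch (with made-up component values) shows why the leaky version is practical: a small DC input ramps the ideal integrator without bound, while the parallel resistor Rf makes the output settle at a finite value of about −(Rf/R)·Vin:

```python
# Sketch comparing an ideal integrator with a "leaky" one, using
# hypothetical component values. Ideal: dVout/dt = -Vin/(R*C).
# Leaky: the parallel feedback resistor Rf adds a decay -Vout/(Rf*C).
R, C, Rf = 10e3, 1e-6, 100e3   # assumed values: 10k, 1uF, 100k
dt, steps = 1e-4, 10000        # simulate 1 second with forward Euler
vin = 0.1                      # constant 0.1 V input (e.g. a DC offset)

v_ideal = v_leaky = 0.0
for _ in range(steps):
    v_ideal += dt * (-vin / (R * C))
    v_leaky += dt * (-vin / (R * C) - v_leaky / (Rf * C))

print(round(v_ideal, 2))  # ramps without bound: about -10 V after 1 s
print(round(v_leaky, 2))  # settles near -(Rf/R)*vin = -1 V
```

In a real circuit the ideal integrator's -10 V/s ramp would, of course, stop at the negative rail; the leaky one stays well inside the rails.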

5. Jan 12, 2010

### Mr confusion

thank you so much, friends. Yes, I realized it when I did the lab today.
I still cannot understand the derivation, though. But OK, I will ask my instructor what these Laplace transforms are...

6. Jan 12, 2010

### sophiecentaur

"The best amplifier in the world" would need to have a specified gain. The output voltage from an integrator will, as you say, keep increasing until it hits the + or - limit. That's usually just not what you want an amplifier to do. It doesn't have a static 'gain' as such, because time comes into the result it produces.
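To make this concrete: for the standard inverting RC integrator, the gain magnitude for a sinusoid at angular frequency ω is

```latex
|H(j\omega)| = \left|\frac{-1}{j\omega RC}\right| = \frac{1}{\omega RC}
```

so the "gain" depends on frequency rather than being a fixed number, and it grows without bound as ω → 0, which is why any DC offset at the input eventually drives the output into a rail.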

7. Jan 12, 2010

### Mr confusion

sophiecentaur, thank you.
Now, what then is the best amplifier? Is it the CE amplifier? What qualities does a good amplifier need to have? Is it constant gain?
Today our lab instructor told us that an inverting amplifier is better than a non-inverting one, so I thought there must be something besides voltage gain by which the quality of an amplifier is judged?
Incidentally, they told us to keep terminals 1, 5, and 8 of the IC 741 open. But then what are those pins there for?

8. Jan 12, 2010

### davidrit

Research "ideal op-amp". Your textbook probably has information.

9. Jan 12, 2010

### sophiecentaur

> inverting amplifier is better than non-inverting ones

It is not necessarily "better" if you happen to want the same polarity for your output as for the input! ;-)

You could say that the inversion makes the inverting amplifier potentially more useful, though.

The quality of an amplifier could mean how accurately its output reproduces the input signal (linearity), how little noise it introduces, how much power it can deliver, or how much gain it has. It depends - like when a woman chooses a handbag.
*Ducks to avoid flying bricks.*

10. Jan 13, 2010

### Bob S

Inverting is better because:
1) The inverting configuration (with the + input tied to ground through an input-bias-current offset resistor) eliminates common-mode voltage offset nonlinearities.
2) It provides a good summing junction for signals from multiple voltage sources (through series resistors) and from multiple current sources.
3) It minimizes leakage currents to the inverting input on the surface of the PC board.
4) It is easier to place a guard ring around the inverting input on the PC board.

Bob S

11. Jan 13, 2010

### sophiecentaur

And if you want a non-inverting amplifier, all you need to do is use two inverting amplifiers in series. Two wrongs can make a right! But two rights can't make a wrong.

12. Jan 14, 2010

### Mr confusion

thank you, friends.