Not another "How does a transistor amplify" thread

I used to think solid-state components such as transistors amplified signals. That may be loosely correct, but it definitely needs elaborating: transistors themselves do NOT amplify signals. My understanding is that a transistor takes an input voltage and uses it to draw on the supply voltage.

Assume the following: you want to amplify a guitar signal to drive a pair of loudspeakers, you're using a standard 9V battery, and the output voltage from the guitar is 1V. So how do you amplify (increase) the voltage? Simple: take advantage of the 9V supply and add the input to it. 9V + 1V = 10V, and you've just amplified the signal. Is this correct?

What if your input voltage exceeds your supply voltage? Say the guitar signal is 10V and you're still using the 9V battery. Do the voltages still add up, or has the transistor reached saturation? Voltage adds up in series, so what's stopping it from doing so in this application? Is it impossible to run a guitar signal in series with an amplifier?

The reason I ask is that I just built an LM386 amplifier. I'm near positive that the 3.5mm jack from my laptop is what they call line level, so the voltage is probably high. I assume that voltage exceeds the voltage from the 9V battery powering my LM386 ...and yet it amplifies.
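To make the saturation question concrete, here's a toy sketch (not the actual transistor physics) of an idealized amplifier whose output is simply the input times a gain, clipped at the supply rails. The gain of 20 is the LM386's default; the exact clipping level of a real LM386 sits somewhat below the rails, so treating the 9V supply as the hard limit is an assumption for illustration only:

```python
def amplify(v_in, gain=20.0, v_supply=9.0):
    """Idealized amplifier: output = gain * input, clipped to the supply.

    gain=20 is the LM386's default gain; v_supply=9.0 models a 9V battery.
    A real amplifier clips a bit below the rails, but the idea is the same:
    the supply sets the ceiling, no matter how large the input gets.
    """
    v_out = gain * v_in
    # Saturation: the output can never exceed what the supply provides.
    return max(-v_supply, min(v_supply, v_out))

# A small 0.25V input is amplified cleanly to 5V.
print(amplify(0.25))   # 5.0

# A 1V line-level input "wants" to become 20V, but the 9V supply
# saturates (clips) the output at 9V.
print(amplify(1.0))    # 9.0
```

This is why the amplifier still works with a line-level input: the signal doesn't add to the battery voltage, it controls how much of the battery's voltage appears at the output, and anything beyond the supply's reach is clipped off.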