Ok so I recently learned what transistors actually do when used as amplifiers (I always knew they acted as a sort of switch). Now my question is: how does this happen? Is it true that if I hook up a 9V battery to the input side (let's say it puts out 0.1A due to resistance in the transistor and circuit, that's a total of 0.9W), and hook up a 12V battery to the output side of the transistor (which in its circuit would normally have a current of 0.05A, for a power output of 0.6W), then the input will force the output circuit's current to 0.1A, yielding a power output of 1.2W instead of 0.6W on the output circuit? (Hopefully you can keep track of my explanation above.) Not to mention that you can do it with voltages where output V >> input V and input A >> output A, true? It just seems like I'm not getting something here, since you're technically getting power out of nothing. Any help on what I'm missing would be great. Thanks guys!
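To keep the numbers straight, here's a minimal sketch of the power arithmetic in the scenario above (just P = V × I; all values are the hypothetical ones from the question):

```python
# Power arithmetic for the hypothetical amplifier scenario (P = V * I).
v_in, i_in = 9.0, 0.1      # input-side battery voltage and current
v_out = 12.0               # output-side battery voltage
i_out_normal = 0.05        # output current without the input driving it
i_out_driven = 0.1         # output current if forced to match the input current

p_in = v_in * i_in                   # 0.9 W on the input side
p_out_normal = v_out * i_out_normal  # 0.6 W
p_out_driven = v_out * i_out_driven  # 1.2 W

print(p_in, p_out_normal, p_out_driven)
```

(For what it's worth, in a real amplifier the extra output power is drawn from the output-side supply, the 12V battery here, rather than being created by the transistor itself.)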