hihiip201 said:
But other than that, there's really no limit to how far I can push a power source, right? (assuming I take off all the things like the current regulator that prevent me from melting the parts)
There are many factors that can limit a power source. For example, consider a simple DC power supply that runs off AC mains: it might consist of a bridge rectifier, a filter circuit, and a voltage regulator. The main job of the regulator is not to prevent "melting" but to prevent overloading the supply. Essentially, if you draw current at too great a rate, the filter circuit can't keep up and you don't get the nice smooth DC you probably want.
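To put a rough number on that: for a full-wave rectifier feeding a reservoir capacitor, the peak-to-peak ripple is roughly $\Delta V \approx I_{load} / (f_{ripple} \, C)$. Here's a quick Python sketch; the mains frequency and capacitor value are made up for illustration:

```python
# Rough ripple estimate for a full-wave rectifier with a reservoir
# capacitor: delta_V ~= I_load / (f_ripple * C).
# Component values below are hypothetical, just for illustration.
f_mains = 60.0          # Hz, mains frequency (assumed)
f_ripple = 2 * f_mains  # full-wave rectification doubles the ripple frequency
C = 4700e-6             # F, reservoir capacitor (assumed)

for i_load in (0.1, 0.5, 1.0, 2.0):  # load current in amps
    ripple = i_load / (f_ripple * C)
    print(f"I = {i_load:.1f} A -> ripple ~ {ripple:.2f} V peak-to-peak")
```

The more current you pull, the bigger the ripple, until the "DC" out of the filter is anything but smooth.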
Another thing to consider is that there is no such thing as an ideal power supply (one that can maintain a constant voltage at any current level). Any real power supply behaves more like an ideal source in series with its own internal resistance.
Thus the load and the internal resistance act as a voltage divider. The greater the current drawn by the load, the larger the voltage drop across the internal resistance, which leaves less voltage for the load. In addition, a larger and larger proportion of the available power is dissipated in the internal resistance.
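Writing the source voltage as $V_s$, the internal resistance as $R_s$, and the load as $R_L$ (labels introduced here just for the example), the divider gives:

$$V_L = V_s \frac{R_L}{R_s + R_L}, \qquad P_L = \frac{V_L^2}{R_L} = V_s^2 \frac{R_L}{(R_s + R_L)^2}$$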
For example, assume a 100 V source with a 100 Ω internal resistance.
Start with a 1000 Ω load: it sees 90.9 V and 8.26 W.
Decrease the load to 100 Ω and it sees 50 V and 25 W, an increase in power to the load.
However, decrease it again to 10 Ω and it sees 9.09 V and only 8.26 W, down from the 100 Ω case.
Decreasing again to 1 Ω gives 0.99 V and 0.98 W.
In essence, once your load tries to draw current at a rate that lowers its effective resistance below the internal resistance of the source, you actually see a decrease in power delivered to the load.
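Here's a quick Python sketch that reproduces those numbers; nothing is assumed beyond the 100 V source and 100 Ω internal resistance above:

```python
# Sweep the load resistance on a 100 V source with 100 ohm internal
# resistance and watch the power delivered to the load peak at R_L = R_s.
V_s = 100.0   # source voltage, V
R_s = 100.0   # internal resistance, ohms

for R_L in (1000.0, 100.0, 10.0, 1.0):
    V_L = V_s * R_L / (R_s + R_L)   # voltage divider
    P_L = V_L ** 2 / R_L            # power dissipated in the load
    print(f"R_L = {R_L:6.0f} ohm -> V_L = {V_L:6.2f} V, P_L = {P_L:5.2f} W")
```

Notice the power to the load peaks at 25 W exactly where the load resistance equals the internal resistance.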
Thus the maximum power is delivered to a load when its resistance matches that of the source (the maximum power transfer theorem).
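You can check this by maximizing $P_L$ from the divider formula above with respect to $R_L$:

$$\frac{dP_L}{dR_L} = V_s^2 \, \frac{(R_s + R_L) - 2R_L}{(R_s + R_L)^3} = V_s^2 \, \frac{R_s - R_L}{(R_s + R_L)^3} = 0 \quad \Longrightarrow \quad R_L = R_s$$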