alexmath said:
For example: Let's say a manufacturer creates a device which runs at 120 W (let's say a lightbulb). Does this mean that I can use whichever combination of voltage and current I want, as long as V*I = 120 W?
No. The filament of the lightbulb has a set resistance. This means that there is only one voltage that gives you the right current so that V*I = 120 W.
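Here's a minimal sketch of that single operating point, assuming a hypothetical hot-filament resistance of 120 ohms (an illustrative number, not taken from any real bulb). With R fixed, only one applied voltage makes V*I come out to 120 W; any other voltage gives a different power, not a different split of the same power.

```python
import math

# Minimal sketch: a 120 W bulb with a fixed, hypothetical hot-filament resistance.
P_rated = 120.0   # watts
R = 120.0         # ohms, assumed hot-filament resistance (illustrative)

# P = V^2 / R, so the single voltage that gives the rated power is:
V_rated = math.sqrt(P_rated * R)   # 120 V
I_rated = V_rated / R              # 1 A
print(f"Only operating point: V = {V_rated:.0f} V, I = {I_rated:.2f} A")

# Any other applied voltage changes the power, because R stays the same:
for V in (60.0, 120.0, 240.0):
    I = V / R
    print(f"V = {V:.0f} V -> I = {I:.2f} A -> P = {V * I:.0f} W")
```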
Another example: Let's say that the maximum voltage across a capacitor is 300 V. Can I run it with as high a current as I want, as long as I don't exceed 300 V?
You don't select the current a device uses. You select the applied voltage, which then determines the current as per Ohm's law.
My second question is: instead of adding resistors to decrease current, can't we just add a transformer, which will decrease the voltage and produce the same current?
I think you're misunderstanding how a transformer works. With a step-down transformer, the voltage is taken from a higher value in the primary circuit and stepped down to a lower value in the secondary circuit. This lower voltage is applied to the secondary circuit, and the current will depend on the resistance/impedance of the secondary side. The key idea here is that the power on the primary and secondary sides is (ideally) always the same.
Let's say that we have a transformer with a primary RMS voltage of 100 volts. The secondary steps this down to 50 volts. If the load on the secondary side is consuming 100 watts of power, then the current through the secondary circuit is 2 amps, as 50 volts * 2 amps = 100 W. However, the primary is at 100 volts, so the current flowing through the primary is only 1 amp.
If we change the load so that it consumes 200 watts, then the current jumps to 4 amps on the secondary side and 2 amps on the primary side. Note that the voltage on the secondary side is set by the transformer itself and doesn't change. The current will change depending on the resistance (or impedance) of the secondary circuit, which includes the load.
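Here's a short sketch of that bookkeeping, assuming an ideal (lossless) transformer so primary and secondary power are equal; the 100 V / 50 V figures are just the illustrative values from the example above.

```python
# Ideal-transformer sketch: secondary voltage is fixed by the turns ratio,
# and power is the same on both sides, so the currents follow from the load power.
V_primary = 100.0    # volts (RMS), illustrative
V_secondary = 50.0   # volts (RMS), fixed by the turns ratio

def currents_for_load(P_load):
    """Secondary current comes from the load power at the secondary voltage;
    primary current comes from the same power at the primary voltage."""
    I_secondary = P_load / V_secondary
    I_primary = P_load / V_primary
    return I_secondary, I_primary

for P_load in (100.0, 200.0):
    I_s, I_p = currents_for_load(P_load)
    print(f"{P_load:.0f} W load -> secondary {I_s:.0f} A, primary {I_p:.0f} A")
# 100 W load -> secondary 2 A, primary 1 A
# 200 W load -> secondary 4 A, primary 2 A
```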
If you halved the voltage going to your lightbulb by using a transformer, then the current flow through the lightbulb would also be halved and the power would drop to 1/4 of what it used to be. For a 100 watt bulb being supplied with 120 volts, the resistance of the bulb would need to be 144 ohms in order to get the required 833 milliamps of current so that V * I = 100 watts. If you used a transformer to drop the voltage to 60 volts, then the current flow through that same lightbulb is now only 417 milliamps and the power falls to 25 watts. (Because we didn't change the resistance of the bulb, which is still 144 ohms. So I = 60 volts / 144 ohms, which is 0.417 amps.)
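The same arithmetic as a small sketch, using the 144-ohm figure from above: halving the voltage halves the current, so the power drops by a factor of four.

```python
# Fixed bulb resistance from the example: 120 V / 0.833 A = 144 ohms.
R_bulb = 144.0  # ohms

for V in (120.0, 60.0):
    I = V / R_bulb       # current set by the applied voltage and fixed resistance
    P = V * I            # power actually dissipated in the bulb
    print(f"V = {V:.0f} V -> I = {I*1000:.0f} mA, P = {P:.1f} W")
# V = 120 V -> I = 833 mA, P = 100.0 W
# V = 60 V  -> I = 417 mA, P = 25.0 W
```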
Remember! We don't directly manipulate current itself! We have to change either the voltage or the resistance of something in order to get the right amount of current flow through it.