Bulgdoom

__Ohm's Law states V = I × R, and power P = V × I__

Fine, here's the deal. I'm using a standard 12 V battery (12.7 V in reality, but for simplicity let's stick with 12). If I use 4 ohms of wire at 12 V, the current is 3 amps, and 3 A × 12 V = 36 watts. Sounds toasty! If I double the amount of wire, the resistance is now 8 ohms, the current drops to 1.5 amps, and 1.5 A × 12 V = 18 watts.
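The arithmetic above can be double-checked with a short Python sketch (the variable names are mine, not from any kit's spec):

```python
# Check the post's numbers: a fixed 12 V supply across two wire resistances.
V = 12.0  # supply voltage in volts

for R in (4.0, 8.0):  # wire resistance in ohms
    I = V / R         # Ohm's law: I = V / R, current in amps
    P = V * I         # power dissipated in watts (equivalently V**2 / R)
    print(f"R = {R:.0f} ohm -> I = {I:.1f} A, P = {P:.0f} W")
```

Running it reproduces the two cases: 4 ohms gives 3 A and 36 W, 8 ohms gives 1.5 A and 18 W.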

**4 ohms at 36 watts vs. 8 ohms at 18 watts: which is hotter?**

Now... I understand that more resistance should mean more heat given off, but the total power (wattage) decreases according to the equation. I have always assumed higher wattage = more heat generated: when you buy a soldering gun, 40 watts runs hotter than 15 watts, and similarly a 1000-watt microwave is more powerful than an 800-watt one.
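The tension in the paragraph above comes down to which quantity is held fixed. Combining P = V × I with Ohm's law gives two equivalent forms, and a small sketch (function names are mine) shows they pull in opposite directions:

```python
# Two equivalent forms of electrical power, derived from P = V * I and V = I * R.
def power_fixed_voltage(V, R):
    return V**2 / R  # at fixed voltage, more resistance -> LESS power

def power_fixed_current(I, R):
    return I**2 * R  # at fixed current, more resistance -> MORE power

# A battery holds voltage roughly constant, so the first form applies:
print(power_fixed_voltage(12.0, 4.0))  # 36.0
print(power_fixed_voltage(12.0, 8.0))  # 18.0
```

So "more resistance means more heat" is the fixed-current picture (P = I²R), while a battery-fed circuit is closer to the fixed-voltage picture (P = V²/R), where doubling the resistance halves the power.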

**So what's the deal here?**

I looked at a few kits on the market as a basis: Symtec, rated 36 watts and 3 amps on High, which is supposed to be very hot, and Oxford, rated 18 watts and 1.3 amps on High, which one person measured reaching 124 °F.

Thank you.