What is the difference between 100 V 30 A and 10 V 300 A?

Hello, I am very new at this, so please guide me on the following issue.

Let's say I have an electrical resistor with a resistivity of 1.5×10⁻⁵ Ω·m (graphite), the total resistance of this particular graphite heating element is 0.15 Ω, and I need it to heat up to 2000 K. Given that the power required to hold it at 2000 K is 3000 W, how should I determine the required voltage and current?

I do know that graphite heating elements are only used with step-down transformers, but how low should the voltage go? If I hook this up to either a 100 V 30 A or a 10 V 300 A transformer, the product is 3000 W in either case, so what is the difference? I think I am missing something very fundamental here. Please help.

The way I got 3000 W as the required power is from the black-body radiation formula:

Power = (surface area of black body) × (Stefan–Boltzmann constant) × (emissivity) × (temperature)⁴

Is this a good method for a rough approximation (±500 K), or is it correct at all? Thanks in advance.
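For reference, here is the back-of-envelope check I am doing, as a rough Python sketch. The emissivity value of 0.8 is my own assumption for graphite (it is not something I have measured), and I am treating the element as a plain fixed resistance:

```python
import math

sigma = 5.670374419e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
R = 0.15                 # element resistance, ohms
T = 2000.0               # target temperature, K
P = 3000.0               # power I think I need, W
eps = 0.8                # ASSUMED emissivity for graphite

# Joule heating in a fixed resistance: P = I^2 * R = V^2 / R,
# so only one (V, I) pair actually delivers 3000 W into 0.15 ohm.
I = math.sqrt(P / R)     # required current, ~141 A
V = math.sqrt(P * R)     # required voltage, ~21 V

# Radiating surface area needed so that eps * sigma * A * T^4 = P
A = P / (eps * sigma * T**4)

print(f"I = {I:.1f} A, V = {V:.1f} V, check P = {V * I:.0f} W")
print(f"required radiating area = {A * 1e4:.1f} cm^2")
```

If this arithmetic is right, neither supply matches the load: at a full 100 V the 0.15 Ω element would try to draw 100 / 0.15 ≈ 667 A, far beyond 30 A, while 10 V would only push 10 / 0.15 ≈ 67 A (about 444 W). But I would appreciate confirmation that this is the right way to think about it.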