Hi guys, this is my first post. I was just wondering what the exact relationship between current and voltage is.

Transformers work on the basis that P = IV, so by increasing the voltage they decrease the current, and less thermal energy is lost, etc. We can test this out by making V the subject of the equation:

V = P/I
If V = 100W/20A ∴ V = 5V

Here, I have started with a current of 20 amps and a power of 100 watts. The resultant voltage is therefore 5V. Now, if we decrease the current and keep the power the same, we see an increase in voltage, as expected:

V = P/I
If V = 100W/10A ∴ V = 10V

HOWEVER, the equation for voltage seems to contradict this:

V = IR
∴ V = 10A × 10Ω
∴ V = 100V

Here, I have started with a current of 10 amps and a resistance of 10 ohms. The resultant voltage is therefore 100V. Now, if we decrease the current and keep the resistance the same, the voltage decreases:

V = IR
∴ V = 5A × 10Ω
∴ V = 50V

Therefore, when using the power equation, as current decreases, voltage increases. The voltage equation, on the other hand, exhibits exactly the opposite effect. Please can somebody explain why they don't agree with each other? (I may just be being thick, but I'm only studying GCSE physics... I'm no physicist. By the way, sorry for the long post.) Thanks guys.
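P.S. In case it helps, here is the arithmetic above as a quick Python sketch showing the two calculations side by side (the function names are just mine for illustration):

    # V = P/I (power equation): P in watts, I in amps
    def v_from_power(p_watts, i_amps):
        return p_watts / i_amps

    # V = IR (Ohm's law): I in amps, R in ohms
    def v_from_ohms_law(i_amps, r_ohms):
        return i_amps * r_ohms

    print(v_from_power(100, 20))    # 5.0  -> halving the current...
    print(v_from_power(100, 10))    # 10.0 -> ...doubles the voltage
    print(v_from_ohms_law(10, 10))  # 100  -> halving the current...
    print(v_from_ohms_law(5, 10))   # 50   -> ...halves the voltage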