Jiachao
Edit: My question is better expressed in my next post so ignore this post.
Homework Statement
A power station delivers 890 kW of power at 12 kV to a factory through wires with total resistance 5.0 [tex]\Omega[/tex].
How much less power is wasted if the electricity is delivered at 50 kV rather than 12 kV?
Homework Equations
eq-1) P = IV
eq-2) P = I²R
Ohm's Law: V = IR
The Attempt at a Solution
I know the problem can be solved by solving for the current value in each case from eq-1, and plugging it into eq-2 to calculate power loss.
My real question is why can't we apply Ohm's law to calculate I, and then plug it into eq-2? Why must we use eq-1 to get the current? Doesn't Ohm's law apply to all ohmic conductors, and since we know both the resistance of the conductor and the voltage across it, can't we get the current that way?
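The correct two-step calculation described above (current from eq-1, then loss from eq-2) can be checked numerically. This is a minimal Python sketch; the variable and function names are my own, not from the problem:

```python
# Values from the problem statement.
P_delivered = 890e3   # power delivered to the factory, in W
R_line = 5.0          # total wire resistance, in ohms

def line_loss(P, V, R):
    """Current from P = IV (eq-1), then wasted power from P = I^2 * R (eq-2)."""
    I = P / V
    return I**2 * R

loss_12kV = line_loss(P_delivered, 12e3, R_line)   # ~27.5 kW wasted at 12 kV
loss_50kV = line_loss(P_delivered, 50e3, R_line)   # ~1.58 kW wasted at 50 kV
saving = loss_12kV - loss_50kV                     # ~25.9 kW less power wasted
```

Note that the 12 kV (or 50 kV) is the voltage across the factory's load, not across the wires, which is why it cannot be fed directly into V = IR with the wire's 5.0 Ω.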
That was all copied and pasted from a previous thread. I've also googled around and found a similar answer: http://in.answers.yahoo.com/question/index?qid=20090220223426AAe2nSs
I think I understand the distinction between voltage supplied and voltage drop. However, I still don't understand why the voltage loss through the wire can't be calculated using V=IR.
For example, in a series circuit with Load1 and Load2:
The supplied voltage is 12 V.
The resistance of Load1 is 4 [tex]\Omega[/tex] and the resistance of Load2 is 2 [tex]\Omega[/tex].
Since V = IR_total, I = 12 V / 6 [tex]\Omega[/tex] = 2 A.
Hence, the voltage drop across Load1 is V_drop1 = IR = 2 x 4 = 8 V.
Similarly, the voltage drop across Load2 is V_drop2 = IR = 2 x 2 = 4 V.
Hence, the total drop across the two loads is 12 V, which equals the supplied voltage.
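The series-circuit arithmetic above can be verified with a short Python snippet (a sketch under the post's stated values; the names are my own):

```python
# Series circuit from the example above.
V_supply = 12.0            # supplied voltage, in V
R1, R2 = 4.0, 2.0          # resistances of Load1 and Load2, in ohms

I = V_supply / (R1 + R2)   # Ohm's law across the total resistance -> 2 A
V1 = I * R1                # drop across Load1 -> 8 V
V2 = I * R2                # drop across Load2 -> 4 V
assert V1 + V2 == V_supply # the drops sum to the supplied voltage
```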
Now, if you consider the wire in the transmission cables to be one large load, why isn't the voltage drop across the wire equal to the supplied voltage?
I know I'm missing or confused about something, but I just don't understand. Hopefully the answer can be explained with high-school-level knowledge (since I'm in high school).
Thanks