Understanding Electricity Waste: TV vs. Reality

  • Thread starter: Nirelan
  • Tags: Electricity
AI Thread Summary
Electricity consumption in a home is not constant; it varies based on the devices in use. The total power available (e.g., 3600 watts) represents the maximum capacity, but actual usage depends on the current draw of connected devices. When a device like a TV is on, it only consumes the power it needs, meaning excess capacity is not wasted but rather not generated. The house operates at a fixed voltage, and the current drawn is determined by the resistance of the devices. Understanding this dynamic helps clarify that power is not wasted if devices are not using it, as the supply adjusts to meet demand.
Nirelan
Hello,
I have recently started learning about electricity and electronics and would like to have a few things explained. I apologize if these questions are not up to your standards, as this is a serious board. If a watt is amperes times volts, and let's say you have 3600 watts available at home, doesn't that leave a huge amount of power wasted if you are simply watching TV?
P.S. I am just using the TV as an example. What I really mean is: don't you get a lot more wattage than is needed a great deal of the time, and doesn't that just lead to waste?
 
The power you use is determined by your total current draw (P = I*V). Your house will use as much power as it needs, up to the maximum stated rating. So if your house is wired with 120 VAC and your power rating is 3600 W, you can draw a max of 30 A. The only power actually wasted is the heat given off by your TV or lights; power cannot be wasted if it is not used.
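For a quick sketch of that arithmetic (using only the 120 V and 3600 W figures already given in this thread), here it is worked out in Python:

Code:
# Maximum current a 120 V circuit rated at 3600 W can deliver (P = I * V)
V = 120.0           # supply voltage in volts
P_max = 3600.0      # rated power in watts
I_max = P_max / V   # rearranged: I = P / V
print(I_max)        # prints 30.0 (amperes)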
 
Integral said:
The power you use is determined by your total current draw (P = I*V). Your house will use as much power as it needs, up to the maximum stated rating. So if your house is wired with 120 VAC and your power rating is 3600 W, you can draw a max of 30 A. The only power actually wasted is the heat given off by your TV or lights; power cannot be wasted if it is not used.

I think he is confused in that he thinks the house is consuming 3600 W at all times, regardless of load.
 
If the power rating is 3600, though, where is the rest of it when you use a 100-watt computer? Does your house only put out 100 watts then? The way I understood it from a diagram a friend showed me, your house would still be putting out 3600 watts, but you would only be using 1 A or whatever it took to power your device, so that would mean 29 A would be wasted, right?

"I think he is confused in that he thinks the house is consuming 3600 W at all times, regardless of load."
Yes I am. I just don't understand when and how the load changes. It's not like the TV tells the house how much electricity to put out. I thought you just had an amount of watts available and drew from that pool.

[Attached image: parallel.gif - a diagram of three bulbs wired in parallel]

Let's use that diagram as an example. If enough electricity is being produced to power all three bulbs, what happens to the excess power if only one is turned on?
 
Nirelan said:
It's not like the TV tells the house how much electricity to put out.

"Tell" isn't really the right word, but the point is that the house supply provides a fixed voltage, and the amount of current (and power) that any device uses depends on its internal resistance.

From Ohm's law, a higher resistance across a fixed voltage has a smaller current flowing through it, and consumes less power.

At first, don't think too hard about what the "internal resistance of a TV" means - think about simpler devices like filament light bulbs (not CFL bulbs!), electric kettles, heaters, etc., where the circuit is just a "real" resistance that you can measure with an ohmmeter, plus an on/off switch.
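To make that concrete, here is a minimal Python sketch. The 120 V supply comes from this thread, but the resistance values are just illustrative guesses for a bulb filament and a heater element, not measured figures:

Code:
# With a fixed supply voltage, each device's resistance sets its own current and power.
V = 120.0                                   # volts, fixed by the house supply
devices = {
    "bulb filament (illustrative)": 240.0,  # ohms -> small current, low power
    "heater element (illustrative)": 9.6,   # ohms -> large current, high power
}
for name, R in devices.items():
    I = V / R          # Ohm's law: I = V / R
    P = V * I          # power drawn: P = V * I
    print(f"{name}: {I:.2f} A, {P:.0f} W")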
 
So the fixed voltage interacts with something in the bulb and that creates the amperes?
 
Nirelan said:
So the fixed voltage interacts with something in the bulb and that creates the amperes?

Yes, and the "something" is the resistance of the filament in the bulb.
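As a rough worked example (the bulb rating here is hypothetical, not from the thread): a bulb marked 100 W at 120 V implies a hot-filament resistance of R = V^2/P, and that resistance is what fixes the current.

Code:
# Hypothetical 100 W, 120 V bulb: the rated power implies the filament resistance,
# and the fixed voltage across that resistance is what "creates" the amperes.
V, P_rated = 120.0, 100.0
R = V**2 / P_rated      # 144 ohms (hot filament)
I = V / R               # about 0.83 amperes
print(R, I)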
 
Nirelan said:
If the power rating is 3600, though, where is the rest of it when you use a 100-watt computer? Does your house only put out 100 watts then?
Yes: the rest never even gets generated by the power plant.
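A small Python sketch of the same point, reusing the thread's 100 W computer and 3600 W rating:

Code:
# Only the power the load actually draws is supplied; the rating is a ceiling, not a flow.
V = 120.0             # volts
P_rating = 3600.0     # watts the circuit *could* deliver
P_computer = 100.0    # watts the computer actually draws
I_drawn = P_computer / V    # about 0.83 A flowing
I_ceiling = P_rating / V    # 30 A the wiring could carry, but isn't
print(I_drawn, I_ceiling)   # the difference is unused capacity, not wasted power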
 
Ok, I'm sorry if this is difficult; I think I am starting to get this. I thought an electrical line supplied 120 V and the fuse made 30 A, so there were 3600 watts going through the line, and each bulb took a portion of those watts until it equaled 3600, so that if you had one bulb you would be using 60 out of 3600.

I now understand that there's just 120 volts running through the line. I'm just not clear on amperes. I saw a diagram that said the devices divide the number of amperes available in the circuit. I hope you can understand this.
 
Nirelan said:
Ok, I'm sorry if this is difficult; I think I am starting to get this. I thought an electrical line supplied 120 V and the fuse made 30 A, so there were 3600 watts going through the line, and each bulb took a portion of those watts until it equaled 3600, so that if you had one bulb you would be using 60 out of 3600.

I now understand that there's just 120 volts running through the line. I'm just not clear on amperes. I saw a diagram that said the devices divide the number of amperes available in the circuit. I hope you can understand this.

Here is an analogy, only to help you see the picture of what's going on, so please don't take it as a direct comparison. Think of a cup full of water with a straw in it, where the water represents the energy your house can "drink". The water level in the cup represents the voltage, so a full cup means the voltage is at 120 V; similarly, if you drank a quarter of the water out of the cup, the water level would be at 90 V.

Now, when you use only a little power, like a light bulb, imagine drinking a small amount of water flowing out of the cup, through a straw, and into your mouth. Now imagine you turn on everything in the house: you're drinking a large flow of water through the straw, like you're really thirsty. Both times, the water level in the cup is going to start to drop unless you have a waiter there to constantly keep your water filled to the top at 120 V. The waiter pouring more water in represents the power company regulating the water level to a constant 120 V: when you drink a lot really fast, the waiter pours new water in a lot faster, and when you drink slowly, the waiter doesn't have to pour much new water in.

Now, in a city, imagine that the cup has hundreds of straws in it and each person is drinking from the same cup. The power company still has to keep that water level filled up to 120 V, so it's constantly pouring water into the cup as others are drinking the water out.

The way the power company makes sure your voltage level is always at 120 V is a different discussion and more complicated than I'm describing here, but basically they burn more fuel when you start to pull more current.

Now, the fuse being 30 A doesn't mean that it's constantly drawing 30 A. The 30 A number just means that the fuse will shut your power off if more than 30 A flows through it, because that means something went wrong: nothing in your house is supposed to draw 30 A, and the fuse is protecting you in the only way it can, by completely cutting off current flow. The fuse does not set the current. To be really silly, back to the analogy: if you are drinking so fast that your throat and stomach can't keep up with the amount of water coming in, your throat closes up so you don't choke on all the water. That is sort of what a fuse does with current: it stops the flow when it gets so high that it's out of control.
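As a toy illustration of that last point (the device wattages below are made up, not from the thread): a 30 A fuse only interrupts the circuit when the total current drawn exceeds its rating; it never sets the current itself.

Code:
# Toy model of a 30 A fuse on a 120 V circuit.
V = 120.0
FUSE_RATING_A = 30.0
loads_watts = [100.0, 60.0, 1500.0]              # e.g. computer, bulb, space heater (hypothetical)
total_current = sum(P / V for P in loads_watts)  # each load draws I = P / V
if total_current > FUSE_RATING_A:
    print(f"fuse blows: {total_current:.1f} A exceeds {FUSE_RATING_A:.0f} A")
else:
    print(f"fuse stays closed: only {total_current:.1f} A drawn")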

Sorry in advance if anyone takes offense at this analogy, haha.
 