# Interesting argument between friends

## Main Question or Discussion Point

So recently I got into a heated argument with my friend about heating his apartment (no pun intended). Here is the argument:

My EE friend has a theoretical heater which runs on 1000 watts of power. In his apartment he also has another electronic device (i.e. a computer, a television, etc.) which also requires 1000 watts to run. He argues that, due to conservation of energy, 1000 watts is 1000 watts, and the same amount of heat will be produced by his "device" as by his heater (and thus, if his theoretical device is always running at 1000 watts, he will never need to turn on his heater to heat his house).
Being an EE AND a Physics major, I disagree with him, stating that the 1000 watts of energy is dissipated in ways OTHER than heat. While a heater's primary purpose is to provide as much resistance as possible (dissipating electric power as heat), another device such as a computer uses its power via other methods (not via "heat"), but I don't know exactly how to explain it to my stubborn friend (who just keeps shouting V=IR at me).
If I am wrong, could someone please explain why. And if I am right, please provide a good explanation of where the "power is going" so that I may show my friend and convince him that heating his house with a 1000 watt device that ISN'T a heater is a dumb idea.

Thank you,

Tell him to take physics again because he clearly missed the point. Yes, 1000 watts is 1000 watts, but a heater turns (most of) that energy into thermal energy, whereas another device converts some of that energy into meaningful work. But you want an example.

So let's say we have a (massless) elevator (on Earth). Then the power used to do work ON a person BY the elevator is

$$P = \frac{dW}{dt} = \frac{d}{dt} \int \vec{F} \cdot d\vec{x} = \frac{d}{dt} mgx = mgv$$

where m is the mass of the person and v is the velocity. Let's say it's a 100 kg person going 1 m/s. Then P = 1000 W (approximating g as 10 m/s^2). No heat transfers at all. Of course this is an ideal situation, but your friend was using an "ideal" heater, so whatever. Obviously the heater would do a better job heating things up.
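As a quick sanity check on the numbers above (a minimal sketch; the 100 kg, 1 m/s, and g ≈ 10 m/s² figures are the ones from the post):

```python
def elevator_power(mass_kg, g_m_s2, speed_m_s):
    """Mechanical power to lift a mass at constant speed: P = m * g * v.
    In the ideal case, none of this power is dissipated as heat; it all
    becomes gravitational potential energy."""
    return mass_kg * g_m_s2 * speed_m_s

# 100 kg person rising at 1 m/s, with g approximated as 10 m/s^2:
print(elevator_power(100, 10, 1))  # -> 1000 (watts)
```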

Winter is here. Tell your friend that he does not need a heater; just keep working on the computer, have the TV, stereo, coffee maker, etc. on, and he'll be warm!!!

When he is halfway frozen, tell him to reassess his pursuit of an EE major. Wow, this is scary!!!!

Yes, you can use Llello's mechanical-work example or many other examples. Electrical energy can go into many forms other than heat.

AlephZero
Homework Helper
Of course posts #2 and #3 are right in theory, but your friend seems to be the better practical engineer, IMO. The amount of "other forms of work" done by a computer or a TV is negligible, unless they are faulty and generating lots of RFI, or you stand them in front of a window so the light energy from the display can escape from the room.

I just put my hand near the back of the PC I'm using right now (which only has a 300W power supply, not 1000) and there's a fan blowing a stream of nice warm air out of the computer case and into the room. That's exactly the same as what a 300W fan heater would do!

Can you guys provide an example of where the computer is using the energy in a form that is not thermal? I tried to use the example that a heater provides a large resistance (which dissipates the energy as thermal energy via collisions) while in a computer this is not the case (at least not nearly as much), but my friend is not convinced by this argument.

Of course you get heat from a computer, but it is disproportionate to the wattage, as a computer is not an efficient heat generator. That's the reason I said to turn on the computer, TV, everything... I should have said, to make up 1000 W. You are not going to get the amount of heat of a 1000 W heater.

His friend is not arguing about how much heat comes from a computer, etc. He is arguing about whether you get the same heat from 1000 W of power used by other equipment as from a 1000 W heater. If you get only 200 W worth of heat power, you are not getting the same as a 1000 W heater.

Case in point: Christmas is coming, and I put a lot of lights and moving ornaments on the tree, about 10 strands of 100 light bulbs each, over 30 motorized ornaments, and a train set underneath. I don't know how much power it uses, but it is not low. Try standing next to it and see whether you feel even the slightest heat!!! AND I am not kidding, I take the Christmas tree very, very seriously!!!:rofl: I actually did pay attention to whether the tree got warm; it is not even slightly warm. You'll get a lot more heat from a single 100 W tungsten bulb.

Can you guys provide an example of where the computer is using the energy in a form that is not thermal? I tried to use the example that a heater provides a large resistance (which dissipates the energy as thermal energy via collisions) while in a computer this is not the case (at least not nearly as much), but my friend is not convinced by this argument.
How about executing a few million instructions a second? Power is needed to charge the stray capacitance of the signal lines inside the circuit boards to get the speed up.

Also, get a 100 W/channel stereo amp and run it full blast; put on ear protection, sit in front of it and the speaker, and see whether you get warm!!!! I have a 3-channel, 200 W-per-channel amp inside a confined space of about 2 cubic feet. After I have it on for a few hours, the inside might be 5 °F or so warmer. If I open the door, the heat is gone.

russ_watters
Mentor
Can you guys provide an example of where the computer is using the energy in a form that is not thermal? I tried to use the example that a heater provides a large resistance (which dissipates the energy as thermal energy via collisions) while in a computer this is not the case (at least not nearly as much), but my friend is not convinced by this argument.
By not being able to think of an example, you proved your friend right: You can't think of an example because there are none. All but a negligible amount of the energy used by a computer is turned into heat.

russ_watters
Mentor
How about executing a few million instructions a second? Power is needed to charge the stray capacitance of the signal lines inside the circuit boards to get the speed up.
All of that is dissipated as heat.
Also, get a 100 W/channel stereo amp and run it full blast; put on ear protection, sit in front of it and the speaker, and see whether you get warm!!!! I have a 3-channel, 200 W-per-channel amp inside a confined space of about 2 cubic feet. After I have it on for a few hours, the inside might be 5 °F or so warmer. If I open the door, the heat is gone.
Odds are it doesn't run at anywhere close to 200 W, but all of the energy it produces, except the sound energy that escapes your house (negligible), is converted to heat.

All of that is dissipated as heat. Odds are it doesn't run at anywhere close to 200 W, but all of the energy it produces, except the sound energy that escapes your house (negligible), is converted to heat.
I said crank it up high. That is sound power, not heat power.

As for charging the capacitance of the signals: capacitors don't generate heat except through the parasitic resistance when current flows, which is real. The signal traces and the input of a gate are not particularly lossy.

russ_watters
Mentor
I said crank it up high. That is sound power, not heat power.
Even if you crank it up high, it still doesn't use a steady 200 W/channel, and the vast majority of the energy is dissipated as heat out the back, not as sound out the front. And then virtually all of the sound power is absorbed by objects in the room (and the room itself) and turned into heat. Quick google:
Loudspeaker efficiency is defined as the sound power output divided by the electrical power input. Most loudspeakers are inefficient transducers; only about 1% of the electrical energy sent by an amplifier to a typical home loudspeaker is converted to acoustic energy. The remainder is converted to heat, mostly in the voice coil and magnet assembly.
http://en.wikipedia.org/wiki/Loudspeaker
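Taking the ~1% efficiency figure from that excerpt at face value, the split between sound and heat looks like this (a sketch; the 1% is the quoted typical value, not a measurement of any particular amplifier):

```python
def speaker_energy_split(electrical_w, efficiency=0.01):
    """Split electrical input into acoustic output and heat, assuming
    a typical home-loudspeaker efficiency of about 1%."""
    sound_w = electrical_w * efficiency
    heat_w = electrical_w - sound_w
    return sound_w, heat_w

sound_w, heat_w = speaker_energy_split(200)
print(sound_w, heat_w)  # -> 2.0 198.0: a "200 W" channel is ~198 W of heat
```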

Even if you crank it up high, it still doesn't use a steady 200 W/channel, and the vast majority of the energy is dissipated as heat out the back, not as sound out the front. And then virtually all of the sound power is absorbed by objects in the room (and the room itself) and turned into heat. Quick google: http://en.wikipedia.org/wiki/Loudspeaker
You can always find a way to run it at high power. Another example is playing guitar and bass through a solid-state amp; we crank it up high. You don't get the heat of a few hundred watts. Believe me, we blast them.

And as I added to the last post, charging a capacitance doesn't generate heat in the capacitor unless you have parasitic resistance.

Try taking a few computers that draw the same power as a heater and see whether you get the same amount of heat out.

Back to electronics: remember the Poynting theorem on EM energy? There is an E×H term that is the power transmitted by the EM wave, and there is a thermal term that is the power lost to resistance. Only the resistive loss becomes heat. This is the physics of conservation of energy. Remember, a signal traveling in a circuit is EM wave propagation, so the Poynting theorem is involved.

russ_watters
Mentor
Try taking a few computers that draw the same power as a heater and see whether you get the same amount of heat out.
That's one of the things I do for my job as a heating and air conditioning engineer: I figure out how much air conditioning is needed in a lab or data center by measuring, or adding up from nameplate data, how much electricity is going into it. Data centers are the easy ones: since the energy density is so high, and the heat inputs and outputs from other sources (like the walls) are so low in comparison, the only piece of information you really need to size the cooling is the wattage of the computer equipment in the data center.
You can always find a way to run it at high power. Another example is playing guitar and bass through a solid-state amp; we crank it up high. You don't get the heat of a few hundred watts. Believe me, we blast them.
I believe you've never measured the electrical input nor the heat output of your system.

russ_watters
Mentor
Back to electronics: remember the Poynting theorem on EM energy? There is an E×H term that is the power transmitted by the EM wave, and there is a thermal term that is the power lost to resistance. Only the resistive loss becomes heat. This is the physics of conservation of energy. Remember, a signal traveling in a circuit is EM wave propagation, so the Poynting theorem is involved.
And where do those EM waves go? Are they continuously being created but not going anywhere? Or are they created and then dissipated as heat?

And where do those EM waves go? Are they continuously being created but not going anywhere? Or are they created and then dissipated as heat?
No, as I explained, the EM wave converts to current to charge up or discharge the capacitance of the signal traces and the input capacitance of the MOSFETs, and those are electrical energy, not heat. Remember, an ideal capacitor doesn't dissipate power. The only part that turns to heat is the series resistance of the line or the resistance inside the capacitor, which we describe with the loss tangent. This is the biggest factor in the power required to increase the speed of a computer: the charging and discharging of the line capacitance. If you run the computer at half the speed, the power drawn will be a lot lower.

And besides, how can you justify that all the power transforms to heat? Even a heater is not 100% efficient. You need 100% efficiency to transform 1000 W of electrical power into heat. A computer is not going to be optimized to transform power into heat, and neither is other equipment, like a motor moving a weight... which is mechanical power, not thermal power.

For your assertion to be true, ALL electrical equipment HAS to have the same efficiency of transforming electrical power to heat... WHICH still cannot be 100%.
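For what it's worth, a standard circuit-theory result cuts against the capacitor argument: charging a capacitance C to voltage V from a fixed supply through *any* series resistance dissipates exactly ½CV² in that resistance (independent of how small R is), and discharging the node later turns the stored ½CV² into heat as well. That is why the familiar CMOS dynamic power, roughly P ≈ α·C·V²·f, ends up entirely as heat. A sketch with made-up illustrative numbers (not from any real chip):

```python
def cmos_dynamic_power(activity, capacitance_f, vdd_v, freq_hz):
    """Dynamic switching power of a CMOS node: each full charge/discharge
    cycle converts C*V^2 of supply energy into heat (half lost in the
    charging path, half when the stored 0.5*C*V^2 is discharged)."""
    return activity * capacitance_f * vdd_v**2 * freq_hz

# Illustrative numbers only:
p = cmos_dynamic_power(activity=0.2, capacitance_f=1e-9, vdd_v=1.0, freq_hz=1e9)
print(p)  # -> 0.2 (watts) per nanofarad of switched capacitance at 1 GHz
```

This also explains yungman's correct observation that halving the clock frequency cuts the power: the energy per switching cycle is fixed, so the heat scales with f.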

russ_watters
Mentor
No, as I explained, the EM wave converts to current to charge up or discharge the capacitance of the signal traces and the input capacitance of the MOSFETs, and those are electrical energy, not heat. Remember, an ideal capacitor doesn't dissipate power.
If it doesn't dissipate power then it doesn't show up on the electric meter.

And besides, how can you justify that all the power transforms to heat? Even a heater is not 100% efficient.
For all intents and purposes, an electric heater is 100% efficient.
You need 100% efficiency to transform 1000 W of electrical power into heat. A computer is not going to be optimized to transform power into heat...
As I asked before: where else does the electrical power go? Above you said it went to things like charging a capacitor, but then said it didn't.
...and neither is other equipment, like a motor moving a weight... which is mechanical power, not thermal power.
If the weight doesn't end up somewhere different from where it started, all the energy is dissipated as heat.
For your assertion to be true, ALL electrical equipment HAS to have the same efficiency of transforming electrical power to heat... WHICH still cannot be 100%.
If your system is a closed box and there are no chemical reactions occurring, conservation of energy demands that all of the energy become heat.

But there's no need to continue arguing this. You can pull a spec sheet for a server and see for yourself. Here's one, randomly googled: http://h20000.www2.hp.com/bc/docs/support/SupportManual/c01038153/c01038153.pdf [Broken]

Page 54:
Electrical input: 764 Watts
Heat output: 2604 BTU

According to my math, they don't quite match: converting the BTUs gives me 763 Watts, but that's close enough to consider it rounding error.
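The check is just the BTU-to-watt conversion (1 BTU/hr ≈ 0.2931 W; the spec's "2604 BTU" is evidently a rate, BTU per hour):

```python
WATTS_PER_BTU_HR = 0.29307107  # 1 BTU/hr expressed in watts

def btu_hr_to_watts(btu_per_hr):
    """Convert a heat-output rating in BTU/hr to watts."""
    return btu_per_hr * WATTS_PER_BTU_HR

print(round(btu_hr_to_watts(2604)))  # -> 763, vs. the 764 W electrical input
```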

I also have similar (though not as exact) data for a lab centrifuge. Not all manufacturers give both the watts in and BTUs out though, because, frankly, they'll assume that the reader knows they are the same thing.

rbj
Another example is playing guitar and bass through a solid-state amp; we crank it up high. You don't get the heat of a few hundred watts. Believe me, we blast them.
yung, i think you know better than this. think: conservation of energy. what sound doesn't make it out of the room, and what light from the computer doesn't make it out a window, all of that energy will eventually be converted to heat. the question is where.

i doubt more than a couple of watts will make it out. pretty much all of the power the computer and electronics draw will end up heating the room just as well as a space heater of the same power.

AlephZero
Homework Helper
Not all manufacturers give both the watts in and BTUs out though, because, frankly, they'll assume that the reader knows they are the same thing.
Not to mention that most humans measure heating power in watts anyway. Even the Brits have stopped using BTUs.

So we stray from the point. In defense of my point, I argued that a heater has as much resistance as possible. This would ideally lead to large numbers of inelastic collisions with the lattice, which generate an irreversible thermal process: "heat". When engineers design a heater, they want the conversion from electrical power to "heat" to be as high as possible.

However, when engineers design a computer, they want the conversion from electrical power to "heat" to be as small as possible. Ideally, a computer engineer would wish to design a computer which dissipates no heat (which is impossible). I believe I remember from Solid State Physics that for silicon there is a way to calculate how much energy is used to "power" a device and how much goes to heating it. If I recall, it was very simple: something like, if the electron requires 6 eV to overcome the band gap and it's given 7 eV, then the electron has a final kinetic energy of 1 eV, which is expended thermally (the leftover KE creates collisions in the lattice → heat).

Regardless of this specific example, I believe that if you think about it microscopically rather than macroscopically, you see that a lot of the energy is going toward processes that AREN'T heat. In the long run all energy WILL turn into heat (entropy ftw); however, I'm talking about amounts and time scales on the order of "will this device heat my apartment more than my heater", in which case I don't think it will.

russ_watters
Mentor
Not to mention that most humans measure heating power in watts anyway. Even the Brits have stopped using BTUs.
True, but for the American market, most will Americanize their catalogs. Ironically, the science is done in SI, but the HVAC engineering is still in English units. More importantly, there is no English-unit analogue to electrical power in watts, so my calculation spreadsheets have to include both and flip back and forth.

russ_watters
Mentor
So we stray from the point. In defense of my point, I argued that a heater has as much resistance as possible. This would ideally lead to large numbers of inelastic collisions with the lattice, which generate an irreversible thermal process: "heat". When engineers design a heater, they want the conversion from electrical power to "heat" to be as high as possible.

However, when engineers design a computer, they want the conversion from electrical power to "heat" to be as small as possible. Ideally, a computer engineer would wish to design a computer which dissipates no heat (which is impossible). I believe I remember from Solid State Physics that for silicon there is a way to calculate how much energy is used to "power" a device and how much goes to heating it. If I recall, it was very simple: something like, if the electron requires 6 eV to overcome the band gap and it's given 7 eV, then the electron has a final kinetic energy of 1 eV, which is expended thermally (the leftover KE creates collisions in the lattice → heat).
In case you missed it because of all the posts, here's just the bottom line:
Russ said:
But there's no need to continue arguing this. You can pull a spec sheet for a server and see for yourself. Here's one, randomly googled: http://h20000.www2.hp.com/bc/docs/support/SupportManual/c01038153/c01038153.pdf [Broken]

Page 54:
Electrical input: 764 Watts
Heat output: 2604 BTU
Heat output = electrical input.
Regardless of this specific example, I believe that if you think about it microscopically rather than macroscopically, you see that a lot of the energy is going toward processes that AREN'T heat. In the long run all energy WILL turn into heat (entropy ftw); however, I'm talking about amounts and time scales on the order of "will this device heat my apartment more than my heater", in which case I don't think it will.
For a computer, "the long run" is on the order of nanoseconds to microseconds for everything except the fans, which convert their power to heat in a few seconds.

And again: you still haven't thought of such a process, have you?

uart
Sorry, I haven't read all the replies in this thread as I have limited time today, so apologies if I end up repeating things that have already been said, but I'll add my two cents' worth anyway.

The simple answer is that unless you have a device that is either,

1. Converting electrical energy into stored energy (such as chemical or gravitational potential). For example, a battery charger.

or

2. Exporting power by design. Say for example I had a 2kW motor inside the house, but exporting that power through a drive-shaft to outside in the yard where it was driving something.

then the vast majority of the electrical energy consumed by any appliance (TV, computer, vacuum cleaner, etc.) *will* be converted to heat within the house.

Take the TV example given by the OP. A TV consuming, say, 250 watts will have all but at most a few watts converted to heat. No energy is exported or stored, so it must all be converted to heat + light + sound + RFI within the house (mostly heat). Though the energy involved in the light and sound will be small, most of it will be absorbed and converted to thermal energy in the walls and furnishings anyway. Typically less than one or two watts would escape the house as light or sound, and a negligible amount would escape as RFI. In summary, 99% of the electrical energy consumed by the TV would normally be converted to heat within the house!
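uart's accounting amounts to a simple energy balance; the escape figures below are the illustrative "a watt or two" values from the post, not measurements:

```python
def heat_retained_w(input_w, escaped_light_w=0.0, escaped_sound_w=0.0,
                    escaped_rfi_w=0.0, stored_or_exported_w=0.0):
    """Conservation of energy: whatever a device draws and does not store,
    export, or radiate out of the house must end up as heat indoors."""
    return (input_w - escaped_light_w - escaped_sound_w
            - escaped_rfi_w - stored_or_exported_w)

heat = heat_retained_w(250, escaped_light_w=1.0, escaped_sound_w=1.0,
                       escaped_rfi_w=0.1)
print(heat)  # roughly 248 W of the TV's 250 W heats the house
```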

Tell your friend he is a fool; you are exactly right. The power of a heater is designed only to heat a house/room, etc. The power from a PC power supply is to power a PC; heat is just an undesired side effect.

It's like saying that a tank made of solid uranium and a race car made of carbon fiber have the same engine, so they have to go the same speed.
That's very wrong.

Tell your friend to put on a coat and stop arguing.

Averagesupernova