Hello, I was in a discussion with a friend about cooling systems for computers. We were discussing different air cooling methods for CPUs, and he claimed that if the fan was spinning at a high enough RPM, the heat transfer from the heat sink to the surrounding air would drop in efficiency. His argument was that if the air was moving fast enough, the heat would not have time to transfer into the air passing over the fins. I have been building computers for many years and thought I was fairly well versed in the design and function of the various systems, but I have to admit I have never heard of anything like this. The nearest thing I could think of is how air heats up when you compress it, but we are not talking about air compression here; we are talking about standard fans moving air through a finned piece of aluminum. If anyone has information about this, I would love to hear it. Frankly, the concept seems very confusing.
There is a grain of truth to it, but it's probably not relevant. The faster the air moves, the less heat a given volume of air absorbs on its way through; that much is true. But a larger volume of air is being moved per second, so the overall heat transfer rate is still higher, even though the "efficiency" per unit volume is lower.
How much air are we talking about? The typical heat-sink fan is 120 mm square, and typical fans move anywhere from 40 CFM to 90 CFM. How many CFM would we need to move before we saw diminishing returns?
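For scale, here is a rough conversion of those CFM figures to the average air velocity through the fan opening (this assumes all of the flow passes through the full 120 mm x 120 mm area, which is a simplification):

```python
# Convert fan airflow (CFM) to an approximate face velocity through a
# 120 mm x 120 mm opening. The 40-90 CFM range is from the post above;
# treating the whole opening as the flow area is an idealization.
CFM_TO_M3S = 0.000471947   # 1 CFM in m^3/s
area = 0.12 * 0.12         # 120 mm square opening, in m^2

for cfm in (40, 90):
    velocity = cfm * CFM_TO_M3S / area   # m/s
    print(f"{cfm} CFM -> about {velocity:.1f} m/s face velocity")
```

So even at the top of that range the bulk air speed is only a few meters per second, nowhere near fast enough for "no time to transfer heat" effects to dominate.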
I think he has it backwards. As you increase the RPM of the fan, you are not only speeding up the air but also increasing the volume of air moved, as Russ pointed out. This will of course reduce the heat transferred to any given parcel of air (i.e. the air will exit cooler, reducing the process "efficiency", as Russ also pointed out). But the key point is that even though each parcel of air picks up less heat, the total heat transfer is greater because there is a greater volume of air to transfer it to. That is, more airflow gives you more cooling (up to a limit set by the system design).

Another factor is the heat transfer coefficient. As the transfer fluid's velocity increases, its heat transfer coefficient generally increases (to a point); this has to do with thinning of the boundary layer. Since convective heat transfer is defined as [itex]\dot{Q}=h_c A \Delta T[/itex], a larger coefficient is obviously beneficial (and is one reason forced convection cools a surface better than free convection). Also, because the air in contact with the cooling surface has less time to heat up, [itex]\Delta T[/itex] between the surface and the immediately surrounding air stays large, which is beneficial as well.
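A quick numerical sketch of those two competing effects. The h ~ v^0.8 scaling is a common turbulent forced-convection correlation, and all of the constants here (area, temperature difference, coefficient) are illustrative assumptions, not measured heat-sink data:

```python
# Sketch: as air velocity v rises, total heat removal Qdot = h*A*dT grows
# (because h grows roughly as v^0.8), while the temperature rise of the
# air stream itself falls (because more air shares the same heat).
def heat_transfer(v, A=0.05, dT=40.0, h0=10.0):
    """Convective heat removal in W, with h = h0 * v**0.8 (illustrative)."""
    h = h0 * v ** 0.8
    return h * A * dT

rho_cp = 1.2 * 1005.0   # air density * specific heat, J/(m^3*K)
duct_area = 0.0144      # 120 mm x 120 mm flow area, m^2

for v in (1.0, 2.0, 4.0):
    q = heat_transfer(v)
    vol_flow = v * duct_area                 # m^3/s of air moved
    air_rise = q / (rho_cp * vol_flow)       # exit-air temperature rise, K
    print(f"v = {v} m/s: Qdot = {q:.0f} W, air warms by {air_rise:.2f} K")
```

The printed numbers show exactly the pattern described: total heat transfer keeps climbing with velocity, while each parcel of air exits cooler.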
Hmm. Thinking about this more: since the heat dissipation of the chip is fixed, the energy balance on the air stream, [itex]\dot{Q}=\dot{m}c_p \Delta T_{air}[/itex], gives a simple hyperbola: every time you double the airflow, you halve the temperature rise of the air passing through the heat sink.
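That energy balance is easy to check numerically. The 65 W heat load is an illustrative assumption (a plausible CPU TDP), not a number from the thread:

```python
# Energy balance on the cooling air for a fixed chip heat load:
# Qdot = mdot * cp * dT_air, so dT_air = Qdot / (mdot * cp).
RHO_AIR = 1.2           # kg/m^3
CP_AIR = 1005.0         # J/(kg*K)
CFM_TO_M3S = 0.000471947

def air_temp_rise(q_watts, cfm):
    """Temperature rise (K) of the air stream carrying away q_watts."""
    mdot = RHO_AIR * cfm * CFM_TO_M3S   # mass flow, kg/s
    return q_watts / (mdot * CP_AIR)

for cfm in (40, 80):
    print(f"{cfm} CFM: air warms by {air_temp_rise(65.0, cfm):.2f} K")
```

Doubling the flow from 40 to 80 CFM halves the air's temperature rise exactly, since the two are inversely proportional.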
As Russ has just pointed out, increasing the air flow rate doesn't necessarily increase the "heat transfer rate". This is a common misconception. The heat to be removed from a CPU is a function of the CPU's operating condition. For example, for a CPU operating at nominal load, the heat dissipation is fixed by that condition and depends on the number of calculations, etc. that the CPU is handling at that instant.

Increasing the fan speed increases the air flow rate and accordingly increases the heat transfer coefficient (essentially due to a reduction in boundary layer thickness). In this case, if the convection equation is to remain balanced, the heat source (CPU) temperature must drop.

Convection equation: [itex]\dot{Q}[/itex] = hA([itex]T_{source}[/itex] - [itex]T_{ambient-air}[/itex])

where
Q = heat dissipation, fixed by the CPU = constant
A = heat transfer area = constant
Tambient = air temperature = relatively constant

Therefore, an increase in h (the heat transfer coefficient) must lead to a reduction in Tsource.
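To make that concrete, the convection equation can be rearranged for the source temperature. All numbers below (65 W load, 0.05 m^2 effective area, 25 C ambient) are illustrative assumptions, not real heat-sink data:

```python
# Rearranging Qdot = h * A * (T_source - T_ambient) for the source:
# T_source = T_ambient + Qdot / (h * A).
def source_temp(q_watts, h, area, t_ambient):
    """CPU surface temperature (C) needed to reject q_watts by convection."""
    return t_ambient + q_watts / (h * area)

# With Q, A, and T_ambient held constant, raising h pulls T_source down;
# doubling h halves the temperature difference above ambient.
for h in (20.0, 40.0, 80.0):
    t = source_temp(65.0, h, 0.05, 25.0)
    print(f"h = {h:4.0f} W/(m^2*K): T_source = {t:.1f} C")
```

This is exactly the balance described above: the heat load is fixed, so a better heat transfer coefficient shows up as a cooler chip, not as more heat removed.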