Cooling a processor chip

AI Thread Summary
The discussion centers on the cooling technology in the iPhone 17, specifically the use of vapor cooling, which is compared to heat pipes that utilize evaporating water to manage heat. Concerns are raised about the processor's temperature and whether it could be slowed down to prevent overheating during intensive tasks. The conversation highlights the challenges of cooling in compact devices, noting that Apple has opted for active cooling solutions to enhance user satisfaction and prevent component failure. Additionally, the impact of low power mode on performance and user experience is debated, with potential trade-offs in app functionality and video quality. Overall, the conversation underscores the importance of effective thermal management in modern smartphones.
sophiecentaur
I have been idly browsing what Apple have to offer with their new iPhone 17. There is mention of 'Vapour cooling' to deal with the heat generated. Would that be the same sort of idea that was used in 'Heat Pipes', where water evaporated at the processor end and liquid water was returned from the cool end back along a wick?

At the extreme high-power end, vapour phase cooling has been used in multi-kW RF transmitters, where (pure) water was pumped to the anode (or, alternatively, the collector) and the vapour passed through a cooling matrix.

The idea of that going on in my pocket is strangely exciting, and so many other features are pretty impressive. But I do wish the iPhone processor could be slowed down to avoid my leg getting burned when the phone is thinking too hard. 'Won't change', rather than 'can't change', I suspect.
 
sophiecentaur said:
same sort of idea that was used in 'Heat Pipes'
I would think so.
The case is the cool-end heat radiator.
 
If the fluid is pure water, then the capillaries that wick the liquid water back to the hot end will need to survive ice wedging when freezing, and the enclosure will also need to be chemically non-reactive with water. In heat pipes that are exposed to freezing conditions, propylene glycol, or a simple alcohol, is added to the water as an antifreeze.
 
@Baluncore I’m not sure about the electrical insulation of a mixture with antifreeze. Vapour phase cooling in a big transmitter would probably not suffer from icing. Two cooling loops would solve the problem.
It looks like some heat pipe systems may use non-pure water (if other factors don’t arise).

Do you have any knowledge regarding my iPhone query?
 
Does the iPhone 17 processor really get that hot? There are liquid coolers for desktop PC processors. I don't know the details of how they work. This recent test might interest you.
 
Inside the smartphone there is little room for convective or radiative cooling. The last option, conduction, can be split into passive and active. The passive can use the circuit board or a specialized heat-conductive sheet (suited to such a thin space) to remove the heat from the components. For the i17, an active solution was chosen by Apple.
If the heat from a component is not removed, its temperature will increase and, if unlucky, reach a point where the component will throttle, shut down or completely fail. By adding passive or active heat removal, one is more 'sure' that the equipment (i17) will meet expectations of user satisfaction.
Such is the reason for the tech details in the user guide, where it states operating conditions between such-and-such ambient temperatures. Active cooling should allow usage in more extreme and prolonged higher ambient temperature conditions.

Perhaps Apple was getting complaints of product failures, and decided to up their game.
 
FactChecker said:
Does the iPhone 17 processor really get that hot?
Not a straightforward question, in fact. Years ago, someone made the point to me that what counts is not the surface temperature of the package but the temperature of the chip. Just as with chains of electrical resistances, the overall temperature drop will be the sum of the ΔT steps on the way out. Merely increasing the heat transfer capability of the cooling system can make the surface temperature of the package as low as you want, but the chip itself will still be hotter and can exceed the safe value.
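As an aside, the resistor-chain analogy can be put into numbers. A minimal Python sketch (all figures are made up for illustration, nothing here comes from Apple) of why a better external cooler lowers the case temperature but cannot remove the fixed ΔT between die and case:

Code:
# Thermal resistances add in series, like electrical resistances:
# T_junction = T_ambient + P * (R_jc + R_cs + R_sa)

P = 5.0           # chip dissipation in watts (assumed)
T_ambient = 25.0  # ambient temperature in deg C

R_jc = 2.0   # junction to case, K/W (hypothetical)
R_cs = 0.5   # case to spreader / vapour chamber, K/W (hypothetical)
R_sa = 8.0   # spreader to ambient, K/W (hypothetical)

T_case = T_ambient + P * (R_cs + R_sa)            # what you can measure
T_junction = T_ambient + P * (R_jc + R_cs + R_sa) # what actually matters

print(f"Case    : {T_case:.1f} C")      # 67.5 C with these numbers
print(f"Junction: {T_junction:.1f} C")  # 77.5 C with these numbers

# Halving R_sa drops both temperatures by 20 C, but the die still sits
# P * R_jc = 10 C above the case; only the chip's own design fixes that.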

The liquid coolers mentioned by @FactChecker can allow you to over-cook a processor chip, and the original makers of the computers cannot be held responsible. Neither can the kit supplier, because they'll put suitable warnings on the box. Boy racers will always take risks; fair enough.
256bits said:
meet expectations of user satisfaction
But, as with smokers and heavy drinkers, the consumer can be 'satisfied' at the same time as doing damage; advertising can be responsible for this. The fate of the processor is up to the way the owner treats it. I have no interest in over-clocking my computers, nor in extra cooling systems. My question was about how Apple do it in their latest phones.

I will repeat my supplementary question, made earlier, about the possibility of slowing the processor to reduce dissipation. They already have a low power mode for when the battery is low, so the principle is already established.
 
sophiecentaur said:
I will repeat my supplementary question, made earlier, about the possibility of slowing the processor to reduce dissipation. They already have a low power mode for when the battery is low, so the principle is already established.
What happens in low power mode to extend battery energy, besides the screen going dimmer?
Do certain apps become non-operable?
Wi-Fi is power hungry, as is video, whether watching a cat TikTok or recording someone's belly flop into a pool.
Throttling the CPU would mean less data throughput, so something has to compensate somewhere.

These could be possibilities with throttling, paying attention to the above examples:
The cat video either becomes jumpy, or drops to a lesser resolution, lessening user satisfaction.
Same for the belly-flop recording, and playback is of a lesser quality, disappointing the user.
The user grumbles about paying $1000 for the i17; he/she might as well have stayed with the previous model.

I guess hot phones are a problem, as PCMag wrote this:
https://www.pcmag.com/how-to/what-to-do-if-your-phone-is-overheating
They say (looking at you, Galaxy Note 7 and iPhone 15 Pro).
 
256bits said:
What happens in low power mode to extend battery energy, besides the screen going dimmer?
I only quoted that to make the point that users are prepared to accept reduced performance when they need to (in this case, to extend battery life).

Each time a logic element changes state, a tiny amount of charge flows. A circuit with millions of elements, running at GHz frequencies, will take significant power, even with only pF capacitances involved. The clock rate will therefore govern the rate of battery consumption and (P = IV) the power dissipation. Generations of logic have achieved lower battery drain (and cooler operation) by lowering Vcc. Lower logic voltages mean worse susceptibility to interference between units, so the design needs to take this into account. It affects all levels of use; phones can last longer between charges when the supply volts are lower, and switching centres, which consume breath-taking amounts of energy, could save energy by lowering supply volts (but this introduces other problems).
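To put rough numbers on that, here is a minimal Python sketch of the standard dynamic-power relation for CMOS logic, P = aCV²f. Every value below is assumed, purely for illustration:

Code:
# Dynamic (switching) power of CMOS logic: P = a * C * V**2 * f
a = 0.1    # activity factor: fraction of nodes switching each cycle (assumed)
C = 1e-9   # total switched capacitance in farads (assumed)
V = 0.8    # supply voltage in volts (assumed)
f = 3e9    # clock frequency in hertz (assumed)

P = a * C * V**2 * f
print(f"Dynamic power: {P * 1000:.0f} mW")  # 192 mW with these numbers

# Halving f halves P linearly, but dropping V from 0.8 V to 0.6 V cuts
# the V**2 term to about 56%, which is why every logic generation
# chases a lower Vcc, not just a lower clock.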
 
  • #10
sophiecentaur said:
Each time a logic element changes state, a tiny amount of charge flows.

I can see your argument in all of this. When is enough, enough? Technological 'progress' continues as if in some kind of evolutionary framework. Nowhere is it ever said, "OK. Let's stop here. We have achieved enough. People should be happy with what they've got." But there will always be that one guy who finds out that by patting his head and rubbing his tummy, an extra 1% can be squeezed out in performance. And the race is on once more.
 
  • #11
sophiecentaur said:
I do wish the iPhone processor could be slowed down to avoid my leg getting burned when the phone is thinking too hard.
As far as I know, the relevant ARM 'sister' CPUs have supported adaptive frequency and voltage scaling for some time already, so the capability very likely exists at the hardware level.

It is just that Apple has that kind of thinking where 'user experience' rules all and cannot be tinkered with, so the control algorithm likely prefers performance over running cool, with no means of user intervention.
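For what it's worth, you can't poke at this on an iPhone, but on Linux-based ARM boards the equivalent DVFS machinery is exposed through the kernel's cpufreq interface in sysfs. A minimal sketch (standard sysfs paths; whether they exist depends on the kernel and board):

Code:
from pathlib import Path

base = Path("/sys/devices/system/cpu/cpu0/cpufreq")

def read(name: str) -> str:
    # each cpufreq file holds one value; frequencies are in kHz
    return (base / name).read_text().strip()

if base.exists():
    print("governor :", read("scaling_governor"))  # e.g. 'schedutil'
    print("current  :", int(read("scaling_cur_freq")) // 1000, "MHz")
    print("range    :", int(read("scaling_min_freq")) // 1000, "to",
          int(read("scaling_max_freq")) // 1000, "MHz")
else:
    print("cpufreq not exposed on this system")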
 
  • #12
256bits said:
When is enough, enough? Technological 'progress' continues as if in some kind of evolutionary framework. Nowhere is it ever said, "OK. Let's stop here. We have achieved enough. People should be happy with what they've got."
It depends on the applications. For a PC, gamers have driven the demands for increased CPU and graphics capability for a long time. Currently, AI and neural networks are driving demands for increased power.
Users like myself benefit even though I never come close to really needing that capability.
IMO, those trends will impact smartphones also.
 
  • #13
Rive said:
Apple has that kind of thinking where 'user experience' rules all and cannot be tinkered with,
True, up to a point, but Apple are never above pulling the wool over the consumers' eyes. Virtually no one ever does numerical analysis of their iPhone's performance. "User experience (and expectation)" can be manipulated all the way from advertisement to screen. My MacBook Pro gives warnings about power consumption by individual apps, and I often shut windows down as I've been advised to. I hear the fan start up now and then.
 
  • #14
256bits said:
When is enough, enough?
That could be a fruitful thread on its own but we all know there is no limit, only disposable income.
 
  • #15
256bits said:
But there will always be that one guy who finds out that by patting his head and rubbing his tummy, an extra 1% can be squeezed out in performance
And that guy is probably an accountant (or his helper).
 
  • #16
sophiecentaur said:
And that guy is probably an accountant (or his helper).
Whereby that one guy who finds out that by patting his head and rubbing his tummy, an extra 1% can be squeezed out, not in performance, but in profit.
 
  • #17
FactChecker said:
For a PC, gamers have driven the demands for increased CPU and graphics capability for a long time. Currently, AI and neural networks are driving demands for increased power.
I'm about to fetch a new PC for my mom (closer to 90 than to 80). Her old one (some i5 3rd gen with 4 GB RAM) is no longer able to properly run some Facebook games (quiz...) and YouTube stuff.

Games and AI it is? Really?
 
  • #18
Rive said:
I'm about to fetch a new PC for my mom (closer to 90 than to 80). Her old one (some i5 3rd gen with 4 GB RAM) is no longer able to properly run some Facebook games (quiz...) and YouTube stuff.

Games and AI it is? Really?
I'm surprised at that. For Windows 10, 4 GB RAM is a little tight, but should work. And I wouldn't expect processor speed to be an issue on FB. Do you know what goes wrong?

PS. With the Firefox browser I had a problem where it seemed to accumulate a lot of memory (memory leak?). I never figured it out.
 
  • #19
FactChecker said:
Do you know what goes wrong?
Bloatware meets too many layers of abstraction, through the browser and OS down to the hardware, demanding an absurd amount of CPU power even for trivial tasks.
That's all.
 
  • #20
Rive said:
Bloatware meets too many layers of abstraction, through the browser and OS down to the hardware, demanding an absurd amount of CPU power even for trivial tasks.
That's all.
In the Performance Monitor, is the CPU or the memory maxed out?
I don't understand what in FB requires a lot of CPU power.
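If anyone wants to check that without opening Performance Monitor, a quick cross-platform sketch using the third-party psutil package (pip install psutil) would look like this:

Code:
import psutil

cpu = psutil.cpu_percent(interval=1)  # CPU load sampled over one second
mem = psutil.virtual_memory()

print(f"CPU    : {cpu:.0f}% busy")
print(f"Memory : {mem.percent:.0f}% of {mem.total / 2**30:.1f} GiB used")

# Memory sitting near 100% means the machine is swapping to disk,
# which looks exactly like a 'slow CPU' to the user.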
 
  • #21
FactChecker said:
I don't understand what in FB requires a lot of CPU power.
Extra memory is cheap, and 4 GB is really on the small side if you have Windows running. Running Windows 10 with Parallels on my i5 16 GB MacBook Pro, I can see Parallels assigns about 8 GB, and there are no problems.

I have a friend who bought almost the cheapest PC he could find (4 GB) and it fell over all the time with Zoom. Bringing it up to 8 GB sorted out the problem, all for about £30.

Why not risk a punt on more memory in the existing (I can't call it "old") machine? Unless the tasks are really demanding, the OP could waste a load of money on something that's not needed. Don't listen to 'enthusiasts'; they just equate price and spec with value.
 
  • #22
sophiecentaur said:
Extra memory is cheap, and 4 GB is really on the small side if you have Windows running. Running Windows 10 with Parallels on my i5 16 GB MacBook Pro, I can see Parallels assigns about 8 GB, and there are no problems.
I agree that 4 GB is marginal. I recommend that the memory usage be checked using the Performance Monitor.
sophiecentaur said:
I have a friend who bought almost the cheapest PC he could find (4 GB) and it fell over all the time with Zoom. Bringing it up to 8 GB sorted out the problem, all for about £30.

Why not risk a punt on more memory in the existing (I can't call it "old") machine? Unless the tasks are really demanding, the OP could waste a load of money on something that's not needed. Don't listen to 'enthusiasts'; they just equate price and spec with value.
i5 3rd gen came out in 2012. Putting more memory in a 13-year-old computer doesn't seem like the best decision. I believe that a current computer can be bought fairly cheaply that has a lot more capability and will come with Windows 11 preinstalled. Windows 11 will give her computer support for many years.
 
  • #23
FactChecker said:
I agree that 4 GB is marginal. I recommend that the memory usage be checked using the Performance Monitor.

i5 3rd gen came out in 2012. Putting more memory in a 13-year-old computer doesn't seem like the best decision. I believe that a current computer can be bought fairly cheaply that has a lot more capability and will come with Windows 11 preinstalled. Windows 11 will give her computer support for many years.
I wouldn't disagree strongly with that; it might make sense, were it not that extra memory is so cheap. And how future-proof does she need the machine to be? Is she going to take up activities needing a high spec? In that case a mid-price new machine would probably not be future-proof either. (My mate's "good value" machine was certainly not future-proof.)

But the OP will probably be able to select something with a few years' capability. We can't foretell the future, but remote (cloud) computing power could be available for most uses.

The Windows world is full of disgruntled users who resent upgrading their systems every time a new Windows comes along. I can sympathise, which is why I glide effortlessly from macOS to macOS, hardly noticing it's happened.
 
  • #24
sophiecentaur said:
The Windows world is full of disgruntled users who resent upgrading their systems every time a new Windows comes along. I can sympathise, which is why I glide effortlessly from macOS to macOS, hardly noticing it's happened.
Linux is a good option. I have switched my old laptop to it because it didn't have the hardware that Windows 11 wants. There might be a Linux distro that is designed for older people who do not want to mess with learning a whole new OS.
 
  • #25
Rive said:
I'm about to fetch a new PC for my mom (closer to 90 than to 80). Her old one (some i5 3rd gen with 4 GB RAM) is no longer able to properly run some Facebook games (quiz...) and YouTube stuff.

Games and AI it is? Really?
It's most unlikely that the PC itself has slowed down, unless the ventilation is clogged or the fan has failed. It's far more likely that bloatware you don't need is what's causing problems, most likely a memory shortage from filling the RAM with junk.
 