What Is the Temperature Differential of Hawking Radiation?

SUMMARY

Hawking radiation is theorized to originate from black holes, which possess a temperature inversely proportional to their mass. A black hole with a mass three times that of the sun has a temperature of roughly 2 × 10^-8 K, far below the cosmic microwave background (CMB) temperature of 2.7 K. This temperature differential suggests that Hawking radiation is not practically observable in the current universe. However, calculations indicate that as the universe expands and cools, conditions will eventually allow black holes to begin radiating.

PREREQUISITES
  • Understanding of black hole thermodynamics
  • Familiarity with cosmic microwave background (CMB) radiation
  • Knowledge of the second law of thermodynamics
  • Basic principles of general relativity
NEXT STEPS
  • Explore the implications of black hole temperature and mass relationships
  • Research the evolution of cosmic microwave background temperature over time
  • Investigate the conditions under which mini black holes could exist and radiate
  • Learn about modeling the future evolution of the CMB temperature using the Friedmann equation
USEFUL FOR

Astronomers, physicists, and researchers interested in black hole physics, thermodynamics, and the evolution of the universe will benefit from this discussion.

Ranku
Hawking radiation is supposed to emanate from black holes because black holes have a temperature. A black hole formed by stellar collapse has to be at least about 3 times the solar mass. A black hole with a few times the mass of the sun would have a temperature of only a few hundred-millionths of a degree above absolute zero. This is much lower than the 2.7 K ambient temperature of the microwave background radiation.
The second law of thermodynamics requires that heat flow only from a hotter body to a colder one. Thus it would seem Hawking radiation would not be practically possible in the present universe - it is only a theoretical principle.
As for mini black holes in the early universe, even though mini black holes would be hotter, the early universe would also have been hotter. So mini black holes would radiate and explode only if they were hotter than the ambient temperature.
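The temperature comparison above can be checked numerically. This is a minimal sketch using the standard semiclassical Hawking temperature formula T = ħc³/(8πGMk_B); the SI constants and the 3-solar-mass example are the only inputs.

```python
# Hawking temperature of a stellar-mass black hole vs. the CMB temperature.
# T_H = hbar * c^3 / (8 * pi * G * M * k_B), all quantities in SI units.
import math

HBAR = 1.054571817e-34   # J*s
C = 2.99792458e8         # m/s
G = 6.67430e-11          # m^3 kg^-1 s^-2
K_B = 1.380649e-23       # J/K
M_SUN = 1.989e30         # kg
T_CMB = 2.725            # K, present-day CMB temperature

def hawking_temperature(mass_kg: float) -> float:
    """Semiclassical Hawking temperature in kelvin."""
    return HBAR * C**3 / (8 * math.pi * G * mass_kg * K_B)

t_bh = hawking_temperature(3 * M_SUN)
print(f"T_H(3 M_sun) = {t_bh:.2e} K")       # ~2e-8 K, far below 2.7 K
print("colder than the CMB:", t_bh < T_CMB)  # prints True
```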

What do you guys have to say about this temperature differential issue?
 
Yes, this is correct, but it could be the case that the universe expanded and cooled quickly enough to allow the existence today of very small black holes that have temperature greater than 2.7 K, i.e., that are presently evaporating. Unfortunately, we have no accepted observational evidence for this.
 
George Jones said:
Yes, this is correct

So let me get this straight... What this implies is the following:

Say we have a very accurate measure of a black hole's mass. And (obviously hypothetically) say this black hole is isolated (i.e. truly in vacuum, so it does not accrete anything). We would not observe any mass change in the black hole until the CMB temperature drops below the Hawking radiation temperature, at which point it will begin to radiate and lose mass?

Note: The BH will obviously eat some of the CMB photons, but I don't think that changes the situation much.
 
I also believe strongly that Hawking radiation is a non-physical solution. Could there not be a boundary at the event horizon separating it from the surrounding spacetime? This would actually yield a stricter no-hair theorem. If one follows this idea through, it would mean that spinning and charged black holes are also non-physical solutions. To my knowledge, none of these things has been experimentally confirmed.
 
Nabeshin said:
So let me get this straight... What this implies is the following:

Say we have a very accurate measure of a black hole's mass. And (obviously hypothetically) say this black hole is isolated (i.e. truly in vacuum, so it does not accrete anything). We would not observe any mass change in the black hole until the CMB temperature drops below the Hawking radiation temperature, at which point it will begin to radiate and lose mass?

Note: The BH will obviously eat some of the CMB photons, but I don't think that changes the situation much.

Yes, the heat capacity of a black hole is negative, i.e., if a black hole is placed in a fridge, it warms up, and if it is placed in an oven, it cools down.

More details. The temperature of a black hole is inversely proportional to its mass. Let the temperature of the black hole be T_{black} and the temperature of an infinite heat bath be T_{bath}. Place the black hole in the heat bath. If T_{bath} < T_{black} (fridge), then energy flows out of the black hole and it loses mass, thus increasing its temperature. If T_{bath} > T_{black} (oven), then energy flows into the black hole and it gains mass, thus decreasing its temperature.
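The fridge/oven behaviour follows directly from T ∝ 1/M. Here is a toy sketch of that logic; the constant is ħc³/(8πGk_B) in SI units, while the mass step is purely illustrative, not a real evaporation or accretion rate.

```python
# Negative heat capacity: T is inversely proportional to M, so absorbing
# energy (gaining mass) lowers T, while radiating (losing mass) raises T.
KAPPA = 1.227e23  # K*kg: hbar*c^3 / (8*pi*G*k_B), so T = KAPPA / M

def temperature(mass_kg: float) -> float:
    return KAPPA / mass_kg

def step(mass_kg: float, t_bath: float, dm: float) -> float:
    """Move one (illustrative) mass step dm in the direction set by the
    temperature difference: a hotter hole loses mass, a colder one gains."""
    if temperature(mass_kg) > t_bath:   # "fridge": hole radiates, shrinks
        return mass_kg - dm
    else:                               # "oven": hole absorbs, grows
        return mass_kg + dm

m = 1.0e23  # kg -> T ~ 1.23 K, colder than a 2.7 K bath: the oven case
m_next = step(m, t_bath=2.7, dm=1e20)
print(temperature(m), "->", temperature(m_next))  # temperature drops further
```

Running the fridge case instead (e.g. m = 1.0e22 kg, so T ~ 12 K in a 2.7 K bath) shows the temperature rising as the hole loses mass, which is the runaway that ends in evaporation.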
 
Thanks for the explanation, George.

So for BHs in our present universe, the rate of cooling of the CMB is likely greater than the rate at which BHs gain mass and decrease in temperature. This would suggest that at some point in the future the temperature of the BH will exceed that of the CMB and radiation will begin. However, it is conceivable that the rates could have worked out the other way, with the CMB forcing the BH temperature down for all time. In such a universe, we would never observe a BH radiate. I suppose the last possibility would be an equilibrium point at which the CMB cools at a rate comparable to that of the decreasing BH temperature.

I feel a back of the envelope calculation coming on here. Our universe probably corresponds to possibility #1, but I'll check that first. Then, I'll see if I can figure out at what point we can expect stellar mass black holes to begin radiating.

Cheers, will post calculations later!
 
So, the closest thing I can find on the CMB temperature's evolution with time is the statement that it is inversely proportional to the universe's scale factor. From this, I form the equation:
T_{cmb}=\frac{T_{0}}{a(t)}
Where T_0 = 2.735 K. From the Friedmann equation we have,
\left(\frac{H}{H_0}\right)^2=\Omega_{R} a^{-4} + \Omega_{M} a^{-3} + \Omega_{K} a^{-2} + \Omega_{\Lambda}
Where H is the Hubble parameter.
Now, I solve this using the following choice of parameters to obtain a(t):
\Omega_M = .314 ; \Omega_{R} = 5 \times 10^{-5} ; \Omega_{\Lambda} = .73 ; H_0 = 71 km/s/Mpc

For the temperature of a black hole we have:
T_{BH}=\frac{\hbar c^3}{8 \pi G M k_b}
Taking a 10 solar mass black hole, we have a temperature of 6.1 x 10^-9 K. Setting this equal to T_{cmb} from the first equation and solving for time gives t = 3.11 x 10^11 yr, or 311 Gyr. For a 1 million solar mass black hole, T = 6 x 10^-14 K, and t = 500 Gyr.

Note: from the first equation, you can see that when the CMB and BH temperatures are equal, the scale factor is the ratio of the two, a = T_0/T_{BH}. For the case of the 10 solar mass hole, at t = 311 Gyr, the scale factor is about 440 million!
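A rough reproduction of this back-of-envelope estimate: find the scale factor at which T_0/a equals the Hawking temperature of a 10 solar mass hole, then integrate dt = da/(aH) using the Friedmann equation with the parameters quoted above (assuming Ω_K = 0 for flatness). The simple trapezoid integration below lands within a few percent of the 311 Gyr figure.

```python
# When does the CMB cool below the Hawking temperature of a 10 M_sun hole?
import math

T0 = 2.735            # K, present CMB temperature (value used above)
KAPPA = 1.227e23      # K*kg: hbar*c^3/(8*pi*G*k_B), so T_BH = KAPPA / M
M_SUN = 1.989e30      # kg
H0 = 71 * 1000 / 3.0857e22                      # 71 km/s/Mpc -> 1/s
OMEGA_R, OMEGA_M, OMEGA_L = 5e-5, 0.314, 0.73   # assuming Omega_K = 0

def hubble(a: float) -> float:
    """Hubble parameter H(a) from the Friedmann equation, in 1/s."""
    return H0 * math.sqrt(OMEGA_R * a**-4 + OMEGA_M * a**-3 + OMEGA_L)

def time_to_scale_factor(a_target: float, steps: int = 20000) -> float:
    """Trapezoid integral of dt = da/(a*H) from a = 1 (today) to a_target,
    done in u = ln(a) so the late exponential phase is handled cleanly."""
    u_max = math.log(a_target)
    du = u_max / steps
    total = 0.0
    for i in range(steps):
        f0 = 1.0 / hubble(math.exp(i * du))
        f1 = 1.0 / hubble(math.exp((i + 1) * du))
        total += 0.5 * (f0 + f1) * du
    return total  # seconds from today

t_bh = KAPPA / (10 * M_SUN)           # ~6.2e-9 K
a_eq = T0 / t_bh                      # scale factor at equality, ~4.4e8
t_gyr = time_to_scale_factor(a_eq) / 3.156e16
print(f"T_BH = {t_bh:.2e} K, a = {a_eq:.2e}, t = {t_gyr:.0f} Gyr")
# ~320 Gyr here; same ballpark as the ~311 Gyr quoted above, with the
# small difference coming from integration details and rounded parameters.
```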
 
BHs do not need to be more massive than the sun. Let's say somewhere in the universe someone builds an accelerator and smashes heavy ions together. They could make BHs with masses much less than the mass of the sun. Those would have very high temperatures (at least we hope they do ;) ).
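For scale, the same semiclassical formula applied to a microscopic black hole gives an enormous temperature. The 14 TeV collision energy below is purely illustrative; note that at such masses the result far exceeds the Planck temperature (~1.4 × 10^32 K), where the semiclassical formula itself stops being trustworthy.

```python
# Temperature of a hypothetical microscopic black hole, using the same
# semiclassical relation T = KAPPA / M. Illustrative only: at these masses
# the result is far above the Planck temperature (~1.4e32 K), i.e. the
# semiclassical treatment has broken down.
KAPPA = 1.227e23      # K*kg: hbar*c^3 / (8*pi*G*k_B)
EV = 1.602176634e-19  # J per electron-volt
C = 2.99792458e8      # m/s

collision_energy_j = 14e12 * EV       # illustrative ~14 TeV collision
m_micro = collision_energy_j / C**2   # mass-equivalent, ~2.5e-23 kg
print(f"M = {m_micro:.1e} kg, T = {KAPPA / m_micro:.1e} K")
```

Such a hole would be vastly hotter than any conceivable ambient bath, so the temperature-differential objection does not apply to it; the question is whether such objects can form at all.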
 
Thanks, guys, for your responses.
 
