# Earth's Internal Heat Source

1. Mar 24, 2018

### Facial

~~~~~~~~~~~~~~
1. Heat left over from the planet's initial formation. In the 1860s Lord Kelvin modelled the Earth as a homogeneous conducting sphere with a uniform initial temperature of 3900 K and (erroneously) estimated the age of the Earth at 10-100 million years. Accounting for convection, as John Perry argued, brings the same model into agreement with observed geothermal gradients at ages of up to 3 billion years, still using Kelvin's initial temperature of 3900 K. Planetary accretion generates temperatures in excess of 10,000 K, although some of this heat is lost to space.
https://websites.pmc.ucsc.edu/~pkoch/EART_206/09-0108/Supplemental/England et 07 AmSci 95-342.pdf

2. Heat generated by radioactive decay. Studies of neutrinos from Earth's interior, or "geoneutrinos", are validated by comparing their energy spectra with those of neutrinos emitted by known decaying isotopes in nuclear reactors. Estimates place the total heating power of radioactive decay at about 20 TW, roughly half of the heat flow leaving the Earth's interior.
https://authors.library.caltech.edu/25422/1/Gando2011p15815Nat_Geosci.pdf
~~~~~~~~~~~~~~
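Kelvin's conduction argument in point 1 can be reproduced with a quick calculation. This is only a sketch: the thermal diffusivity and geothermal gradient below are typical textbook values I am assuming, not the exact figures Kelvin used.

```python
import math

# Kelvin's conductive half-space model: a body cooling from a uniform initial
# temperature T0 develops a surface gradient dT/dz = T0 / sqrt(pi * kappa * t),
# so the implied age is t = T0^2 / (pi * kappa * gradient^2).
T0 = 3900.0          # initial temperature, K (the value quoted above)
kappa = 1.0e-6       # thermal diffusivity of rock, m^2/s (typical value, assumed)
gradient = 0.036     # near-surface geothermal gradient, K/m (~36 K/km, assumed)

t_seconds = T0**2 / (math.pi * kappa * gradient**2)
t_myr = t_seconds / (3.156e7 * 1e6)   # seconds -> million years
print(f"Kelvin-style age estimate: {t_myr:.0f} Myr")   # ~118 Myr
```

With these inputs the age lands near the upper end of Kelvin's 10-100 million year range, which is the point of the England et al. paper: the model, not the arithmetic, was the problem.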

So now, depending on the source, I am left with the impression that the geophysical community is not in unanimous agreement about the proportions in which these two contributions are responsible for Earth's interior heat. Some sources say that planetary formation is the dominant one, while others say that radioactive decay has become more important after billions of years.

My own (uneducated) opinion is that planetary formation is still the dominant source, because the heat from radioactive decay is compared only to the heat leaving Earth's interior, not to its total heat content. The estimate of 3 billion years is already impressive given that Perry used the same initial temperature as Lord Kelvin, which we now know is far too low. I'm asking because I've been confused about this issue for a while.

2. Mar 24, 2018

### 256bits

Citation?

3. Mar 24, 2018

### phyzguy

Well, I don't really know, but the Wikipedia entry says:

"Estimates of the total heat flow from Earth's interior to surface span a range of 43 to 49 terawatts (TW) (a terawatt is $10^{12}$ watts).[9] One recent estimate is 47 TW,[1] equivalent to an average heat flux of 91.6 mW/m², and is based on more than 38,000 measurements. The respective mean heat flows of continental and oceanic crust are 70.9 and 105.4 mW/m².
While the total internal Earth heat flow to the surface is well constrained, the relative contribution of the two main sources of Earth's heat, radiogenic and primordial heat, are highly uncertain because their direct measurement is difficult. Chemical and physical models give estimated ranges of 15–41 TW and 12–30 TW for radiogenic heat and primordial heat, respectively, and recent results indicate their contributions may be roughly equal."
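The quoted average flux can be checked against the quoted total flow, since it is just total heat flow divided by Earth's surface area:

```python
import math

# Sanity check: total heat flow / Earth's surface area should reproduce the
# stated average flux of ~91.6 mW/m^2.
R_earth = 6.371e6                      # Earth's mean radius, m
area = 4 * math.pi * R_earth**2        # ~5.1e14 m^2
total_flow = 47e12                     # W (the 47 TW estimate quoted above)
flux_mW = total_flow / area * 1e3      # convert W/m^2 -> mW/m^2
print(f"average flux: {flux_mW:.1f} mW/m^2")   # ~92 mW/m^2
```

The small difference from the quoted 91.6 mW/m² presumably reflects the reference surface area used in the original estimate.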

So I would say that while the two sources are of roughly equal magnitude, nobody really knows which one is dominant. I doubt you'll find a better answer, but you can probably find many opinions.

Last edited by a moderator: Mar 25, 2018
4. Mar 25, 2018

### Facial

This is a response to a question in Scientific American given by Dr. Quentin Williams, who specializes in the deep interior. I searched "Why is the earth's core so hot" on DuckDuckGo. (I'm not sure whether current posting guidelines allow me to give links to periodicals?)

5. Mar 25, 2018

### rootone

It's OK to give links to respectable popular media like Scientific American, where most of the articles refer to real scientific research.

As far as I know, everyone agrees that the internal heat of Earth exists because:
1. It started hot and is losing that internal heat only quite slowly.
2. Radioactive decay helps it to stay hot.
Since we can't actually look at the core and see how much radioactive material is there, we can't know exactly how much heat it generates, but it is probably a significant amount.

Last edited: Mar 25, 2018
6. Mar 26, 2018

### Ophiolite

The discussion so far has overlooked the importance of the so-called iron catastrophe. In this event, rising temperatures from abundant radioactive elements in the early Earth melted the iron and nickel which, because of their greater density, then sank towards the centre. The gravitational potential energy thereby released raised temperatures further.

Brief wikipedia article.

7. Mar 26, 2018

### phyzguy

Isn't this just another component of the primordial heat?

8. Mar 26, 2018

### 256bits

Here's the article
https://www.scientificamerican.com/article/why-is-the-earths-core-so/
The article does not explain how the stated 10,000 K for the accretionary process was calculated, which is what I would like to know. I understood from previous literature that the agglomeration of smaller bodies to form the Earth produced temperatures of only a few thousand degrees.
Of course it all depends on the kinetic energy of the smaller objects, the time frame, and how much of the heat is radiated away, all of which, I would think, involves some guesswork.

Just using some back-of-the-envelope figures:
Me = 1 × 10^24 kg
v = 5000 m/s
KE = ½Me·v² = 1.25 × 10^31 J
silicate Cp = 1 kJ/kg/K

then the temperature rise ΔT = KE/(Me·Cp) comes out close to the 10,000 K.
If the KE is less, or more, reflected in v (m/s), then that changes the ΔT.
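The back-of-the-envelope figures above can be sketched in a few lines. Note that the mass cancels: if all kinetic energy goes into heat, ΔT = v²/(2·Cp), so only the assumed impact speed and specific heat matter.

```python
# Back-of-the-envelope accretion heating, using the figures from the post.
# Assumes all impact kinetic energy is retained as heat (none radiated away).
M = 1e24        # kg, the mass used in the post (Earth is actually ~6e24 kg)
v = 5000.0      # m/s, assumed impact speed
Cp = 1000.0     # J/(kg K), rough specific heat of silicate rock

KE = 0.5 * M * v**2       # ~1.25e31 J
dT = KE / (M * Cp)        # equivalently v**2 / (2 * Cp); mass cancels
print(f"KE = {KE:.3g} J, dT = {dT:.0f} K")   # dT = 12500 K
```

A ΔT of about 12,500 K is indeed "close to the 10,000 K", and scales with v², so halving the impact speed would drop it to roughly 3,000 K.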

9. Mar 27, 2018

### Ophiolite

No. The primordial heat is equivalent to the "Heat left over from the planet's initial formation", referenced in the OP. i.e. it is heat derived from the accretion process wherein kinetic energy is converted to thermal energy.

The iron catastrophe follows the completion of that process. (In one sense the process is never complete. We still accrete around 50,000 tons of matter a year.) The consensus view is that radioactive heating led to large scale melting after major accretion ended. It was at this point that gravity segregation of the iron and nickel occurred. The process is distinct and that distinction imparts a quite different character to the planet than would otherwise have been the case.

The importance and nature of the iron catastrophe have been known for well over half a century, so it is difficult to find a recent review paper that sets out the details. The concept evolved over time. For example, Harold Urey remarked, in a 1951 paper, "The gravitational energy due to the formation of the core during geological time makes it difficult but not impossible to account for a solid earth, if it is added to the radioactive energy." While, by the early 70s, the authors of this paper (abstract only) could state with confidence " . . .the change in gravitational potential energy increasing the average temperature of the Earth by some 2,000 K. In these circumstances core formation would be the major event in the thermal evolution of the Earth."
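The ~2,000 K figure quoted above can be sanity-checked by asking how much energy it represents. The whole-Earth average specific heat below is a crude assumption, so this is only an order-of-magnitude sketch:

```python
# Rough energy budget for the iron catastrophe: the thermal energy needed to
# raise the whole Earth's average temperature by the quoted ~2,000 K.
M_earth = 5.97e24    # kg
Cp = 1000.0          # J/(kg K), crude whole-Earth average (assumption)
dT = 2000.0          # K, the rise quoted from the core-formation literature

E = M_earth * Cp * dT
print(f"required energy: {E:.2g} J")   # ~1.2e31 J
```

That is of order 10^31 J, a few percent of Earth's total gravitational binding energy (of order 10^32 J), so releasing it by letting the dense iron-nickel fraction sink to the centre is energetically plausible.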

10. Mar 27, 2018

### 256bits

11. Apr 17, 2018

### Timber

Are we ignoring the Theia-Gaia collision, which evidently destroyed perhaps 75% of the primordial crust (the remaining 25% being, more or less, the extant continental cratons) and formed the Moon? And what about the quite significant tidal kneading from the Moon's orbit?

12. Apr 18, 2018

### Ophiolite

Not ignoring it: the collision is implicitly part of the accretion process. I avoided referencing it directly, since it had the potential to drive the thread off-topic.

My impression was that the impact had destroyed the entire crust. I would be interested in any references supporting only partial destruction.

The contribution of tidal heating has been overlooked by everyone until now.

13. Apr 18, 2018

### Timber

<<My impression was that the impact had destroyed the entire crust. I would be interested in any references supporting only partial destruction.>>

My reference is the existence of the continental cratons and the oceans as they are: unless one has a better explanation (fractional differentiation, anyone?), they exist as they are as a result of the hypothesized Theia-Gaia collision. Dry land covers only about a quarter of the globe today, but due to collisions and accretion (the Indian subcontinent into Asia, uplifting the Himalayas, for example), the original proportion may have been somewhat different; continental crust as a whole covers about 40%. We have no other planetary model of lighter but ancient landmasses adrift among thinner shells of much more plastic ocean-floor plates, no part of which is more than about 250 million years old (a small fraction of the known age of the Earth), and no theory of any other similarly energetic event capable of destroying the greater part of an original, homogeneous crust covering the entire planet (which would conform more closely to other observed planetary bodies).

Some sort of differentiation between continental crust and a much thinner ocean floor seems to have existed from the earliest possible date; the continents were not simply uplifted ocean floor, nor, for the most part, the result of flood-basalt volcanism. Felsic versus mafic. The Deccan Traps, for example, covered existing continental masses but did not create them. Our main examples of landmass orogeny today seem to be based either on rifts, as in Iceland, or on mantle plumes, as in Hawaii, neither of which is remotely continental in scale.

And doesn't fractional differentiation depend just as much on the Giant Impact hypothesis? Whether the existing continents were crustal remnants or not?

BTW, I'm of the opinion that we've been drastically underestimating the contribution of tidal kneading. Pre-flyby, I'd predicted the possibility of liquid water on the surface of Pluto based on that body's extremely tight orbit with Charon, and I don't think I was far off.

14. Apr 18, 2018

### rootone

Seriously?

15. Apr 19, 2018

### Ophiolite

I was hoping for a citation to a peer-reviewed paper that postulated survival of a portion of the crust in the Moon-forming impact, rather than a speculation, intriguing as it is. I'm interested in this period in Earth's history and do not recall any suggestion, in any of the literature, that some of the crust survived, although I concede that more recent work pointing to an oblique blow, such as Barr, On the Origin of Earth's Moon (2016), renders your suggestion plausible.

You seem suspicious of fractional crystallisation as a method for crustal formation. It seems, generally, to satisfy most of the geological community. Do you have a specific objection? This 2014 paper offers a reasonably current view of crustal formation; I don't believe surviving cratons have any place in it.

In regard to tidal heating on Pluto, I seem to recall reading a post-New Horizons paper that dismissed its significance. I'm presently trying to locate it and will post a link if successful.

Edit: No luck on that paper yet, but I did locate this comment from a 2014 pre-New Horizons study. (lines 414-415):
"In particular, tides raised by Neptune on Triton are important, while tides raised by Charon on Pluto are expected to have almost no effect."

From Nimmo & Spencer, "Powering Triton's recent geological activity by obliquity tides: Implications for Pluto geology"

Last edited: Apr 19, 2018
16. Apr 27, 2018

### Timber

We have routinely underestimated the cumulative heating from tidal kneading over time by a large factor. To me, it is obviously why the Earth has such a hot interior (our Moon) while Mars is now so cold inside (it probably "ate" a major moon that didn't orbit fast enough, according to recent theory; perhaps the origin of Hellas Planitia), why Pluto, in a tight orbit with Charon, shows clear evidence of significant recent surface melt cycles, and why Europa, in Jupiter's embrace, has a very large and turbulent salt-water ocean.

My rough maths leads me to conclude that the extant continents, more or less (because SOME fractional differentiation does occur, plus lots of volcanic orogeny), were part of the original crust, mainly because of the amount of iron they contain. If they had arisen much later from sea-floor sediments, we could expect them to be much less iron-rich than they are, because by then their foundation material would have been post-"iron catastrophe" (in which nickel and iron fell from the mantle to the core).

An alternate theory might hold that the original crust was 100% destroyed, the iron contained in it remelted and swallowed by the core, but that the new continents which formed by fractional differentiation were replenished by iron meteorites during the Late Heavy Bombardment. That would still presuppose a very early origin for the continents, which would thereafter remain quite stable.

Which brings me to the obvious point that parts of some continents (Australia) are proven to be extremely old: 4.375 billion years, plus or minus only 6 million years, by zircon dating, which is very solid peer-reviewed science. The Theia-Gaia collision is theorized to have taken place almost contemporaneously. So to convince me of fractional differentiation, you'd need a very rapid process indeed, one which subsequently stopped and hasn't restarted since, through Pangaea and everything. Obviously, to proceed further with this conversation, the "missing link" is a similarly tight date for the Theia-Gaia collision, so that a more coherent sequence can be established and theories ruled in or out as a result.

https://www.livescience.com/43584-earth-oldest-rock-jack-hills-zircon.html

https://en.wikipedia.org/wiki/Giant-impact_hypothesis

17. Apr 27, 2018

### Timber

PS: I just lightly read the paper (yes, I should have read it carefully before responding). My reading is that their microplate-aggregation theory is not incompatible with mine, which only postulates that the bulk of the extant continents were original crust which survived the Theia-Gaia collision, however and whenever that happened. It is quite likely, almost a certainty, that a significant fraction of the surviving post-impact crust was shattered into rather small pieces, which would in time have aggregated into cratons regardless, if for no other reason than ocean-plate motion driven by rift orogeny. It is well known that most of the western US was assembled this way.
