Was the early earth radioactive?

  • #1
cdux
I've been reading about the (surprising) fact that we are uncertain whether there is nuclear fission at the center of the Earth (yet we know so much detail about structures at the other end of the Universe), and I wonder: was the Earth radioactive in its early stages? And if so, would that have affected abiogenesis in positive ways?
 
  • #2
The Earth was and still is radioactive. The reason the Earth remains so hot (internally) to this day is the heat input from the decay of potassium-40 and thorium-232 among other isotopes. I was not aware there was any uncertainty on this fact. What alternative sources of heat exist? I believe it is a fairly straightforward calculation to show that the residual heat from the potential energy released upon the Earth's formation over 4 billion years ago is not sufficient to explain the present temperature.

But yes, we would expect the Earth to have been more radioactive in the past, given the exponential nature of the decay process. That said, we are exposed to radiation constantly, even now, from cosmic-ray bombardment and isotopes in the environment.
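As a rough sanity check (a sketch only, using textbook half-lives, not a real heat budget: it ignores heat released per decay and elemental abundances), you can estimate how much more active each major heat-producing isotope was 4.5 billion years ago, since activity scales as 2^(t/T½):

```python
# Sketch: relative activity of the major heat-producing isotopes
# 4.5 Gyr ago versus today. Activity scales as 2**(t / half_life).
# Half-lives (Gyr) are standard textbook values.

HALF_LIVES_GYR = {
    "K-40": 1.25,
    "Th-232": 14.0,
    "U-238": 4.47,
    "U-235": 0.704,
}

def activity_factor(isotope: str, t_gyr: float = 4.5) -> float:
    """Ratio of an isotope's activity t_gyr ago to its activity today."""
    return 2.0 ** (t_gyr / HALF_LIVES_GYR[isotope])

for iso in HALF_LIVES_GYR:
    print(f"{iso}: {activity_factor(iso):.1f}x today's activity")
```

With these numbers, K-40 was roughly 12 times more active than today and U-235 roughly 80 times, so the early Earth's radiogenic heating was substantially higher even before worrying about abundances.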

So a couple of articles you ought to read are:
http://en.wikipedia.org/wiki/Geothermal_gradient
http://en.wikipedia.org/wiki/Background_radiation

As to what effect heightened levels of radiation would have had on the formation of life, I have no idea. Perhaps someone else can comment.
 
  • #3
cdux said:
I've been reading about the (surprising) fact that we are uncertain whether there is nuclear fission at the center of the Earth (yet we know so much detail about structures at the other end of the Universe), and I wonder: was the Earth radioactive in its early stages? And if so, would that have affected abiogenesis in positive ways?

The heat that drives the convection cells in the mantle of the Earth partly comes from radioactive elements in the core. Thus, the tectonic activity would not be as intense without radioactivity as it is with radioactivity.

The tectonic motion of the mantle is an important part of the “geological carbon cycle”. The carbon cycle keeps the temperature of the Earth within a range where there can be some liquid water at all times. The geological part of the carbon cycle works even when organisms are few in number.

The geological carbon cycle keeps the concentrations of carbon dioxide and water within a range where organisms can live. So by keeping the carbon cycle going, radioactive elements produce a positive outcome for life.

The Earth after the largest mass extinctions, and maybe the prebiotic Earth, didn't have enough organisms to maintain the biological part of the carbon cycle. So the geological carbon cycle is an important stop-gap that allows life to continue.

However, I don't see how direct exposure to nuclear radiation could have any "positive outcome" for organisms. Much nuclear radiation consists of massive particles traveling at high speeds. When a massive particle (alpha, neutron) hits a molecule, the atom it strikes can be ripped out of the molecule. So I don't think nuclear radiation can bring order to large molecules. I could be wrong, but I don't see how.


Here is a link about carbon cycles.
http://en.wikipedia.org/wiki/Carbon_cycle
“The geologic component of the carbon cycle operates slowly in comparison to the other parts of the global carbon cycle. It is one of the most important determinants of the amount of carbon in the atmosphere, and thus of global temperatures.”

The average concentration of radioactive isotopes in the crust, including uranium, was much higher billions of years ago. So the radioactivity in the Earth's crust was much greater billions of years ago.

The concentration of 235U was so high more than a billion years ago that natural fission reactors formed in a few localities near the Earth's surface. Ordinary light water acted as a moderator for the neutrons, and a chain reaction occurred that lasted hundreds of thousands of years.

I doubt that the natural fission reactor had any “positive outcomes” for life. However, I can’t be sure.


Here is a link to an article concerning “natural fission reactors”.

http://en.wikipedia.org/wiki/Natural_nuclear_fission_reactor
“Oklo is the only known location for this in the world and consists of 16 sites at which self-sustaining nuclear fission reactions took place approximately 1.7 billion years ago, and ran for a few hundred thousand years, averaging 100 kW of power output during that time.

A key factor that made the reaction possible was that, at the time the reactor went critical 1.7 billion years ago, the fissile isotope 235U made up about 3.1% of the natural uranium, which is comparable to the amount used in some of today's reactors. (The remaining 97% was non-fissile 238U.) Because 235U has a shorter half life than 238U, and thus decays more rapidly, the current abundance of 235U in natural uranium is about 0.7%. A natural nuclear reactor is therefore no longer possible on Earth without heavy water or graphite.”
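The 3.1% figure in that quote can be checked in a few lines: running the present-day 235U/238U ratio backwards using the two half-lives gives roughly 3% at 1.7 billion years ago (a sketch; half-lives are standard values, and the exact answer is sensitive to the assumed age and present abundance):

```python
import math

# Sketch: back-calculate the 235U fraction of natural uranium at the
# time the Oklo reactors went critical (~1.7 Gyr ago).
T_235 = 0.704   # 235U half-life, Gyr
T_238 = 4.468   # 238U half-life, Gyr

def u235_fraction(t_gyr_ago: float, frac_today: float = 0.0072) -> float:
    """235U atom fraction of (235U + 238U) at t_gyr_ago."""
    ratio_today = frac_today / (1.0 - frac_today)  # 235U/238U atom ratio now
    # Each isotope decays as exp(-ln2 * t / T); going back in time,
    # the shorter-lived 235U was relatively more abundant.
    ratio_then = ratio_today * math.exp(
        math.log(2) * t_gyr_ago * (1.0 / T_235 - 1.0 / T_238)
    )
    return ratio_then / (1.0 + ratio_then)

print(f"{u235_fraction(1.7):.1%}")  # → about 2.9%
```

About 2.9% with these inputs, in line with the quoted ~3.1% (the difference comes from rounding of the present abundance and the assumed age of the reactors).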
 
  • #4
I read about one natural reactor where the necessary concentration of ore was believed to have been achieved by geothermal processes. If so, it could recur.
 
  • #5
JesseC said:
What alternative sources of heat exist?

I missed this somehow, but the answer here is friction arising from the differences in precession rates between the mantle, the fluid outer core, and the solid inner core. But the quantification is rather difficult.

Research about this isn't really "hot", but check out Cardin and Vanyo and Dunn. Also Poincare comes to mind.
 
  • #6
There is also a slow release of heat from the ultra-slow solidification of the iron core, which gives off latent heat as it freezes. The inner part is solid, the outer part is liquid.
 
  • #7
pumila said:
I read about one natural reactor that the necessary concentration of ore was believed to have been achieved by geothermal processes. If so it can re-occur.
Not necessarily. If it was by geothermal processes, then it is far less likely today than 4 BYA.

The radioactive elements were made in stars in the final stages of their lives. They were spread by supernovae and concentrated by gravity. Once the radioactive elements condensed into the Earth, there was no new supply.

Once the Earth formed, the radioactive elements for the most part decayed spontaneously. There may have been some chain reactions in places where water (the moderator) and uranium came together. However, once an atom decayed, there was no replacement.

Ore formed from magma. The cooling magma may have concentrated some radioactive elements as it froze solid. However, the radioactive elements had to be in the magma to begin with, while it was still liquid.

As time went on, there were fewer and fewer radioactive atoms in the magma. So the chance of an ore body having a high enough concentration of radioactive elements for a chain reaction decreased.

Geothermal processes cannot create new radioactive elements; only stellar nucleosynthesis can. The concentrations of radioactive elements in the core and mantle have been decreasing with time, so the average concentration of these elements in ore has been decreasing as well.


I would never say never. Maybe there is a chance that a "natural reactor" could still occur. However, it is a lot less likely than it was 4 BYA. As we mine the surface of the Earth for uranium, it gets even less probable. Once we use up our reserves of uranium, it is very unlikely that similar concentrations of uranium ore will ever reappear.

To summarize: the concentration of uranium in magma has to have decreased since the time the Earth formed. The concentration has greatly decreased since 1 BYA, let alone 4 BYA. Therefore, the frequency of "natural uranium reactors" has to have greatly decreased. There may never be such natural reactors ever again.
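The summary above can be made quantitative. Under the (assumed, purely illustrative) premise that a light-water-moderated chain reaction needs at least about 1% 235U, one can solve the two half-lives for when natural uranium last crossed that threshold (a sketch, not a criticality calculation):

```python
import math

# Sketch: roughly when did Oklo-style natural reactors become
# impossible? Assumes (hypothetically) that ~1% 235U enrichment is
# the minimum for criticality with ordinary water as the moderator.
T_235, T_238 = 0.704, 4.468          # half-lives, Gyr
ratio_today = 0.0072 / (1 - 0.0072)  # 235U/238U atom ratio today

target = 0.01 / (1 - 0.01)           # 235U/238U ratio at 1% enrichment
rate = math.log(2) * (1 / T_235 - 1 / T_238)  # relative divergence, 1/Gyr
t_ago = math.log(target / ratio_today) / rate
print(f"~{t_ago:.1f} Gyr ago")       # → ~0.4 Gyr ago
```

With these numbers natural uranium dropped below 1% roughly 0.4 billion years ago, so light-water natural reactors have been impossible for a long time, consistent with the Wikipedia quote earlier in the thread.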
 
  • #9
Borek said:
I guess you refer to Oklo natural reactor - that was active around 1.7 BYA.

See http://en.wikipedia.org/wiki/Natural_nuclear_fission_reactor
Isn't that quite a possible way to start an earthquake? Extrapolating to the frequency of such deposits deeper into the Earth I wonder if it's the most common reason earthquakes exists outside regions of high tectonic activity.
 
  • #10
See what you mean, Darwin123. I was thinking of the Oklo (Gabon) mine reactors too. The geothermal process involved concentrated the uranium without, apparently, any differential concentration of isotopes. That worked 1.7 BYA but would not work now: today one would need differential concentration of isotopes, which seems highly unlikely, so sadly we are unlikely to see a natural reactor in operation.
 
  • #11
cdux said:
Isn't that quite a possible way to start an earthquake? Extrapolating to the frequency of such deposits deeper into the Earth I wonder if it's the most common reason earthquakes exists outside regions of high tectonic activity.

The idea doesn't make sense for several reasons.

These deposits were created by groundwater and required groundwater to operate. There is no groundwater at great depths.

The amount of energy produced in these reactors is orders of magnitude lower than the energy involved in earthquakes.

Besides, I have a feeling you are thinking in terms of explosion, while the reactor just heats up, and there is no way for it to explode at the pressures involved at large depths.

Not to mention the fact that such speculations and personal theories are not allowed at PF.
 
  • #12
Ok, I'll leave, jesus.
 
  • #13
Andre said:
I missed this somehow, but the answer here is friction arising from the differences in precession rates between the mantle, the fluid outer core, and the solid inner core. But the quantification is rather difficult.

Research about this isn't really "hot", but check out Cardin and Vanyo and Dunn. Also Poincare comes to mind.

Not a great source of heat. Viscous coupling between the core and the mantle is actually pretty weak.

Paul H. Roberts and Jonathan M. Aurnou. On the theory of core-mantle coupling. Geophys. Astrophys. Fluid Dyn., 106(2):157–230, 2012. doi: 10.1080/03091929.2011.589028.
 
  • #14
cdux said:
I've been reading about the (surprising) fact that we are uncertain whether there is nuclear fission at the center of the Earth (yet we know so much detail about structures at the other end of the Universe), and I wonder: was the Earth radioactive in its early stages? And if so, would that have affected abiogenesis in positive ways?

The radiation caused by nuclear fission probably would not have any positive effect. The high energy particles produced would non-selectively destroy molecules. The strong nuclear interaction and the electromagnetic interaction have no preference for chirality. So the radiation from nuclear fission probably would not help in any way.

You should first understand that spontaneous radioactive decay is a far more common phenomenon than nuclear fission. A fission chain reaction requires very stringent conditions; spontaneous decay is, well, spontaneous. Most of the energy given off by radioactive elements under natural conditions comes from spontaneous decay, and the heat at the center of the Earth may have a large contribution from it. But Great Krypton, wouldn't you expect it to be unstable?!

There would have been plenty of spontaneous radioactivity on the early Earth. Most radioactive elements cannot fission, but all radioactive elements can decay spontaneously. There would have been high levels of spontaneous radioactivity all over the Earth, some in the center and some at the surface. Most of this radioactivity couldn't help life develop.

There is a hypothesis that beta rays (high-energy electrons) could have caused enantiomeric mixtures to become slightly chiral. Beta rays may have a slight preference for breaking down molecules of one particular chirality because they are generated by the weak nuclear interaction, which has a slight asymmetry in parity.

Here is an article that mentions this hypothesis.

http://en.wikipedia.org/wiki/Homochirality
“Another speculation (the Vester-Ulbricht hypothesis) suggests that fundamental chirality of physical processes such as that of the beta decay (see Parity violation) leads to slightly different half-lives of biologically relevant molecules.”

Here is a link to the abstract of an article telling about experiments to simulate this effect.
Sorry about the pay wall.
http://lib.bioinfo.pl/paper:6442363
“A brief review is presented of the Vester-Ulbricht beta-decay Bremsstrahlen hypothesis for the origin of optical activity, and of subsequent experiments designed to test it. Certain of our experiments along these lines, begun in 1974 and involving the irradiation of racemic and optically active amino acids in a 61.7 KCi 90Sr-90Y Bremsstrahlen source, have now been completed and are described.”

One student put a description of his research on this blog. I don’t think he is done with the experiment, yet. However, at least it is free.
http://theastronomist.fieldofscience.com/2011/01/universe-and-life-is-asymmetric.html
“My thesis is that the origin of life is intimately tied to radioactivity. This gut feeling of mine derives from the unexplained origins of chemical asymmetry in all living systems. There is only one of the 4 fundamental forces which is asymmetric, that is the weak nuclear force. C.N. Yang won the Nobel Prize for discovering this. Gravity, electromagnetism and the strong nuclear force are all symmetric.
It has been shown that it is possible - although only on a very small scale - to induce chirality in molecules in the presence of beta decay.”
 

FAQ: Was the early earth radioactive?

What is radioactivity?

Radioactivity is the spontaneous decay of unstable atomic nuclei, resulting in the emission of radiation in the form of particles or electromagnetic waves.

Was the early earth radioactive?

Yes, the early earth was considerably more radioactive than it is today due to unstable elements such as uranium, thorium, and potassium. These elements were more abundant during the formation of the earth, and their decay contributed substantially to the heat and energy of the early earth.

How did radioactivity contribute to the formation of the early earth?

The decay of radioactive elements released a significant amount of heat, which helped to melt the early earth and differentiate it into layers. This process also contributed to the formation of the earth's magnetic field, which protected the planet from harmful solar radiation.

Did radioactivity affect the development of life on early earth?

Radioactivity likely played a role in the development of life on the early earth. The heat generated by radioactive decay helped maintain a warm environment and drove the geological activity that recycles carbon, and it may have provided some of the energy for the chemical reactions that led to complex molecules.

Is radioactivity still present on earth today?

Yes, radioactivity is still present on earth today, but in much smaller quantities compared to the early earth. The unstable elements have mostly decayed, but some radioactive elements, such as uranium and potassium, are still found in the earth's crust and contribute to the planet's heat and energy.
