
Conditions of igneous rocks on the early Earth

  1. Dec 30, 2011 #1
    I have a question that's related to radiometric dating. I would like to know what condition igneous rocks would be in during their early state, that is, the state in which the radioactive elements found in the rocks, which are used for dating, have not yet undergone any decay. For example, a zircon containing uranium in which none of the uranium had yet decayed into lead. Would these rocks be physically/chemically different in that early state compared to how they are today? Would they give off quite a bit more heat?
  3. Dec 31, 2011 #2
    If there is no decay, why do you think they would give off more heat?
  4. Dec 31, 2011 #3
    Okay, maybe not heat. But what about radiation? Would these rocks emit significantly more radiation if there was a much greater percentage of the original unstable isotope?
  5. Jan 2, 2012 #4
    Well, there are different ways of seeing your question.
    First of all, let's clarify a few things... A rock is made up of various minerals, and igneous rocks form when magma cools. Magma formed by melting of the Earth's mantle starts at about 1200 C, and if it doesn't erupt, its composition changes as it cools: certain minerals crystallize at higher temperatures, others as the temperature drops, so the remaining melt changes composition during cooling. So yes, igneous rocks look very different when they form by eruption or solidification of magma that has not cooled much, compared to when they form from magma that has cooled for a while. Google and compare "basalt" (the composition of ~1200 C magma formed by melting of the mantle) and "rhyolite" or "granite" (the composition of cooler magma, about 800 - 600 C).

    Zircon does not form until the magma is cooler. The lowest temperatures at which magma can exist are about 600 C.

    Now would a ZIRCON crystal (a mineral, not a rock) be chemically different at the start of the U decay process? Yes. The uranium content would be higher. At the end of decay, there would be less U because the unstable U would have decayed away.

    But I think what you're really asking is whether there would be a BIG difference in the crystal. Probably not. At the end of decay, the crystal would look pretty much the same as at the start. The reason is that U doesn't usually make up MUCH of the crystal. The U substitutes for Zr, and there is usually much more Zr than U in the Earth's crust (and therefore in zircon crystals, when they form). Under a very high-resolution microscope, the crystal structure might look a bit damaged at the end of decay compared to when the crystal first formed. But to your naked eye, the crystal would look about the same at the start of decay and at the end (end = when all unstable U has decayed away).

    Finally, regarding heat output... What is constant for U (or any other unstable isotope) is the probability that a given atom decays per unit time (the decay constant). So the heat output is proportional to the number of atoms decaying per unit time, which in turn tracks the amount of U present. Once the U had been used up (which would take a very long time for U), the heat output would be lower. But U is a special case, because it decays through a complicated chain: 238U passes through Th and other intermediate isotopes before ending at Pb. So the heat output depends on those intermediate steps too. Most other isotopes used in dating decay to their daughter in a single step.
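    Just to put a rough number on that, here is a back-of-the-envelope sketch in Python. The half-life and the ~48 MeV of chain energy retained as heat are textbook values I'm assuming, not anything measured here:

    Code (Python):
    import math

    # Rough heat output of 1 kg of pure 238U (assumed textbook constants).
    HALF_LIFE_S = 4.468e9 * 3.156e7             # 238U half-life, in seconds
    ATOMS_PER_KG = 1000.0 / 238.05 * 6.022e23   # 238U atoms in 1 kg
    HEAT_PER_CHAIN_J = 47.7 * 1.602e-13         # ~47.7 MeV kept as heat per full chain

    decay_const = math.log(2) / HALF_LIFE_S     # decay probability per atom per second
    activity = decay_const * ATOMS_PER_KG       # decays per second (Bq)
    print(activity * HEAT_PER_CHAIN_J)          # ~1e-4 W per kg of 238U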

    Let me know if anything is unclear.
  6. Jan 2, 2012 #5
    Thanks for the detailed and very helpful explanation. I have another question, which I really should have asked from the beginning. If the uranium content of zircons contained within igneous rocks was much higher on the early Earth, wouldn't this mean that there were much higher levels of radiation? If so, how much radiation in comparison to present levels in the environment?

    A second and probably more involved question is, how did the uranium get trapped within zircons as well as other minerals where it is commonly found? Also, when did the uranium begin to undergo radioactive decay? Would it not have begun decaying the moment it was formed in a supernova? If so, then that would seem to throw off the calculations for the ages of the rocks that the uranium would later get trapped inside.
  7. Jan 2, 2012 #6
    I am about to go and pick someone up, so I may not do your question justice. Make sure to ask if anything needs further clarification and I'll get back to you tomorrow...

    As far as your question about the U content of zircons on the early Earth: in my opinion, the U content probably wouldn't have been much different from what it is in zircons formed today. There are two reasons, and both can be summed up as: there is no shortage of U for zircon crystals today.

    1) U SUBSTITUTES for Zr in zircon crystals. Also, zircon is not a ubiquitous mineral; it is only ever present in small quantities, mostly in granite-like rocks. So U is not a required or important (by mass) part of zircons. So what happens to U if there are no zircon crystals around to take it in? It goes into other (rarer) minerals/melts. U is even a main component in some.

    Just to clarify, Zr is the element, zircon is the mineral. The formula for zircon (the mineral) is ZrSiO4. Notice that the formula does not mention U. This is because U is only present in trace amounts, when it substitutes for Zr.

    2) A special thing about U compared to some other isotopes used for dating in geology is that it mostly takes a long time to decay. Let me clarify.

    There are three naturally occurring U isotopes. The main one (over 99% of all U found in nature) is 238U. Its half-life, about 4.5 billion years, is roughly the same as the age of the Earth. If you like, we can talk more about what that means later. For now, it basically means that most U (because most of it is 238U) takes a very long time to decay, so the amount of U hasn't changed a huge amount since the Earth was formed.
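    To put a number on "hasn't changed a huge amount" (a quick sketch; the round textbook figures are my assumption):

    Code (Python):
    # Fraction of the original 238U still around today.
    age_of_earth = 4.54e9     # years (assumed)
    half_life_238 = 4.468e9   # years (assumed)
    fraction_left = 0.5 ** (age_of_earth / half_life_238)
    print(fraction_left)      # ~0.49 -- about half the original 238U remains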


    Your other question is related to a very important issue in isotope geology: how do we know when our clocks started? Zircon crystals do not readily accept the daughter product of U decay (Pb) into their structure. So we assume--based on many observations of zircon crystals in the laboratory--that when zircons crystallize, they contain ONLY the parent isotope (some form of U) and no Pb. If you are interested in the mathematical details, we can talk about those tomorrow or whenever.

    Another detail you might be interested in is that because there is more than one unstable U isotope, we can double-check ages. For example, 238U decays to 206Pb, and, at a different rate, 235U decays to 207Pb (half-life about 700 million years).
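    As a preview of those mathematical details: if a zircon starts with zero Pb, the age follows from t = ln(1 + Pb/U) / lambda for each decay system. A sketch with hypothetical (made-up) measured ratios, chosen here so the two systems agree:

    Code (Python):
    import math

    def age_from_ratio(daughter_over_parent, half_life_yr):
        # t = ln(1 + D/P) / lambda, assuming zero daughter at crystallization
        lam = math.log(2) / half_life_yr
        return math.log(1.0 + daughter_over_parent) / lam

    # Hypothetical measured ratios for one zircon (made-up numbers):
    t1 = age_from_ratio(0.497, 4.468e9)   # 206Pb/238U system
    t2 = age_from_ratio(11.94, 7.04e8)    # 207Pb/235U system
    print(t1 / 1e9, t2 / 1e9)             # both ~2.6 billion yrs: the two clocks agree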


    As far as radiation goes, there probably would have been some more, but not a lot more, radiation than there is right now, because of U's long half-life (i.e., its small decay constant).
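    To make "not a lot more" concrete for 238U, the dominant isotope (the rarer, faster-decaying 235U would push the total up somewhat):

    Code (Python):
    # Activity scales with the number of atoms present, so early-Earth
    # 238U radiation relative to today is roughly this factor:
    factor = 2.0 ** (4.54e9 / 4.468e9)   # assumed textbook ages, in years
    print(factor)                        # ~2x today's 238U activity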


    You asked how U gets into the crystal. Like other elements that make up the crystal, it forms part of the crystal structure. Imagine ice forming from H2O liquid. As the temperature drops, ice becomes the stable form of H2O and the H and O arrange themselves into a solid lattice. It's the same with the components of zircon.


    In terms of heat loss, the Earth has cooled by conduction through its outer hard shell and convection within the mantle and core. Some heat is produced through radioactive decay of U, Th, and K isotopes. But the Earth was hotter when it first formed.
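    For scale, here is a sketch of present-day radiogenic heating using representative textbook heat-production rates and rough bulk-Earth concentrations (all assumed values on my part, not from this thread):

    Code (Python):
    # Radiogenic heat of mantle-like rock (all constants are rough, assumed values).
    H = {"U": 9.8e-5, "Th": 2.6e-5, "K": 3.5e-9}   # W per kg of each element
    conc = {"U": 20e-9, "Th": 80e-9, "K": 240e-6}  # kg of element per kg of rock
    heat = sum(H[e] * conc[e] for e in H)          # W per kg of rock
    print(heat)   # ~5e-12 W/kg; over a ~4e24 kg mantle that is ~20 terawatts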
  8. Jan 9, 2012 #7
    Thanks for the answers. What I would really like to understand at this point is how we know when the uranium and other radioactive elements began to decay? If uranium and other heavy radioactive elements were created in a supernova, wouldn't they begin decaying immediately after their creation? If not, then when would they begin to decay? I don't know much about nuclear physics so I'm not sure if radioactive elements are always decaying, or if that process is dependent on certain conditions.
  9. Jan 10, 2012 #8
    Of course the uranium etc started decaying the moment it was formed.
    The point is that the daughter products of decay are not original constituents of the uranium-containing mineral we now see.
    So the mineral crystal must have formed from uranium that had not yet decayed at that time.
    Once formed, no further uranium can be added to or removed from the crystal by any process other than radioactive decay.
    So the crystal starts off life with a known amount of uranium.
    The crystal cannot form from the products of uranium decay or it would be a different mineral.
    The uranium within the crystal then decays in the normal manner producing products.
    The fate of these products has already been discussed by others, but we have to assume they are not leached out or transported away from the mineral by other means, i.e. they stay within the mineral but are no longer active participants in the crystal structure.
    At any time later we can measure the ratio of the uranium to the decay products now knocking about in a sample and from the known rate of decay deduce the time that has elapsed since the crystal was formed.
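    A made-up worked example of that last step (the numbers are hypothetical):

    Code (Python):
    import math

    # Suppose we measure 1 atom of 206Pb for every 10 atoms of 238U remaining.
    pb_over_u = 0.1
    lam = math.log(2) / 4.468e9             # 238U decay constant, per year (assumed)
    age = math.log(1.0 + pb_over_u) / lam   # time since the crystal formed
    print(age)                              # ~6.1e8 -> the crystal is ~610 million yrs old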

    Does this help?