dmehling said:
I have a question that's related to radiometric dating. I would like to know what the condition would be of igneous rocks in their early states. What I mean is the state in which the radioactive elements found in the rocks, which are used for dating, have not yet undergone any decay. For example, a zircon containing uranium, in which none of the uranium had yet decayed into lead. Would these rocks be physically/chemically different in that early state compared to how they are today? Would they give off quite a bit more heat?
Well, there are different ways of seeing your question.
First of all, let's clarify a few things... A rock is made up of various minerals. Igneous rocks form when magma cools. Magma formed by melting of the Earth's mantle is about 1200 C, and if it doesn't erupt, its composition changes as it cools: certain minerals crystallize at higher temperatures, others as the temperature drops, so the composition of the remaining magma keeps changing during cooling. So yes, igneous rocks that form by eruption or solidification of magma that has not cooled much look very different from rocks that form from magma that has cooled for a while. Google and compare "basalt" (the composition of 1200 C magma formed by melting of the mantle) with "rhyolite" or "granite" (the composition of cooler magma, about 800 - 600 C).
Zircon does not form until the magma is cooler. The lowest temperatures at which magma can exist are about 600 C.
Now would a ZIRCON crystal (a mineral, not a rock) be chemically different at the start of the U decay process? Yes. The uranium content would be higher. At the end of decay, there would be less U because the unstable U would have decayed away.
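To put a rough number on how slowly that U disappears, here is a minimal sketch of simple exponential decay. The half-life value (~4.47 billion years, for U-238) is an assumption I'm adding for illustration; the function and numbers are illustrative, not measurements from any real crystal.

```python
# Fraction of a parent isotope left after t years, assuming simple
# exponential decay: N(t) = N0 * (1/2)^(t / half_life).
HALF_LIFE_U238 = 4.47e9  # years (assumed value for U-238)

def fraction_remaining(t_years, half_life=HALF_LIFE_U238):
    """Fraction of the original parent isotope still undecayed after t_years."""
    return 0.5 ** (t_years / half_life)

# After one half-life, half of the original U-238 is left:
print(fraction_remaining(4.47e9))  # -> 0.5
# Over the whole age of the Earth (~4.5 billion years), a zircon that old
# would still hold a bit under half of its original U-238:
print(fraction_remaining(4.5e9))
```

This is why "the end of decay" (all unstable U gone) lies vastly further in the future than any zircon's current age.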
But I think what you're really asking is whether there would be a BIG difference in the crystal. There would probably not be. At the end of decay, the crystal would look pretty much the same as at the start. The reason is that U doesn't usually make up MUCH of the crystal. The U substitutes for Zr, and there is usually much more Zr than U in the Earth's crust (and therefore in zircon crystals, when they form). Under a very high-resolution microscope, the crystal structure might look a bit messed up at the end of decay compared to when the crystal first formed. But to your naked eye, the crystal would look about the same at the start of decay and at the end (end = when all unstable U has decayed away).
Finally, regarding heat output... The decay rate per atom (the decay constant) of unstable U (or any other unstable isotope) is, as far as we know, constant. So the heat output tracks the amount of U left to decay: as the U is used up (which, for U, takes a very long time), the heat output drops. But U is a special case, because it decays through a complicated chain. For example, it passes through Th and other intermediates before ending up as Pb. So the heat output would depend on those intermediate decay constants too. Most other isotopes used in dating decay to just one daughter element.
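The "heat tracks the amount of U left" idea can be sketched in a couple of lines. This ignores U's decay chain entirely and treats a single isotope, so it's a simplification of the real case; the half-life is the same assumed U-238 value as above.

```python
import math

# For a single isotope, heat output is proportional to activity
# (decays per unit time), which is lambda * N(t) and so falls off
# exponentially. Daughters in U's real decay chain add extra heat
# that this sketch deliberately leaves out.
HALF_LIFE_U238 = 4.47e9                     # years (assumed)
DECAY_CONST = math.log(2) / HALF_LIFE_U238  # per year

def relative_heat(t_years):
    """Heat output at time t relative to the heat output at t = 0."""
    # activity(t) / activity(0) = exp(-lambda * t); the lambda * N0
    # factors cancel, so no absolute abundance is needed.
    return math.exp(-DECAY_CONST * t_years)

print(relative_heat(0))               # -> 1.0
print(relative_heat(HALF_LIFE_U238))  # about 0.5: half the heat after one half-life
```

So a freshly formed, U-bearing crystal does put out more heat than an old one, but for U the drop-off is spread over billions of years, not something you'd notice on human timescales.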
Let me know if anything is unclear.