The first half-life of radioactive carbon (14C) to gain some consensus is the Libby value of 5568 years, which is still used by convention (see Wikipedia). The latter is very fortunate, because calibration curves such as the current INTCAL04 are independent of the half-life when it comes to determining age. Calibrated datings are therefore 'correct', albeit with considerable margins.

What is not correct, however, if the half-life is wrong, is the presumed original concentration of radiocarbon (delta14C) in the atmosphere: in the calibration table that concentration is assumed to increase strongly with age. This could be partly explained by differences in atmospheric CO2 concentration, the speed of the carbon cycle, and the production rate of 14C as a function of cosmic radiation. But many of those changes don't make sense, and up until now there has been little if any serious attempt to explain the strong variation in assumed 14C ratios. Could it be that the half-life is wrong instead, and that this accumulation of 14C is actually non-existent? Such a simple explanation I did not even dare to consider myself, expecting that determining half-lives involves advanced, well-established physics, way above my comprehension. However: http://radiocarbon.ldeo.columbia.edu/pubs/2006Chiu.pdf

Consequently, the most suitable half-life for radiocarbon should be about 6030 years. I do wonder whether such a rebellion has even a remote chance of being accepted in the most conservative environment ever.
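The arithmetic behind this objection can be sketched numerically. Below is a minimal Python illustration (function names are mine), assuming nothing beyond pure exponential decay and using the 5568-year and 6030-year values discussed above. It shows that the same measured 14C fraction yields different ages under the two half-lives, and that a sample of known calendar age forces a higher inferred past atmospheric 14C concentration when the shorter Libby half-life is assumed:

```python
import math

def age_from_ratio(ratio, half_life):
    """Radiocarbon age (years) from the surviving 14C fraction:
    t = (T_half / ln 2) * ln(1 / ratio)."""
    return half_life / math.log(2) * math.log(1.0 / ratio)

def implied_initial_ratio(measured_ratio, true_age, half_life):
    """Initial 14C ratio (relative to today) implied by a sample of
    known calendar age: N0/N_ref = measured * 2**(t / T_half)."""
    return measured_ratio * 2.0 ** (true_age / half_life)

libby, alt = 5568.0, 6030.0  # Libby value vs. the ~6030 yr value argued above

# A sample retaining 25% of its 14C is exactly two half-lives old:
print(age_from_ratio(0.25, libby))  # 11136.0 years
print(age_from_ratio(0.25, alt))    # 12060.0 years

# Suppose the true half-life were 6030 yr; a 30,000-year-old sample
# would then retain this fraction of its 14C:
measured = 2.0 ** (-30000.0 / alt)

# Interpreting that same measurement with the shorter Libby half-life
# forces the conclusion that the atmosphere once held ~33% MORE 14C
# than today -- the apparent 'accumulation' questioned in the text:
print(implied_initial_ratio(measured, 30000.0, libby))  # ~1.33
print(implied_initial_ratio(measured, 30000.0, alt))    # 1.0, by construction
```

This is only a back-of-the-envelope sketch: real calibration curves fold in tree-ring and coral data, not just decay arithmetic, but it illustrates how a half-life error masquerades as a trend in past 14C concentration.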