Carbon dating is used for organic material. The trick is that a living organism constantly replenishes its supply of C-14, so the ratio of C-14 to the other carbon in the organism matches the naturally occurring ratio (about 1 atom in 1 trillion is C-14). Once the organism dies, though, the C-14 is no longer replenished, and the ratio of C-14 to non-radioactive carbon decreases as the C-14 decays. So, by measuring that ratio, you can tell how long ago something died. However, C-14 dating can't go back very far, because the half-life of C-14 is only about 5,730 years. The practical limit is around 10 half-lives or so, so it can't be used to reliably date samples older than roughly 57,000 years.
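To make the arithmetic concrete, here's a rough sketch of the age calculation in Python. It ignores everything a real lab has to worry about (calibration curves, atmospheric C-14 variation over time, measurement error) and just inverts the exponential decay law; the function name and half-life constant are my own choices, not a standard API:

```python
import math

C14_HALF_LIFE = 5730.0  # years (approximate)

def c14_age(fraction_remaining):
    """Years since death, given the measured C-14 ratio in the sample
    divided by the natural ratio in a living organism."""
    # N(t) = N0 * (1/2) ** (t / half_life)
    # => t = -half_life * log2(N(t) / N0)
    return -C14_HALF_LIFE * math.log2(fraction_remaining)

print(c14_age(0.5))    # one half-life -> 5730.0
print(c14_age(0.25))   # two half-lives -> 11460.0
```

Note how quickly the signal fades: after ten half-lives only about 0.1% of the original C-14 remains, which is why the method tops out around 57,000 years.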
Samples of rock are typically dated using other radioisotope ratios (e.g. Ar-40/Ar-39 dating), but as far as I know, each of those requires a similar assumption about when the "clock" for the decay started. With argon dating, it has to do with the cooling of molten rock: while the rock is liquid, argon escapes and exchanges freely with the atmosphere, so radiogenic argon doesn't accumulate in the rock. As the rock cools, it passes through something called the closure temperature. Below that temperature the exchange stops, the argon produced by radioactive decay is trapped in the rock, and that is the point where you start counting.
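The same decay arithmetic works here, just run in reverse: instead of watching the parent disappear, you count the daughter (Ar-40) that has accumulated since closure. Here's an idealized sketch of the simple K-Ar version (real Ar-40/Ar-39 work measures Ar-39 produced by neutron irradiation as a proxy for potassium); the constants are approximate and the function name is mine:

```python
import math

K40_HALF_LIFE = 1.25e9  # years (approximate)
AR_BRANCH = 0.105       # approx. fraction of K-40 decays that yield Ar-40
                        # (the rest decay to Ca-40)

LAMBDA = math.log(2) / K40_HALF_LIFE  # total decay constant of K-40

def k_ar_age(ar40_per_k40):
    """Years since the rock cooled through its closure temperature,
    given the measured radiogenic Ar-40 / K-40 atom ratio.
    Assumes all trapped Ar-40 accumulated after closure (ratio = 0 at t = 0)."""
    # Ar40*(t) = AR_BRANCH * K40(t) * (exp(LAMBDA * t) - 1)
    # => t = (1 / LAMBDA) * ln(1 + (Ar40 / K40) / AR_BRANCH)
    return (1.0 / LAMBDA) * math.log(1.0 + ar40_per_k40 / AR_BRANCH)

print(k_ar_age(0.0))    # freshly closed rock -> age 0
print(k_ar_age(0.105))  # ratio after exactly one K-40 half-life -> 1.25e9
```

Because the K-40 half-life is so long, this clock runs in the opposite regime from C-14: it is useless for young material but works for rocks billions of years old.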