"There's no requirement that a random variable has any finite moments. It just means that care needs to be taken when applying the usual theorems. For example, the mean of any finite sample is finite but since the law of large numbers no longer applies the mean does not converge to a finite value as the sample size increases. In fact, the average of 1000 Cauchy variables has the same distribution as a single Cauchy variable!"
There is a slight risk that, if this is read too quickly, a confusion of terms and ideas will result.
1) If you take a sample of any size from a Cauchy distribution, you can calculate \bar x, so a sample mean is always defined
2) The convergence issue means this: since a Cauchy distribution does not have a mean, it does not have any higher-order moments either; in particular, it has no variance. The lack of a population mean means that
a) The common statement that \bar X \to \mu in probability does not apply
b) The CLT result, that the asymptotic distribution of the sample mean is a particular normal distribution, does not apply
c) It can be shown that, mathematically, if X_1, X_2, \dots, X_n are i.i.d. Cauchy, the distribution of \bar X is still that same Cauchy distribution - not, as we are used to seeing, approximately normal
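For the skeptical, here is a minimal simulation sketch of points a) and c) (in Python, assuming NumPy and SciPy are available; the seed and the sample sizes are arbitrary choices made purely for illustration): running means of Cauchy draws never settle down, and the mean of 1000 Cauchy draws is statistically indistinguishable from a single standard Cauchy draw.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)  # seed chosen arbitrarily, for reproducibility

# (a) Running means of a long stream of standard Cauchy draws: unlike the
#     normal case, they never settle down near any value.
x = rng.standard_cauchy(100_000)
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)
print("running mean at n = 100, 1000, 10000, 100000:",
      running_mean[[99, 999, 9_999, 99_999]])

# (c) The mean of n i.i.d. standard Cauchy draws is again standard Cauchy.
#     Compare 10,000 such sample means (each based on n = 1000 draws) to the
#     standard Cauchy distribution with a Kolmogorov-Smirnov test.
means = rng.standard_cauchy((10_000, 1_000)).mean(axis=1)
ks = stats.kstest(means, stats.cauchy.cdf)
print(f"KS vs. standard Cauchy: D = {ks.statistic:.4f}, p = {ks.pvalue:.3f}")
# A large p-value: no evidence that the sample means are anything but Cauchy.
```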
bpet's initial comment that "there is no requirement that a random variable has any finite moments; it just means that care needs to be taken when applying the usual theorems" is spot on.
As another odd example, consider a t distribution with n degrees of freedom. For any particular n > 2 the distribution has moments of order k < n but no others.
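A similar sketch (again Python with NumPy; taking 3 degrees of freedom and these particular sample sizes is purely illustrative) shows the effect: sample averages of x^2 settle near the true variance 3/(3-2) = 3, while sample averages of x^4 never stabilise, because the fourth moment is infinite.

```python
import numpy as np

rng = np.random.default_rng(1)  # arbitrary seed
df = 3  # t with 3 degrees of freedom: moments of order k < 3 exist, k >= 3 do not

for n in (10**3, 10**4, 10**5, 10**6):
    x = rng.standard_t(df, size=n)
    m2 = np.mean(x**2)  # estimates the true second moment df/(df-2) = 3
    m4 = np.mean(x**4)  # "estimates" a fourth moment that does not exist
    print(f"n = {n:>7}:  mean(x^2) = {m2:7.2f}   mean(x^4) = {m4:14.1f}")
# mean(x^2) settles near 3 as n grows; mean(x^4) keeps jumping around wildly.
```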