My son's computer science teacher claims that there is no way to devise a computer algorithm that can generate a truly random sequence of numbers (only a pseudo-random sequence that ultimately repeats). Yet there are algorithms that, in a finite number of steps per digit, generate the decimal digits of irrational numbers such as pi, and because pi is irrational, its expansion never repeats. Statistical tests seem to show that these digits behave randomly (e.g., each digit appears to be statistically independent of all the previous ones), but can this be rigorously proved? If so, that would seem to contradict the teacher's claim. Is he wrong?
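To illustrate the teacher's point about repetition, here is a minimal sketch (my own example, not anything from the teacher) of a linear congruential generator, a classic pseudo-random algorithm. Because it has only finitely many internal states (here 256), it must eventually revisit a state and cycle; the parameters a=5, c=3, m=256 satisfy the Hull–Dobell conditions, so the cycle has the full length of 256.

```python
def lcg(seed, a=5, c=3, m=256):
    """Tiny linear congruential generator: x_{n+1} = (a*x_n + c) mod m."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

# Detect the period: with only m possible states, the sequence must
# eventually revisit a value and repeat from there.
gen = lcg(seed=1)
seen = {}
for i, x in enumerate(gen):
    if x in seen:
        period = i - seen[x]
        break
    seen[x] = i

print(period)  # full period: 256
```

A spigot-style algorithm for pi, by contrast, carries an ever-growing amount of state, which is how it escapes this finite-cycle argument; that difference is at the heart of the question.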