Quote by Hurkyl: I wouldn't be so sure -- remember that, there, the author was talking about constructive analysis, which doesn't have many of the nice properties we're used to having. There, a real number is usually defined in a way similar to this:

A real number is a (computable) function f that takes an integer n and returns a fraction r, satisfying the property that |f(n) - f(m)| < 2^(-n) + 2^(-m). (To connect with the "usual" model of the reals, (f(n)) is a Cauchy sequence.)

We could define a "decimal number" in a similar fashion -- it takes an integer n and returns a decimal digit, and satisfies the property that there exists a bound M such that n > M implies f(n) = 0. (To connect with the "usual" model, f(n) would be the n-th place in a decimal number.)

And I believe that, in fact, there does not exist a (computable) function that takes a real number as input and returns a decimal number as output.
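To make the quoted definition concrete, here's a minimal sketch of one such constructive real. The function name `sqrt2` and the bisection method are my own choices for illustration, not anything from the quote; the point is only that the function satisfies Hurkyl's Cauchy-style condition |f(n) - f(m)| < 2^(-n) + 2^(-m):

```python
from fractions import Fraction

# A constructive real in the quoted sense: a function taking an integer n
# and returning a fraction within 2^(-n) of the real it represents.
# Hypothetical example: sqrt(2), approximated by interval bisection.
def sqrt2(n):
    lo, hi = Fraction(1), Fraction(2)
    # Bisect [1, 2] until the interval is narrower than 2^(-n); the
    # midpoint is then within 2^(-n) of sqrt(2).
    while hi - lo >= Fraction(1, 2**n):
        mid = (lo + hi) / 2
        if mid * mid < 2:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Since each f(n) is within 2^(-n) of sqrt(2), the triangle inequality
# gives |f(n) - f(m)| < 2^(-n) + 2^(-m), as the definition requires.
print(float(sqrt2(10)))  # approximately 1.4142
```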
So you're saying that:
1. There's no computable function that converts arbitrary binary expansions to decimal expansions
2. Turing proved this
3. When the article wrote that Turing proved no computable function returns the decimal expansion of a real, it meant (1) and (2)?