So you're saying that:
1. There's no computable function that converts arbitrary binary expansions to decimal expansions
2. Turing proved this
3. When the article wrote that Turing proved no computable function returns the decimal expansion of a real, it meant (1) and (2)
Alright, perhaps. Let me think about this.
Given a finite prefix of a binary expansion* summing to N, where the last digit's (binary) exponent is n, the number lies in the closed interval [N, N + 2^n], since a tail of repeating 1s is allowed. For no computable function converting this to decimal form to exist, it must be that for almost all natural numbers k and for every computable f, there are reals x and y that differ in the kth decimal place yet agree in their first f(k) binary places. Right?
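To make that obstruction concrete, here's a minimal Python sketch I put together (the helper names are mine, not anything from the article or from Turing): it tracks the closed interval [N, N + 2^n] pinned down by a binary prefix and lists which first decimal digits are still possible.

```python
from fractions import Fraction

def prefix_interval(bits):
    # Fractional reals whose binary expansion starts with `bits` form the
    # closed interval [N, N + 2^n] with n = -len(bits); closed on the right
    # because a tail of repeating 1s is allowed.
    lo = sum(Fraction(b, 2 ** (i + 1)) for i, b in enumerate(bits))
    return lo, lo + Fraction(1, 2 ** len(bits))

def candidate_first_decimal_digits(bits):
    # Every digit d such that some x in that interval has floor(10*x) = d.
    lo, hi = prefix_interval(bits)
    digits = []
    d = int(10 * lo)  # exact floor, since lo is a non-negative Fraction
    while Fraction(d, 10) <= hi:
        digits.append(d)
        d += 1
    return digits

# Feed in longer and longer prefixes of binary 0.0111...:
for k in (1, 4, 8, 16):
    print(k, candidate_first_decimal_digits([0] + [1] * k))
# 1 [2, 3, 4, 5]
# 4 [4, 5]
# 8 [4, 5]
# 16 [4, 5]
```

However many 1s are read, digits 4 and 5 both stay live: the number could still turn out to be exactly 1/2 (first decimal digit 5) or fall just short of it (first decimal digit 4), so no finite prefix lets a converter commit to even its first output digit.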
I'll need more time to think about this.
* In which every digit is either 0 or 1, else the number could change arbitrarily with each additional bit.