My knowledge is very limited ... both in general mathematical background (probably less than a math minor) and in this specific topic ... my approach is almost entirely philosophical + (a rough understanding of the bare minimum). So you might want to take this as only a (very, very) rough guide (because this is a sub-branch within an already specialised topic, "proof theory").
Basically you can think of it in revisionist terms. The five systems (in the table in the wiki link, namely RCA₀, WKL₀, ACA₀, ATR₀ and Π¹₁-CA₀) are increasing in "strength". Very roughly, the idea is to define systems of various "strengths" and see how much of traditional math can be "derived" within each of them.
The first system (RCA₀) can "roughly" be thought of as an emulation of "computable math". But you might want to take this with certain qualifications, because (i) the term "computable math" itself has some significant variations, and (ii) RCA₀ uses classical logic, so the law of excluded middle (LEM) is available in it, unlike in constructive treatments (this is also mentioned in the wiki article).
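For reference (as I understand it, e.g. from the wiki article), RCA₀ consists of basic arithmetic axioms together with Σ⁰₁ induction and the Δ⁰₁ ("recursive") comprehension scheme, which roughly reads

$$\forall n\,\big(\varphi(n) \leftrightarrow \psi(n)\big) \;\rightarrow\; \exists X\,\forall n\,\big(n \in X \leftrightarrow \varphi(n)\big)$$

where φ is Σ⁰₁, ψ is Π⁰₁, and X does not occur free in φ. So, very loosely, only sets that are computable relative to the given data are asserted to exist.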
The third system (ACA₀) roughly seems to correspond to being able to use real numbers (and, in a sense, perhaps computations using them?) that can be suitably described by subsets of ℕ in the arithmetical hierarchy (AH).
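Again for reference (and again as I understand it), the characteristic axiom of ACA₀ is the arithmetical comprehension scheme: for every arithmetical formula φ (no set quantifiers, X not free in φ),

$$\exists X\,\forall n\,\big(n \in X \leftrightarrow \varphi(n)\big)$$

so every level of the arithmetical hierarchy (relative to the parameters) becomes available as a set.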
I think that one of the issues that arises with various reasonable "computable representations" of real numbers is that a lot of reasonable questions we might ask about them may themselves not be computable. So "computable reals" are quite sensitive to the choice of "representation" (and, for technical reasons, the decimal expansion doesn't seem to be very favoured).
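To illustrate one such technical reason (this is just my own toy illustration, not anything from the linked sources): even addition is not computable digit-by-digit from decimal expansions, because of the "carry problem". A minimal Python check, taking x = 1/3 and y = 2/3 as the assumed example:

```python
# Toy illustration of the "carry problem" for decimal representations:
# from any finite prefix of the decimal expansions of x = 0.333... and
# y = 0.666..., the leading digits of x + y are still undetermined.
from fractions import Fraction

def prefix_value(digits):
    """Exact value of the decimal prefix 0.d1 d2 ... dk as a rational."""
    return sum(Fraction(d, 10 ** (i + 1)) for i, d in enumerate(digits))

for k in (1, 5, 20):
    x_lo = prefix_value([3] * k)     # smallest real consistent with x's prefix
    y_lo = prefix_value([6] * k)     # smallest real consistent with y's prefix
    tail = Fraction(1, 10 ** k)      # maximal contribution of the unseen digits
    s_lo, s_hi = x_lo + y_lo, x_lo + y_lo + 2 * tail
    # the interval of possible sums always straddles 1, so the integer part
    # (hence the first output digit) can never be decided from the prefix
    print(k, float(s_lo), float(s_hi), s_lo < 1 < s_hi)
```

No matter how many digits we read, we can never commit to whether the output starts with 0.999... or 1.000..., which is roughly why representations based on, say, rational Cauchy sequences with explicit moduli tend to be preferred.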
In particular (as a very specific example), if you consider the "pseudo-completeness" property which I described in this question (
https://mathoverflow.net/questions/293351), I think it fails if one uses computable decimal representations (
https://en.wikipedia.org/wiki/Specker_sequence). Incidentally, I asked the question just a few days before your posting.
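For what it's worth, here is a minimal sketch (my own, only to show the shape of the construction behind the Specker-sequence idea): take a computably enumerable but non-computable set A, and let xₙ be the partial sum of 2^{-(i+1)} over the elements of A enumerated within n stages. The sequence is computable, increasing and bounded, but its supremum is not a computable real. The enumeration below is only a computable toy stand-in (a real example would use something like the halting set), so it does not reproduce the non-computability, only the form:

```python
# Sketch of a Specker-style sequence x_n = sum_{i in A_n} 2^{-(i+1)},
# where A_n is the finite part of a c.e. set A enumerated by stage n.
# `enum_by_stage` is a computable toy placeholder for such an enumeration;
# with a genuinely non-computable c.e. set A (e.g. the halting set) the
# limit sup_n x_n fails to be a computable real.
from fractions import Fraction

def enum_by_stage(n):
    """Toy stand-in: the indices 'enumerated into A' within n stages."""
    return {i for i in range(n) if i % 3 == 1}  # arbitrary toy rule

def specker_term(n):
    return sum(Fraction(1, 2 ** (i + 1)) for i in enum_by_stage(n))

xs = [specker_term(n) for n in range(12)]
assert all(a <= b for a, b in zip(xs, xs[1:]))  # monotonically increasing
assert all(x < 1 for x in xs)                   # bounded above by 1
print([float(x) for x in xs])
```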
P.S.
It should be mentioned that revisionist approaches are not unique. I find the answer by Prof. Nik quite interesting, because vaguely similar ideas had been (passively) in my mind for a number of months (though probably less than a year) ... and I was wondering whether any attempts (both very concrete and at least somewhat comprehensive) had been made in this direction (my guess would have been "no").
Interestingly, I searched a bit and some fairly long discussions on the FOM ("foundations of mathematics") mailing list show up (
https://cs.nyu.edu/pipermail/fom/2006-February). Note that this is over a decade old though. Roughly, the two basic differing points of view are whether there is a "philosophically coherent or distinguished position" (specifically predicative math in this discussion), or whether there is no in-between "stopping point" between very weak ideas (strict constructivism, or perhaps even weaker) and very strong ones (modern set theory).
There are good arguments on both sides, but I should say that I simply have a hard time relating (in the sense of "philosophical coherence" ... not in an interesting, motivated or probabilistic sense) to the idea that one should just "accept" P(ℕ) as an entirely understood entity (without any need for further reflection on it as a "potentially never ending but still conceivable process"), and that one should take it on similar grounds of certainty as ℕ (and, if further certainty is needed, go right back to ultrafinitist ideas). To me, ultrafinitist ideas are interesting ... but probably in a similar sense in which a physical computer can be thought of as a finite automaton.
For me at least, the whole point is "exactly" how one should think about P(ℕ). Predicative ideas do seem to try to address that. I have several differences and specific points in mind though (which could be suitable for a separate thread ... though I am not quite sure whether anyone would be interested).