PeterDonis said:
I don't see how anything you've said has anything to do with the information paradox. Can you clarify what you mean here?
I will try to explain briefly with some summarizing hints.
QM predicts quantum states (the connection to individual measurements is only probabilistic). The premises are initial conditions and timeless laws. From this it follows that, setting aside the COMPUTATIONAL TASK of actually executing the deduction, the future is equivalent to the past. So it's a "dead" system, and information is of course preserved. All we have are equivalence classes of histories, and the laws governing the flow of the quantum state are assumed timeless (this is just like in classical mechanics).
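To make that point concrete, here is a toy sketch (just an illustration, nothing deep): unitary evolution is formally reversible, so on paper the initial state is always recoverable from the final one; the catch, as I see it, is the computation needed to actually perform that recovery.

import numpy as np

# Toy illustration: unitary evolution is reversible, so "information is preserved"
# in the sense that the initial state can always be recomputed from the final one.
theta = 0.7
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # a simple (real) unitary, i.e. a rotation

psi0 = np.array([1.0, 0.0])       # initial state
psiT = U @ psi0                   # "future" state
recovered = U.conj().T @ psiT     # undo the evolution with U^dagger

print(np.allclose(recovered, psi0))   # True: past and future are equivalent "on paper"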
I am suggesting that computational processes and chaos are key players here. By this I don't mean human-made computers; I mean natural processing. You can consider the evolution of a physical system as a computation, or the decoding of the laws of nature from experimental data as a computation, or the scrambling of data in a black hole. After all, a REAL human-made computer is also a physical process, so this is just a generalisation of the computation concept. Information can be lost and then reconstructed given enough data and computational resources, so you need to account for TIME in order to talk about information (decoding speed, etc.).
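As a toy picture of why the reconstruction has a cost (I'm using the logistic map as a stand-in for "natural processing", nothing more): running the system forward is cheap, but retrodicting the initial data requires exponentially fine knowledge of the final state, and correspondingly more effort the further back you want to go.

import numpy as np

# Two initial conditions differing by 1e-12 end up macroscopically different
# after a few dozen iterations of a chaotic map, so reconstructing the initial
# information demands exponentially precise data about the final state.
def logistic(x, r=4.0, steps=50):
    traj = [x]
    for _ in range(steps):
        x = r * x * (1.0 - x)
        traj.append(x)
    return np.array(traj)

a = logistic(0.123456789)
b = logistic(0.123456789 + 1e-12)
print(np.abs(a - b)[::10])   # the tiny initial difference grows to order 1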
So my point is that randomness, chaos, and information content must be dependent on the observer, and on the observer's information-processing capacity and learning speed. And these parts are idealized away in QM. In fact, the "equivalence of future and past" in QM is worth nothing unless the computation is actually performed. Also, except perhaps in mathematics, I see no physical rationale behind concepts like "real randomness". If an observer cannot distinguish a signal from noise, it will be classified as noise, and in particular TREATED as noise. I.e. you will not "save noise data"; it will be discarded. So there are possible behavioural predictions from this. It also seems quite reasonable that the radiation from a LARGE black hole is far harder to decode than that from a microscopic black hole.
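Here is a crude analogy of what I mean by randomness being relative to the decoder (NOT meant as black hole physics, just a toy): a bit stream generated from a hidden seed looks like noise to an observer whose computational budget does not cover the seed space, while a bigger observer can decode it.

import random

# The same data is "noise" to one observer and "structure" to another,
# depending only on the computational budget spent on decoding it.
def make_stream(seed, n=64):
    rng = random.Random(seed)
    return [rng.randint(0, 1) for _ in range(n)]

hidden_seed = 31337
data = make_stream(hidden_seed)

def classify(data, budget):
    for guess in range(budget):          # brute-force search limited by the budget
        if make_stream(guess) == data:
            return f"structure found (seed={guess})"
    return "indistinguishable from noise"

print(classify(data, budget=1000))      # small observer: treats it as noise
print(classify(data, budget=100000))    # larger observer: decodes the structure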
The root cause here is the idea that the classical observer in quantum mechanics serves as a FIRM ground on which to FORMULATE the quantum theory. This was also the point of the founders, such as Bohr. MY point is that it is TOO firm, and thus blurs the observer-relative nature of randomness. "True randomness" would require a hypothetical infinite information-processing machinery to actually infer. We can easily "imagine" a classical observer to have this, and dismiss the rest as practical matters. But I strongly dislike this, and I think it is a deep mistake.
Of course these are not formal arguments; they solely serve to briefly convey (human-to-human) the connection I see to the information paradox. I.e. I THINK (I cannot prove it) that it makes no sense to talk about "no-hair" or perfect information preservation; we need to revise the theory to account for the actual computational limits. How this relates to physical parameters is a harder question, but there are already lots of papers in which black holes are considered "optimal scrambler" objects, etc. So without having answers, it seems the MASS for sure must constrain the computational power. A massive observer should at least have the physical possibility to "resolve" structure where a lighter observer responds by treating it as noise (and this can be OBSERVED, and VERIFIED by a third observer, so there is predictive potential here).
So to sum up, it seems the radiation from a BH might well be random relative to small orbiting observers, as they aren't able to decode it. But a large observer that can consume the black hole as it radiates away might possibly decode it. Idealisations in the calculations that ignore this remove my confidence in them.
Another thing related to this is that the various interesting dualities between theories that many people research, like AdS/CFT, typically have traits that relate to computational issues. Two dual theories have different computational complexity, so although they are in a sense equivalent, from the point of view of information processing one of them may be preferred. This is also why they are useful as mathematical tools: the dual theory "corresponds" to a different way of calculating the same thing that is easier. One might think that this is just a mathematical curiosity, but I do not think so. The computational requirements have everything to do with physical processes in nature.
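A loose everyday analogy of "equivalent descriptions with very different computational cost" (my reading of that point, not a statement about AdS/CFT itself): computing a convolution directly versus going through its Fourier-side "dual" gives the same answer, but the second route is much cheaper for large inputs.

import numpy as np

# Direct convolution is O(n^2); the Fourier-side route is O(n log n),
# yet both compute exactly the same quantity.
x = np.random.rand(256)
y = np.random.rand(256)

direct = np.convolve(x, y)                        # "position-side" calculation

n = len(x) + len(y) - 1
dual = np.fft.irfft(np.fft.rfft(x, n) * np.fft.rfft(y, n), n)   # "Fourier-side" calculation

print(np.allclose(direct, dual))                  # True: same result, different cost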
/Fredrik