# Information from uncertainty

1. Jul 23, 2014

### friend

In a communication channel, the time-bandwidth product limits the amount of information that can pass through the channel. The math behind Heisenberg's uncertainty principle is very similar to the math behind the time-bandwidth product. I wonder if this allows us to discover by analogy whether there is some underlying limit to information in a quantum system, and whether it tells us what the physical analog of a bit would be.
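As a quick numerical illustration of that similarity (a sketch assuming numpy; the pulse width and sampling grid here are arbitrary choices), a Gaussian pulse saturates the time-bandwidth bound σ_t·σ_f = 1/(4π), which is the direct signal-processing analog of Δx·Δp = ħ/2:

```python
import numpy as np

# Gaussian pulse sampled finely on a wide window
dt = 0.001
t = np.arange(-10, 10, dt)
sigma_t_target = 0.5
s = np.exp(-t**2 / (4 * sigma_t_target**2))

# RMS width of the normalized intensity |s(t)|^2 in time
p_t = np.abs(s)**2
p_t /= p_t.sum()
sigma_t = np.sqrt(np.sum(p_t * t**2) - np.sum(p_t * t)**2)

# Spectrum via FFT; frequency axis in cycles per unit time
S = np.fft.fftshift(np.fft.fft(s))
f = np.fft.fftshift(np.fft.fftfreq(len(t), d=dt))
p_f = np.abs(S)**2
p_f /= p_f.sum()
sigma_f = np.sqrt(np.sum(p_f * f**2) - np.sum(p_f * f)**2)

print(sigma_t * sigma_f, 1 / (4 * np.pi))  # both ≈ 0.0796
```

For non-Gaussian pulses the product comes out strictly larger than 1/(4π), just as non-minimum-uncertainty wavefunctions exceed ħ/2.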

2. Jul 24, 2014

### friend

I'm a little confused, because I'm used to thinking of information as associated with probability distributions. So I can understand information being linked with Heisenberg's uncertainty principle, since that involves the variance of the position and the variance of the momentum. These variances are taken with respect to the probability distributions for measuring the particle's position and momentum. So we're speaking of probability distributions, from which information can be derived.

But I don't understand what a pulse width or a bandwidth has to do with a probability distribution, in order to calculate its information content. Yet, if I'm not mistaken, this is the context in which information-theoretic concepts were derived.
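One way to make that connection concrete (a sketch assuming numpy; the Gaussian pulse and its width are arbitrary illustrative choices): treat the normalized energy density |s(t)|² as a probability density in time and |S(f)|² as one in frequency. Hirschman's entropic uncertainty relation then bounds the sum of their differential entropies from below by ln(e/2), with equality for Gaussians:

```python
import numpy as np

def diff_entropy(p, dx):
    """Differential entropy (in nats) of a sampled density."""
    p = p / (p.sum() * dx)           # normalize to unit area
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask])) * dx

dt = 0.001
t = np.arange(-10, 10, dt)
sigma_t = 0.5
s = np.exp(-t**2 / (4 * sigma_t**2))    # Gaussian pulse amplitude

# Normalized |s(t)|^2 treated as a probability density in time
h_t = diff_entropy(np.abs(s)**2, dt)

# Same for the normalized energy spectral density |S(f)|^2
S = np.fft.fftshift(np.fft.fft(s))
f = np.fft.fftshift(np.fft.fftfreq(len(t), d=dt))
df = f[1] - f[0]
h_f = diff_entropy(np.abs(S)**2, df)

# Hirschman's entropic uncertainty: h_t + h_f >= ln(e/2) ≈ 0.307,
# with equality for Gaussian pulses
print(h_t + h_f, np.log(np.e / 2))
```

So a pulse width really is the spread of a probability distribution (the normalized energy density), which is why Shannon-style entropies apply to it directly.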

Any insight into all this would be appreciated.

Last edited: Jul 24, 2014
3. Jul 24, 2014

### atyy

The Heisenberg uncertainty principle is exactly the same as the time-bandwidth relation in signal processing. The Wigner function in quantum mechanics is related to the Wigner-Ville distribution and the Cohen class of time-frequency distributions in signal processing.

There are quantum analogues of the Shannon mutual information and other classical information-theoretic concepts. The quantum mutual information is related to the entanglement entropy at zero temperature.
http://arxiv.org/abs/quant-ph/0102094
http://arxiv.org/abs/1106.1445

4. Jul 25, 2014

### friend

Thank you for sharing. The papers you reference, though, seem to be interested only in information capacity and processing. I think it may go deeper than that, and perhaps more directly.

For example, I have to wonder if the physical analog of a channel's information-capacity limit might ultimately be responsible for the speed of light/information. Some people think that gravity may be an emergent entropic property. And so far, it seems the most relevant connection between information and physics is the similarity between the pulse-width-bandwidth product and the Heisenberg uncertainty principle. Does anyone know of any papers that explore this connection more directly?
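For reference, the classical capacity limit in question is the Shannon-Hartley theorem, C = B·log₂(1 + S/N). A minimal sketch (the 3 kHz bandwidth and 30 dB SNR are just the textbook telephone-channel example):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley limit: max error-free bit rate of an AWGN channel."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3 kHz telephone-grade channel at 30 dB SNR
snr = 10 ** (30 / 10)                 # 30 dB -> linear power ratio of 1000
print(shannon_capacity(3000, snr))    # ≈ 29900 bit/s
```

This is a rate limit, not a propagation-speed limit, which is part of why connecting it to the speed of light would need something extra, such as the Lieb-Robinson-type bounds discussed below.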

5. Jul 25, 2014

### atyy

http://arxiv.org/abs/quant-ph/0603121 or http://arxiv.org/abs/1309.2308? The "Lieb-Robinson bound" is an emergent "speed limit", like the speed of light.

Last edited: Jul 25, 2014
6. Jul 26, 2014

### friend

Yes, it seems spacetime is the medium in which physical events happen, and describing those events requires information. Some think that physical reality comes from pure information; they say "it from bit", etc. And I'm beginning to wonder about this as well.

So perhaps spacetime is analogous to memory, and particle arrangements and interactions are analogous to the information stored in that memory. This brings me to wonder why spacetime would be created to begin with. And I'm reminded, if I recall correctly, that the act of erasing memory in and of itself increases entropy (Landauer's principle). But erasing memory in effect creates more storage capacity in which other information can be stored.

So is the expansion of space (during inflation, and now the accelerating expansion) a necessary process for increasing entropy? Perhaps if nothing else is happening to increase entropy, then space expands to increase it. I'm not sure how to go about researching these ideas. Are there any papers out there that incorporate all this? Thanks.
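The erasure cost mentioned above is Landauer's principle: erasing one bit must dissipate at least k_B·T·ln 2 of heat into the environment. A minimal sketch (the 300 K room-temperature figure is just an illustrative choice):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in SI since 2019)

def landauer_limit(temp_kelvin, n_bits=1):
    """Minimum heat dissipated by erasing n_bits, per Landauer's principle."""
    return n_bits * k_B * temp_kelvin * math.log(2)

# Erasing one bit at room temperature (300 K)
print(landauer_limit(300))    # ≈ 2.87e-21 J
```

The corresponding entropy increase of the environment is at least k_B·ln 2 per erased bit, which is the sense in which clearing memory "pays" for new storage capacity with entropy.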

7. Jul 28, 2014

### friend

As an interim question: does the pulse-width-bandwidth product imply a propagation velocity for the signals that carry the information down the channel?

8. Aug 17, 2014

### friend

9. Aug 17, 2014