What is the norm of an operator?

In summary, the conversation discusses the similarities between information capacity limits in communication channels and the Heisenberg uncertainty principle in quantum mechanics. The concept of information in physics is explored, including the quantum analogues of classical information theoretic concepts and the idea that spacetime may be analogous to memory. The expansion of space is also considered as a means to increase entropy. The conversation also briefly touches on the idea of a "norm" of an operator in relation to Hermitian operators.
  • #1
friend
In a communication channel there is the time-bandwidth product limiting the amount of information that can pass through the channel. The math for Heisenberg's Uncertainty Principle is very much similar to the math for the time-bandwidth product. I wonder if this allows us to discover by analogy if there is some underlying limit to information in a quantum system. And I wonder if this tells us what would be the analog to a bit in physics.
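The parallel can be checked numerically. A quick sketch (my own toy example, not from the thread): a Gaussian pulse saturates the time-bandwidth bound Δt·Δf ≥ 1/(4π), the formal twin of Δx·Δp ≥ ħ/2, where Δt and Δf are RMS widths of |x(t)|² and |X(f)|² treated as probability densities.

```python
import numpy as np

# Toy check (not from the thread): a Gaussian pulse saturates the
# time-bandwidth bound dt * df >= 1/(4*pi), the signal-processing
# twin of Heisenberg's dx * dp >= hbar/2.

n = 8192
t = np.linspace(-100.0, 100.0, n)
step = t[1] - t[0]
sigma = 2.0
x = np.exp(-t**2 / (4 * sigma**2))       # Gaussian pulse envelope

def rms_width(axis, density):
    """RMS width of |signal|^2 treated as a probability density."""
    p = density / density.sum()
    mean = (axis * p).sum()
    return np.sqrt(((axis - mean) ** 2 * p).sum())

dt = rms_width(t, np.abs(x) ** 2)        # RMS duration

f = np.fft.fftshift(np.fft.fftfreq(n, d=step))
X = np.fft.fftshift(np.fft.fft(x))
df = rms_width(f, np.abs(X) ** 2)        # RMS bandwidth

print(dt * df, 1 / (4 * np.pi))          # product sits at the lower bound
```

Any non-Gaussian pulse shape gives a strictly larger product, exactly as any non-Gaussian wavefunction gives Δx·Δp > ħ/2.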
 
  • #2
I'm a little confused, because I'm used to thinking of information as associated with probability distributions. So I can understand information being linked with Heisenberg's uncertainty principle, since it involves the variance of position and the variance of momentum. These variances are with respect to the probabilities of measuring the particle's position and momentum, so we're speaking of probability distributions from which information can be derived.

But I don't understand what a pulse width or a bandwidth has to do with a probability distribution when it comes to calculating its information content. Yet, if I'm not mistaken, this is the context in which information-theoretic concepts were derived.

Any insight into all this would be appreciated.
 
  • #3
The Heisenberg uncertainty principle is exactly the same as the time-bandwidth relation in signal processing. The Wigner function in quantum mechanics is related to the Wigner-Ville and Cohen class transforms in signal processing.

There are quantum analogues of the Shannon mutual information and other classical information-theoretic concepts. The quantum mutual information is the entanglement entropy at zero temperature.
http://arxiv.org/abs/quant-ph/0102094
http://arxiv.org/abs/1106.1445
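To make the first claim concrete, here is a minimal numpy sketch (mine, not from the cited papers) computing the quantum mutual information I(A:B) = S(A) + S(B) − S(AB) for a two-qubit Bell state. For a pure entangled state S(AB) = 0 and S(A) = S(B) = ln 2 is the entanglement entropy, so I(A:B) = 2 ln 2.

```python
import numpy as np

# Minimal sketch (not from the cited papers): quantum mutual information
# I(A:B) = S(A) + S(B) - S(AB) for the Bell state (|00> + |11>)/sqrt(2).

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), in nats."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]           # convention: 0 * log 0 = 0
    return float(-np.sum(evals * np.log(evals)))

def partial_trace(rho, keep):
    """Reduced 2x2 state of one qubit from a 4x4 two-qubit density matrix."""
    r = rho.reshape(2, 2, 2, 2)            # indices: a, b, a', b'
    if keep == 0:
        return np.einsum('abcb->ac', r)    # trace out qubit B
    return np.einsum('abad->bd', r)        # trace out qubit A

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_ab = np.outer(bell, bell.conj())

s_a = von_neumann_entropy(partial_trace(rho_ab, 0))
s_b = von_neumann_entropy(partial_trace(rho_ab, 1))
s_ab = von_neumann_entropy(rho_ab)
mutual_info = s_a + s_b - s_ab
print(mutual_info, 2 * np.log(2))          # 2 ln 2: one "ebit" of correlation
```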
 
  • #4
atyy said:
There are quantum analogues of the Shannon mutual information and other classical information-theoretic concepts. The quantum mutual information is the entanglement entropy at zero temperature.
http://arxiv.org/abs/quant-ph/0102094
http://arxiv.org/abs/1106.1445

Thank you for sharing. The papers you reference seem interested only in information capacity and processing, though. I think the connection may go deeper than that, and perhaps be more direct.

For example, I have to wonder if the physical analog of the information capacity limits of a channel might ultimately be responsible for the speed of light/information. Some people think that gravity may be an emergent, entropic property. And so far, the most relevant connection between information and physics seems to be the formal similarity between the time-bandwidth product and the Heisenberg uncertainty principle. Does anyone know of any papers that explore this connection more directly?
 
  • #5
friend said:
For example, I have to wonder if the physical analog of information capacity limits of a channel might ultimately be responsible for the speed of light/information.

http://arxiv.org/abs/quant-ph/0603121 or http://arxiv.org/abs/1309.2308? The "Lieb-Robinson bound" is an emergent "speed limit", like the speed of light.
 
  • #6
Yes, it seems spacetime is the medium in which physical events happen, and describing those events requires information. Some think physical reality comes from pure information; they say "it from bit," etc. And I'm beginning to wonder about this as well.

So perhaps spacetime is analogous to memory, and particle arrangements and interactions are analogous to the information stored in that memory. This brings me to wonder why spacetime would be created to begin with. And I'm reminded, if I recall correctly, that the act of erasing memory in and of itself increases entropy. But erasing memory, in effect, creates more storage capacity in which other information can be stored.
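The erasure point is Landauer's principle: erasing one bit dissipates at least k_B·T·ln 2 of heat into the environment, raising its entropy by at least k_B·ln 2. At room temperature (assumed 300 K here) that minimum is tiny:

```python
import math

# Landauer's bound: erasing one bit costs at least k_B * T * ln(2) in heat.
k_B = 1.380649e-23        # Boltzmann constant, J/K (exact SI value)
T = 300.0                 # room temperature in kelvin (assumed)
e_min = k_B * T * math.log(2)
print(e_min)              # about 2.87e-21 joules per erased bit
```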

So is the expansion of space (during inflation, and now the accelerating expansion) a necessary process to increase entropy? Perhaps if nothing else is happening to increase entropy, space expands to increase it. I'm not sure how to go about researching these ideas. Are there any papers out there that incorporate all this? Thanks.
 
  • #7
friend said:
For example, I have to wonder if the physical analog of information capacity limits of a channel might ultimately be responsible for the speed of light/information.

As an interim calculation, does the time-bandwidth product imply a propagation velocity for the signals that carry the information down the channel?
 
What is "information from uncertainty"?

"Information from uncertainty" is a concept in science that refers to the use of data and evidence to make informed conclusions in situations where there is inherent uncertainty or unpredictability. This can include fields such as statistics, computer science, and even quantum mechanics.

Why is "information from uncertainty" important?

Uncertainty is a fundamental aspect of the natural world and cannot be completely eliminated. Therefore, understanding how to extract information and make decisions from uncertain data is crucial in many scientific fields and real-world applications. Additionally, information from uncertainty can lead to new discoveries and advancements in various fields.

What are some common methods for obtaining information from uncertainty?

Some common methods for obtaining information from uncertainty include statistical analysis, mathematical modeling, and machine learning. These techniques involve using data and evidence to make predictions and decisions, even when there is a level of uncertainty involved.
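As a toy illustration of the statistical-analysis case (invented numbers, standard library only): repeated noisy measurements of an unknown quantity still yield a sharp estimate, with a confidence interval that narrows as data accumulates.

```python
import random
import statistics

# Toy illustration of "information from uncertainty": noisy measurements of
# an unknown quantity still pin it down. All values here are made up.

random.seed(0)
true_value = 3.7
samples = [true_value + random.gauss(0, 0.5) for _ in range(400)]

mean = statistics.fmean(samples)
sem = statistics.stdev(samples) / len(samples) ** 0.5    # standard error
lo, hi = mean - 1.96 * sem, mean + 1.96 * sem            # ~95% interval

print(f"estimate {mean:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

Quadrupling the sample size halves the interval width, which is one precise sense in which more data extracts more information from an uncertain source.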

How does "information from uncertainty" relate to risk management?

Risk management involves identifying and assessing potential risks and uncertainties in a given situation and making decisions to minimize or mitigate those risks. "Information from uncertainty" plays a crucial role in risk management by providing tools and techniques to analyze and make decisions in uncertain situations.

Can "information from uncertainty" be applied to everyday life?

Yes, "information from uncertainty" can be applied to everyday life in various ways. For example, it can help individuals make decisions when faced with uncertain information, such as investment decisions or health-related choices. It is also used in fields such as weather forecasting, which affects our daily lives.
