Mike2
I don't understand why this is not more interesting. Does this not match the concept of entropy: events and interactions increase entropy, which means they reduce the amount of information in the world? Only those events which increase entropy can occur, at least on average.

Mike2 said: "...And if someone tries to intercept the message it produces more noise for any subsequent receiver."

What can that be except entropy? There is no process (of measurement) without causing an increase in entropy (a loss of information). So it would seem that the act of covertly gathering information means reducing the amount of information (increasing the entropy) in the rest of the signal. Which again leads me to ask: is entropy conserved, with noise appearing somewhere because information is gained somewhere else?
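To make that trade-off concrete, here is a minimal sketch of my own (a binary symmetric channel as an assumed toy model, not anything established in this thread): a uniform bit is sent through a channel that flips it with probability eps. As the noise grows, the equivocation H(X|Y) rises and the information the receiver can still extract, I(X;Y) = H(X) - H(X|Y), falls.

import math

def h2(p):
    # Binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Uniform binary source sent through a binary symmetric channel whose
# crossover probability eps plays the role of noise added by interception.
for eps in (0.0, 0.05, 0.2, 0.5):
    equivocation = h2(eps)            # H(X|Y): entropy left about X after seeing Y
    mutual_info = 1.0 - equivocation  # I(X;Y) for a uniform input bit
    print(f"noise={eps:.2f}  H(X|Y)={equivocation:.3f} bits  I(X;Y)={mutual_info:.3f} bits")

In this toy model the information lost to the receiver does not vanish so much as become inaccessible, which is one way of reading the "conserved" question.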
But we cannot know how the entropy has changed (increased or decreased) unless we make two measurements of a system and see how the state has changed. We take before and after readings; we measure the initial and final states. In making the first measurement, we gain information about the system. The system is then assumed to proceed in a manner of increasing entropy. Then we make the final measurement, and we expect that less information is available, since the entropy of the system has increased.
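As a rough sketch of that before-and-after picture (again just a toy model of my own, a discrete random walk standing in for whatever the real dynamics are), the Shannon entropy of the distribution grows between the two "measurements":

import numpy as np

def shannon_entropy(p):
    # Entropy in bits of a discrete distribution p (zero entries ignored).
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# "Initial measurement": the system is localized among a few states.
initial = np.zeros(101)
initial[48:53] = 0.2

# Crude diffusion on a ring: each step mixes a state with its neighbours,
# standing in for the assumption that the system evolves toward higher entropy.
state = initial.copy()
for _ in range(200):
    state = 0.5 * state + 0.25 * np.roll(state, 1) + 0.25 * np.roll(state, -1)

print("entropy at first measurement: ", shannon_entropy(initial))  # about 2.3 bits
print("entropy at second measurement:", shannon_entropy(state))    # noticeably larger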
How does the amount of available information that can be gained about a system affect the expectation values? Expectation values are probabilistic, and so is information. If a measurement randomizes the system, then you'd expect any measurement to make any subsequent measurement less accurate. You wouldn't know it was less accurate unless you also tried reversing the order of the two measurements. If the two measurements do not commute, then you'd expect there to be a limit to the combined accuracy of the two measurements. Is h-bar a measure of information? Any thoughts or corrections, gentlemen?
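On that last point, the standard statement of the limit for non-commuting measurements is the Robertson uncertainty relation (quoted here only as a reference point, not as an answer to whether h-bar "is" information):

\Delta A \,\Delta B \;\ge\; \frac{1}{2}\,\bigl|\langle[\hat A,\hat B]\rangle\bigr|,
\qquad
[\hat x,\hat p]=i\hbar \;\Rightarrow\; \Delta x\,\Delta p \ge \frac{\hbar}{2}.

So h-bar sets the scale of the unavoidable spread when the two measurements do not commute; whether that spread should be counted in bits is exactly the question being asked.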