Our attempts at certainty (measurements) actually increase uncertainty

In summary, Landauer's principle says that the energy required to add, delete, or alter one bit of information is at least kT ln(2). This means that all increases in entropy increase uncertainty. Furthermore, each measurement increases entropy, possibly bringing us closer to the heat death (big freeze) of the universe.
  • #1
jgreavu
Here is my theory:

- The measurement of an observable by an observer is a thermodynamically irreversible process. All thermodynamically irreversible processes increase total system entropy.

- Landauer's principle (experimentally confirmed in 2012) says that the energy required to add, delete, or alter one bit of information is at least kT ln(2), where k is Boltzmann's constant and T is the temperature.

- The natural log of a number equals ln(2) times the binary logarithm of that number: ln(x) = ln(2) · log₂(x). Since binary logarithms (of the number of possible states) are how we measure bits of information, we can say there is a direct relationship between information and entropy.

- When I say information I mean the number of bits still needed to completely describe a system, so from here on out I'll just call it uncertainty. I'll suggest the following relationship (with a quick numerical sanity check below):

U ≈ S / (k ln(2))

where U is the uncertainty, the number of bits still needed for the observer to fully describe the observed system, and S is the entropy.
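
As that sanity check (just a toy illustration of my own, not from any paper): for a system of N equally likely microstates, Boltzmann's formula gives S = k ln(N), and the relationship above then gives U = log₂(N) bits exactly, which is the number of bits needed to pick out one microstate.

[code]
import math

# Toy check of the proposed relation U = S / (k ln 2) for a system of
# N equally likely microstates. Illustration only, not a derivation.
k = 1.380649e-23  # Boltzmann constant, J/K

for N in (2, 8, 1024):
    S = k * math.log(N)        # Boltzmann entropy S = k ln(N)
    U = S / (k * math.log(2))  # proposed uncertainty in bits
    print(f"N = {N:4d}: U = {U:.6f} bits, log2(N) = {math.log2(N):.6f}")
# For equiprobable microstates, U equals log2(N) exactly.
[/code]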

This means that all increases in entropy increase uncertainty. Therefore, in our attempts to gain certainty about the world by taking measurements, we are actually increasing our uncertainty about it.

Furthermore, each measurement increases entropy, possibly bringing us closer to the heat death (big freeze, really) of the universe. Increases in entropy accompany expansion or temperature decreases in the universe. I remember reading something about how measuring the dark energy density actually increased the accelerating expansion of the universe.

In the context of many-worlds, perhaps each measurement spawns new universes and this is responsible for the increase in entropy/uncertainty?

I'm looking for holes and constructive criticism. I'd really appreciate any comments and questions.
 
  • #2
First off, welcome to the forum. Secondly, the forum does not support personal theories. That being said: the many-worlds interpretation is just that, an interpretation, and one that remains unsupported by any observational evidence beyond that of ordinary quantum mechanics. Entropy does not necessarily imply disorder; entropy counts the number of degrees of freedom a system has, and its usage in thermodynamics is precise. Count the number of particle species, including their antiparticles, and tally up the degrees of freedom due to the spin of each species; this gives you the entropy density. Take photons, for example: the photon is its own antiparticle, and with its two polarization states it contributes 2 degrees of freedom. Photons, like other bosons, are described by the Bose-Einstein distribution; the Fermi-Dirac distribution is for fermions.
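
For example, a quick tally in that spirit (my own sketch; the species list is just an illustrative subset, and the 7/8 weight is the standard fermion factor in the entropy density):

[code]
# Tally of relativistic degrees of freedom: count each species' internal
# states (spin/polarization, particle and antiparticle separately);
# fermions carry the standard 7/8 weight in the entropy density.
species = [
    # (name, internal states g, is_fermion)
    ("photon",                   2, False),  # own antiparticle, 2 polarizations
    ("electron + positron",      4, True),   # 2 spin states x 2
    ("3 neutrinos + 3 antinus",  6, True),   # 1 helicity state each
]

g_star = sum(g * (7 / 8 if fermion else 1) for _, g, fermion in species)
print("effective entropy degrees of freedom:", g_star)  # 2 + (7/8)*10 = 10.75
[/code]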

As you can see, entropy has a specific use in thermodynamics. It is used in a similar manner in Landauer's theory. Rolf Landauer recognized that it is the logically irreversible erasure of information, stored physically in the information-bearing degrees of freedom of a memory register or computer, that necessitates a corresponding entropy increase in the environment or in the non-information-bearing degrees of freedom. This is different from what you described.
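
To make "logically irreversible" concrete (a toy sketch of my own): erasure maps both bit values to the same output, so the operation cannot be undone from its output alone, and the lost distinction has to show up as entropy somewhere else.

[code]
# Logical irreversibility of erasure: a reset-to-zero maps both input
# states to the same output, so the input cannot be recovered.
def erase(bit):
    return 0  # 0 -> 0 and 1 -> 0

inputs = [0, 1]
outputs = {erase(b) for b in inputs}
print(len(inputs), "input states ->", len(outputs), "output state")
# Two microstates compressed into one: the corresponding ln(2) of entropy
# must appear in the environment. That is Landauer's point.
[/code]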

At inverse temperature [tex]\beta[/tex], the entropy increase causes heat

[tex]\Delta Q \geq \frac{\Delta S}{\beta}[/tex]

where [tex]\Delta S[/tex] is the change in entropy.
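
Putting numbers in (my own quick check, with the convention that entropy is measured in nats, so erasing one bit means ΔS = ln 2, and β = 1/(kT)):

[code]
import math

# Landauer bound: minimum heat dissipated erasing one bit at room temperature.
# Convention assumed here: entropy in nats, beta = 1/(kT).
k = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0         # room temperature, K
beta = 1.0 / (k * T)

dS = math.log(2)    # entropy increase for erasing one bit, in nats
dQ_min = dS / beta  # equals k*T*ln(2)
print(f"Minimum heat to erase one bit at {T} K: {dQ_min:.3e} J")
# ~2.9e-21 J, many orders of magnitude below what real electronics dissipate.
[/code]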

It might help if you look closely at both Landauer's principle and the many-worlds interpretation, and fully understand what they are saying, before developing a personal theory based on them.

I might also add that Landauer's theory is highly controversial, as is the many-worlds interpretation as it applies to multiverse theories.

For example, the above is based on the first article I pulled up on the subject; some of the controversies are mentioned in it:
http://arxiv.org/pdf/1306.4352v2.pdf
 
  • #3
Thanks for the welcome. Sorry, I didn't know the rule, but that's not an excuse either.

What about BICEP2's recent discovery? Doesn't that offer a bit of support? Or the mathematical properties of eigenvalues? And doesn't many-worlds require fewer assumptions than Copenhagen? Not to mention that many credible theorists believe it.

Physics will go nowhere if controversial ideas are immediately shot down.

I never said a thing about disorder; disorder is a weird word. I said uncertainty. The number of degrees of freedom is the number of possible states, and the more states a system has, the more bits are required to describe it.

You must also know that that property of logarithms is true, and that information in bits is described by binary logarithms.
 
  • #4
We actually don't shoot down theories that have no evidence against them; any theory is considered valid until proven wrong. That's the scientific method. However, the forum only allows discussion of theories with supporting, peer-reviewed articles. This is to prevent people from coming here and throwing any old theory at us, even if it is not scientifically accurate. Discussion of a peer-reviewed paper is always allowed.
 
  • #5
Here it is:

Landauer's principle: ΔQ ≥ kT ln(2)

S (for an irreversible process) is proportional to ΔQ/T

ln(x) = ln(2) · log₂(x)

S(x) is also proportional to the natural log of the number of possible states, i.e. degrees of freedom

U(x) = uncertainty, the number of bits still needed to describe the system = log₂(number of possible states)

Therefore,

S(x) ∝ k ln(x) ∝ k ln(2) · U(x)

U(x) ∝ S(x) / (k ln(2))
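
And if anyone doubts that log identity, here it is checked numerically (plain Python, nothing fancy):

[code]
import math

# Check the identity ln(x) = ln(2) * log2(x) for a few values.
for x in (2.0, 10.0, math.pi, 1e6):
    lhs = math.log(x)
    rhs = math.log(2) * math.log2(x)
    print(f"x = {x}: ln(x) = {lhs:.12f}, ln(2)*log2(x) = {rhs:.12f}")
# The two columns agree to floating-point precision.
[/code]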

Prove my math wrong.

And read the 2012 article (published in Nature) which confirmed Landauer's principle.

Sorry to be brash; I'm just frustrated that no one will hear me out on this, and the only response I've heard so far is basically "that's controversial."
 
  • #7
Sorry, I have no idea how to typeset equations. I am quite young and new to this, but I don't want that to instantly discredit me in any way.
 
  • #8
Post the peer-reviewed paper and I'll be happy to read it. I read numerous articles, even ones I don't necessarily agree with. That reading has taught me that straight mathematics is capable of any interpretation; that doesn't necessarily mean a theory is correct. If you wish to pursue your theory, all the power to you; however, you need to look at the full metrics involved. Just a suggestion. For example, you're not using the von Neumann entropy of a quantum state ρ.
 
  • #9
I know I'm not... that's why I carefully said "proportional to", to spare myself mathematics that is more complicated but unnecessary for a sufficient argument. I posted the paper above.
 
  • #10
Which is still S = -tr(ρ ln ρ), where ρ is the density matrix. The resulting entropies are positive, so that tells you it is still proportional to the log of the probabilities of the states.
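
For instance (a minimal numerical sketch of my own): the maximally mixed qubit ρ = I/2 gives S = ln(2) nats, i.e. exactly one bit, while a pure state gives zero.

[code]
import numpy as np

# Von Neumann entropy S = -tr(rho ln rho), computed from the eigenvalues
# of the density matrix (0 * ln 0 is taken as 0).
def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop (numerically) zero eigenvalues
    return float(-np.sum(evals * np.log(evals)))

rho_mixed = np.eye(2) / 2                      # maximally mixed qubit
rho_pure = np.array([[1.0, 0.0], [0.0, 0.0]])  # pure state |0><0|

print(von_neumann_entropy(rho_mixed))  # ln 2 ~ 0.693 nats = 1 bit
print(von_neumann_entropy(rho_pure))   # 0.0: a pure state is fully known
[/code]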
 
  • #11
More broadly:
The increase in uncertainty from taking very accurate measurements is a common effect encountered in the study of inverse problems. Even if the noise in the measured data is very small, inverting the forward model to reconstruct the underlying quantity can amplify that noise into a large reconstruction error. It's just what happens when you invert ill-conditioned matrices.

It happens in everything; the effect is purely statistical. It need not cause a physical variation that would be seen in Nature, but it would be no surprise to see the effect show up in thermodynamics (a model grounded in statistics) in some situations.

It is certainly not a surprise to link "information" and "thermodynamics".
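
A toy demonstration of that noise amplification (my own, with a deliberately ill-conditioned matrix; the numbers are arbitrary):

[code]
import numpy as np

rng = np.random.default_rng(0)

# Forward problem y = A x with an ill-conditioned A: naively inverting it
# amplifies even tiny noise in y by roughly the condition number of A.
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])  # nearly singular on purpose
x_true = np.array([1.0, 2.0])
y = A @ x_true

noise = 1e-4 * rng.standard_normal(2)  # tiny measurement noise
x_rec = np.linalg.solve(A, y + noise)  # naive reconstruction

print("condition number of A:", np.linalg.cond(A))  # ~4e4
print("relative noise in data:", np.linalg.norm(noise) / np.linalg.norm(y))
print("relative error in reconstruction:",
      np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
[/code]

The data noise is tiny, but the reconstruction error comes out of order one: that is exactly the "measurement increases uncertainty" effect in statistical dress.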
 
  • #12
jgreavu said:
Sorry, I have no idea how to typeset equations. I am quite young and new to this, but I don't want that to instantly discredit me in any way.

I'm not, lol, trust me on that. It helps to work from a posted article; I still struggle with the LaTeX myself.

Here is how to post LaTeX:
https://www.physicsforums.com/showpost.php?p=3977517&postcount=3

By the way, I'm not about to buy a Nature subscription to see the full article; let's look for an equivalent on arXiv, since it's free.
 
  • #13
Simon, thank you. Can you expand on this any more? I find this supremely fascinating.
 
  • #14
jgreavu said:
Here is my theory:

Welcome to PF!

We do not discuss or critique personal theories here. For more details, click the "Site Info" link at the top of any page here, and choose "Rules & Guidelines."
 
  • #15
Just a reminder to everyone: when a post doesn't meet the rules, don't respond; report it, OK? You all know better than this. This thread should have ended at the first post.
 

Related to Our attempts at certainty (measurements) actually increase uncertainty

What is the meaning of "Our attempts at certainty (measurements) actually increase uncertainty"?

This statement refers to the fact that when we try to measure or quantify something, we introduce a level of uncertainty into the process. This is because there are inherent limitations and errors in our measurement tools and techniques, which can affect the accuracy and precision of our results.

Why do our attempts at certainty result in increased uncertainty?

This is because measurement involves taking a sample or approximation of something that is not entirely quantifiable. As a result, there will always be some degree of uncertainty associated with the measurement, as it is impossible to capture the entirety of the phenomenon being measured.

How does uncertainty in measurements affect scientific research?

Uncertainty in measurements can have a significant impact on scientific research, as it can affect the accuracy and reliability of the results. If the uncertainty is not properly accounted for, it can lead to incorrect conclusions and potentially invalid or unreliable findings.

What are some factors that contribute to uncertainty in measurements?

There are several factors that can contribute to uncertainty in measurements, including limitations of measurement tools and techniques, human error, and natural variability in the phenomenon being measured. Additionally, external factors such as environmental conditions can also impact the accuracy of measurements.

How can scientists account for uncertainty in their measurements?

Scientists can account for uncertainty in their measurements by using statistical analysis and error calculations, as well as conducting multiple measurements and averaging the results. It is also important for scientists to clearly communicate the level of uncertainty associated with their measurements in order to accurately interpret and draw conclusions from their data.
