
Our attempts at certainty (measurements) actually increase uncertainty

  1. Apr 27, 2014 #1
    Here is my theory:

    - The measurement of an observable by an observer is a thermodynamically irreversible process. All thermodynamically irreversible processes increase total system entropy.

    - Landauer's principle (experimentally confirmed in 2012) says that adding, deleting, or altering 1 bit of information requires a minimum energy of k*T*ln(2), where k is Boltzmann's constant and T is temperature.

    - The natural log of a number is equal to the natural log of 2 times the binary logarithm of that number. Since binary logarithms (with the argument being the number of possible states) are how we measure bits of information, we can say that there is a relationship between information and entropy.

    - When I say information I mean the number of bits still needed to completely describe a system, so from here on out I'll just start calling it uncertainty. I'll suggest the following relationship:

    U ≈ S / (k ln 2)

    where U is the uncertainty: the number of bits still needed for the observer to fully describe the observable system and S is entropy.

    This means that all increases in entropy increase uncertainty. Therefore in our attempts to gain certainty about the world, taking measurements, we are actually increasing our uncertainty about it.
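    The pieces above can be sketched in plain Python. This is only an illustration of the proposed relation and the log identity, with the Landauer bound evaluated at an assumed room temperature of 300 K (the temperature and the 2**10-state toy system are my choices, not anything from the thread):

```python
import math

k_B = 1.380649e-23               # Boltzmann constant, J/K
T = 300.0                        # assumed room temperature, K

# Landauer bound: minimum energy to erase one bit at temperature T
E_min = k_B * T * math.log(2)
print(f"Landauer bound at 300 K: {E_min:.3e} J")

# the log identity: ln(x) = ln(2) * log2(x)
assert math.isclose(math.log(1024), math.log(2) * math.log2(1024))

# proposed relation U = S / (k ln 2): a system with 2**10 equally
# likely microstates has Boltzmann entropy S = k ln(2**10), which
# converts back to 10 bits of uncertainty
S = k_B * math.log(2 ** 10)
U = S / (k_B * math.log(2))
print(f"U = {U:.1f} bits")
```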

    Furthermore, we are increasing entropy and possibly bringing the universe closer to its heat death (big freeze, really) with each measurement. Increases in entropy accompany the expansion and cooling of the universe. I do remember reading something about how measuring the dark energy density supposedly increased the accelerating expansion of the universe.

    In the context of many-worlds, perhaps each measurement spawns new universes and this is responsible for the increase in entropy/uncertainty?

    I'm looking for holes and constructive criticism. I'd really appreciate any comments and questions.
     
    Last edited: Apr 27, 2014
  3. Apr 27, 2014 #2
    First off, welcome to the forum. Secondly, the forum does not support personal theories. That being said: the many-worlds interpretation is just that, an interpretation, one that remains unsupported by any form of observational evidence other than the quantum-mechanical phenomena it was built to explain. Entropy does not necessarily imply disorder; entropy counts the number of degrees of freedom a system has. Its usage in thermodynamics is accountable: count the number of particle species, including their antiparticles, and tally up the number of degrees of freedom due to the spin of each particle species. This will give you the entropy density. Take photons, for example: the photon is its own antiparticle, and with spin 1 this gives an entropy of 2. Their distribution is described by the Bose-Einstein distribution, as with other bosons; the Fermi-Dirac distribution is for fermions.
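    As a toy version of that tally (the species list and the conventional 7/8 weighting for fermionic degrees of freedom are my assumptions for illustration, not something stated in the thread):

```python
# toy tally of relativistic degrees of freedom; the species list and
# the conventional 7/8 weighting for fermions are assumptions here
species = [
    ("photon", 2, "boson"),      # its own antiparticle, 2 polarization states
    ("electron", 2, "fermion"),  # 2 spin states
    ("positron", 2, "fermion"),  # the electron's antiparticle
]
g_eff = sum(g if kind == "boson" else 7 / 8 * g for _, g, kind in species)
print(g_eff)  # 2 + (7/8) * 4 = 5.5
```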

    As you can see, entropy in thermodynamics has a specific use. It is also used in a similar manner in Landauer's theory. Rolf Landauer recognized that it is the logically irreversible erasure of information, stored physically in the information-bearing degrees of freedom of a memory register or computer, that necessitates a corresponding entropy increase in the environment or in the non-information-bearing degrees of freedom. This is different from what you described.

    At inverse temperature β, the entropy increase causes heat:

    [tex]\Delta Q \geq \frac{\Delta S}{\beta}[/tex] where [tex]\Delta S[/tex] is the change in entropy
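    If β = 1/(kT) and the entropy change is the dimensionless ln 2 for erasing one bit (the convention I'm assuming the linked paper uses), this bound reduces to the familiar kT ln 2:

```python
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # assumed temperature, K
beta = 1.0 / (k_B * T)      # inverse temperature

dS = math.log(2)            # dimensionless entropy change for one erased bit
dQ_min = dS / beta          # minimum heat from Delta Q >= Delta S / beta

# this is exactly the Landauer bound k_B * T * ln 2
assert math.isclose(dQ_min, k_B * T * math.log(2))
```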

    It might help if you truly looked at both the Landauer and many-worlds frameworks and fully understood what they are saying before developing a personal theory based on them.

    I might also add that Landauer's theory is highly controversial, as is the many-worlds interpretation as it applies to multiverse theories.

    For example, the above is based on the first article I pulled up on it; some of the controversies are mentioned in it.
    http://arxiv.org/pdf/1306.4352v2.pdf
     
    Last edited: Apr 27, 2014
  4. Apr 27, 2014 #3
    Thanks for the welcome. Sorry, I didn't know the rule, but that's no excuse either.

    What about BICEP2's recent discovery? Doesn't that offer a bit of support? Or the mathematical properties of eigenvalues? And that many-worlds requires fewer assumptions than Copenhagen? Not to mention that many credible theorists believe it.

    Physics will go nowhere if controversial ideas are immediately shot down.

    I never said a thing about disorder; disorder is a weird word. I said uncertainty. The number of degrees of freedom is the number of possible states, and the more states a system has, the more bits are required to describe it.

    You must also know that that property of logarithms is true, and that information in bits is measured with binary logarithms.
     
  5. Apr 27, 2014 #4
    We actually don't shoot down theories that have no evidence of being wrong; any theory is considered valid until proven wrong. That's the scientific method. However, the forum only allows discussion of theories with supporting, peer-reviewed articles. This is to prevent people from coming here and throwing any old theory at us, even if it's not scientifically accurate. Discussion of a peer-reviewed paper is always allowed.
     
  6. Apr 27, 2014 #5
    here,

    Landauer's principle: dQ ≥ kT ln(2)

    S (for an irreversible process) is proportional to dQ/T

    ln(x) = ln(2) · log2(x)

    S(x) is also proportional to the natural log of the number of possible states (degrees of freedom), x

    U(x) = uncertainty, the number of bits still needed to describe the system = log2(number of possible states)

    Therefore,

    S(x) is proportional to k ln(x), which is proportional to k ln(2) U(x)

    U(x) is proportional to S(x) / (k ln 2)
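    A quick numeric check of that chain (plain Python sketch; x standing for the number of accessible states is my notation):

```python
import math

k_B = 1.380649e-23                     # Boltzmann constant, J/K

for x in (2, 16, 1_000_000):           # number of accessible states
    S = k_B * math.log(x)              # entropy proportional to k ln(x)
    U = math.log2(x)                   # bits needed to single out one state
    # the claimed proportionality: U = S / (k ln 2)
    assert math.isclose(U, S / (k_B * math.log(2)))
```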

    Prove my math wrong.

    And read the article (published in Nature) from 2012 which confirmed Landauer's principle.

    Sorry to be brash; I'm just frustrated that no one will hear me out on this, and the only response I've heard is basically "that's controversial."
     
  7. Apr 27, 2014 #6
    Last edited by a moderator: May 6, 2017
  8. Apr 27, 2014 #7
    Sorry, I have no idea how to typeset equations. I am quite young and new to this, but I don't want that to instantly discredit me in any way.
     
  9. Apr 27, 2014 #8
    Post the peer-reviewed paper and I'll be happy to read it. I read numerous articles, even ones I don't necessarily agree with. That reading has taught me that straight mathematics is capable of any interpretation; it doesn't necessarily mean a theory is correct. If you wish to pursue your theory, all the power to you, but you need to look at the full metrics involved. Just a suggestion. For example, you're not using the von Neumann entropy of a quantum state ρ.
     
  10. Apr 27, 2014 #9
    I know I am not... that's why I carefully said "proportional to," to save myself from more complicated but unnecessary mathematics for a sufficient argument. I posted the paper above.
     
  11. Apr 27, 2014 #10
    Which is still S = -tr(ρ ln ρ), where ρ is the density matrix, and the resulting entropies are positive, so that tells you it is still proportional to the log of the probability of states.
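    For a diagonal density matrix that trace formula reduces to -Σ p ln p over the eigenvalues. A minimal sketch for a maximally mixed qubit (my example, not one from the thread):

```python
import math

# eigenvalues of rho for a maximally mixed qubit, diag(1/2, 1/2);
# for diagonal rho, S = -tr(rho ln rho) = -sum(p * ln p)
probs = [0.5, 0.5]
S = -sum(p * math.log(p) for p in probs if p > 0)

print(S / math.log(2))  # entropy in bits: 1.0 for the maximally mixed qubit
```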
     
  12. Apr 27, 2014 #11

    Simon Bridge

    Science Advisor, Homework Helper

    More broadly:
    The increase in uncertainty due to taking super-accurate measurements is a common effect encountered in the study of inverse problems. Even if the noise in the measured data is very small, inverting the forward model to reconstruct the underlying quantity can make that noise large. It's just what happens when you invert (nearly singular) matrices.
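    A minimal sketch of that amplification, assuming a nearly singular 2x2 forward matrix (the matrix and noise level are toy numbers of mine):

```python
# a nearly singular 2x2 forward matrix: inverting it amplifies
# small noise in the data by roughly the condition number
A = [[1.0, 1.000],
     [1.0, 1.001]]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]       # 0.001, nearly singular
A_inv = [[ A[1][1] / det, -A[0][1] / det],
         [-A[1][0] / det,  A[0][0] / det]]

b_noise = [0.0, 1e-6]                              # tiny noise in the data
x_noise = [sum(A_inv[i][j] * b_noise[j] for j in range(2)) for i in range(2)]
print(x_noise)  # roughly [-1e-3, 1e-3]: the noise grew a thousandfold
```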

    It happens in everything - the effect is purely statistical: it need not cause a physical variation that would be seen in Nature, but it would be no surprise to see the effect show up in thermodynamics (a model based in statistics) in some situations.

    It is certainly not a surprise to link "information" and "thermodynamics".
     
  13. Apr 27, 2014 #12
    I'm not, lol, trust me on that. It helps to work from a posted article; I still struggle with the LaTeX.

    here is how to post latex
    https://www.physicsforums.com/showpost.php?p=3977517&postcount=3

    By the way, I'm not about to buy a Nature subscription to see the full articles. Let's look for an equivalent on arXiv; it's free.
     
  14. Apr 27, 2014 #13
    Simon thank you. Can you expand on this anymore? I find this supremely fascinating.
     
  15. Apr 27, 2014 #14

    jtbell

    Staff: Mentor

    Welcome to PF!

    We do not discuss or critique personal theories here. For more details, click the "Site Info" link at the top of any page here, and choose "Rules & Guidelines."
     
  16. Apr 27, 2014 #15

    Evo

    Staff: Mentor

    Just a reminder to everyone, when a post doesn't meet the rules, don't respond, report it, OK? You all know better than to do this. This thread should have ended at the first post.
     