Here is my theory:

- The measurement of an observable by an observer is a thermodynamically irreversible process, and all thermodynamically irreversible processes increase total entropy.
- Landauer's principle (experimentally verified in 2012) says that the minimum energy required to erase (or otherwise irreversibly alter) 1 bit of information is E ≥ k T ln 2, where k is Boltzmann's constant and T is temperature. (A quick numeric check of this bound appears at the end of the post.)
- The natural log of a number equals ln 2 times its binary logarithm: ln x = ln 2 · log₂ x. Since binary logarithms (with the number of possible states as the argument) are how we measure bits of information, this suggests a relationship between information and entropy.
- By "information" I mean the number of bits still needed to completely describe a system, so from here on I'll just call it uncertainty.

I'll suggest the following relationship:

U ≈ S / (k ln 2)

where U is the uncertainty (the number of bits the observer still needs to fully describe the observable system) and S is entropy. (See the second sketch below for this conversion.) This means that every increase in entropy increases uncertainty. Therefore, in our attempts to gain certainty about the world by taking measurements, we are actually increasing our uncertainty about it. Furthermore, we are increasing entropy and possibly bringing the universe closer to its heat death (big freeze, really) with each measurement. Entropy increases accompany the expansion and cooling of the universe. I remember reading something about how measuring the dark energy density may actually have increased the accelerating expansion of the universe. In the context of many-worlds, perhaps each measurement spawns new universes, and this is responsible for the increase in entropy/uncertainty?

I'm looking for holes and constructive criticism. I'd really appreciate any comments and questions.
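To make the Landauer bound concrete, here is a minimal numeric sketch in Python. The temperature of 300 K (roughly room temperature) is an assumed value for illustration; the Boltzmann constant is the exact SI value.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact since the 2019 SI redefinition)
T = 300.0           # assumed bath temperature in kelvin (~room temperature)

# Landauer bound: minimum energy dissipated to erase one bit of information
E_min = k_B * T * math.log(2)
print(f"Landauer bound at T = {T} K: {E_min:.3e} J per bit")
# -> roughly 2.87e-21 J per bit

# Sanity check on the log identity used above: ln(x) = ln(2) * log2(x)
x = 1024.0
assert math.isclose(math.log(x), math.log(2) * math.log2(x))
```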
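And here is a sketch of the proposed conversion U ≈ S / (k ln 2), turning a thermodynamic entropy in J/K into a bit count. The example entropy value of 1 J/K is arbitrary, chosen only to show the scale involved.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def entropy_to_bits(S: float) -> float:
    """Convert thermodynamic entropy S (in J/K) to bits via U = S / (k_B ln 2)."""
    return S / (k_B * math.log(2))

S_example = 1.0  # an arbitrary entropy increase of 1 J/K, for scale
print(f"{S_example} J/K of entropy ~ {entropy_to_bits(S_example):.3e} bits")
# -> about 1.05e+23 bits still needed to describe the system
```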