Prove that information is either increasing or remaining constant with time.


Discussion Overview

The discussion revolves around the question of whether information is either increasing or remaining constant over time. Participants explore this concept in various contexts, including physics, entropy, and information theory, while considering definitions and implications of information in relation to physical laws.

Discussion Character

  • Debate/contested
  • Conceptual clarification
  • Technical explanation

Main Points Raised

  • Some participants suggest that information might be defined in terms of entropy, with the idea that as entropy increases, information decreases or remains constant for isolated systems.
  • Others propose that all physical systems inherently register and process information, and that the universe's computational capacity can be quantified.
  • A participant argues that while new information appears to be created, it may simply be a transformation of existing information, leading to the conclusion that total information remains constant.
  • There is a discussion about the definitions of information, with some participants expressing frustration over varying definitions across different scientific fields.
  • Some contributions reference the black hole information paradox, indicating that there is no consensus on whether information can be lost in black holes, with differing views on whether it can decrease locally.
  • A participant mentions that the positive definite character of certain functions does not definitively prove whether information increases, decreases, or remains constant.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the nature of information or its behavior over time. Multiple competing views and definitions are presented, and the discussion remains unresolved.

Contextual Notes

Participants highlight the importance of defining information, noting that it is not universally agreed upon as a physical property. The discussion also reflects varying interpretations of the relationship between information and entropy, as well as the implications of physical laws on information theory.

manishbit
Messages
4
Reaction score
0
Hello,
Can you prove that information is either increasing or remaining constant with time?
Thanks.
 
In what context do you mean this?

What do YOU think?
 
I'm not sure about PROVE...but
You might find Charles Seife's DECODING THE UNIVERSE an interesting read.
Here are a few insights from pages 71-87:

The central idea in Shannon's information theory is entropy...entropy is a measure of information...in a sense thermodynamics is just a special case of information theory...Shannon's measure of randomness is precisely the same function as Boltzmann entropy...Nature, it seems, attempts to dissipate stored information just as it attempts to increase entropy; the two ideas are exactly the same...

For a view of bits versus entropy, see Landauer's principle
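Seife's claim that Shannon's measure of randomness has the same functional form as Boltzmann entropy can be made concrete with a short sketch of Shannon's formula (the example distributions below are invented for illustration):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of uncertainty; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

Up to the choice of logarithm base (and Boltzmann's constant), this is the same function as the statistical-mechanical entropy -Σ p ln p.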
 
1) Information as in physics (the kind from which, in principle, you could reconstruct a just-destroyed building).
2) Since entropy is a measure of information, can we say "as entropy increases, information too increases"? If so, then information must be governed by the laws of the universe, so might information in our universe behave differently from information in another universe?
 
manishbit said:
Hello,
Can you prove that information is either increasing or remaining constant with time?
Thanks.

First you would define information, which is not a physical property.

Some guys define information as minus entropy. Then, using the second law, it is proved that information decreases or remains constant for isolated systems.
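A minimal sketch of that second-law argument, assuming the "information = minus entropy" definition and using an invented doubly-stochastic mixing step (such maps never decrease Shannon entropy, so the negentropy never increases):

```python
import math

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability cells."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mix(p, lam=0.3):
    """One doubly-stochastic mixing step: each cell keeps (1-lam) of its
    probability and shares lam equally with its two neighbours (periodic).
    Doubly-stochastic maps never decrease Shannon entropy."""
    n = len(p)
    return [(1 - lam) * p[i] + (lam / 2) * (p[(i - 1) % n] + p[(i + 1) % n])
            for i in range(n)]

p = [1.0, 0.0, 0.0, 0.0]          # all probability in one cell: H = 0
for t in range(5):
    H = entropy(p)
    print(f"step {t}: H = {H:.3f} bits, 'information' = {-H:.3f}")
    p = mix(p)
```

As the distribution spreads, H climbs toward log2(4) = 2 bits and the negentropy "information" falls toward -2, never rising.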
 
Maybe you've already read this but I thought I'd post it:

Merely by existing, all physical systems register information. And by evolving dynamically in time, they transform and process that information. The laws of physics determine the amount of information that a physical system can register (number of bits) and the number of elementary logic operations that a system can perform (number of ops). The universe is a physical system. This paper quantifies the amount of information that the universe can register and the number of elementary operations that it can have performed over its history. The universe can have performed no more than 10^120 ops on 10^90 bits.

Computational capacity of the universe
http://arxiv.org/PS_cache/quant-ph/pdf/0110/0110141v1.pdf
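Lloyd's 10^120 figure can be roughly reproduced from the Margolus-Levitin bound (ops ≤ 2Et/πħ). The cosmological inputs below are my own round-number assumptions, not values taken from the paper, so treat this as an order-of-magnitude sketch only:

```python
import math

# Margolus-Levitin bound: a system of energy E can perform at most
# 2*E*t / (pi*hbar) elementary operations in time t.
hbar = 1.05e-34        # J*s
c = 3.0e8              # m/s
t = 4.3e17             # age of the universe, s (~13.7 Gyr)
rho = 1.0e-26          # mean mass density, kg/m^3 (round-number assumption)

R = c * t                                   # crude horizon scale, m
E = rho * (4 / 3) * math.pi * R**3 * c**2   # rest energy within that volume, J
ops = 2 * E * t / (math.pi * hbar)

# Lands within an order of magnitude or so of Lloyd's 10^120.
print(f"ops ~ 10^{math.log10(ops):.0f}")
```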
 
juanrga said:
First you would define information, which is not a physical property.

Some guys define information as minus entropy. Then, using the second law, it is proved that information decreases or remains constant for isolated systems.
Information is not negative entropy; it's more like H(x) = E(I(x)), where H, E, and I are the entropy, expected value, and information, respectively, of a random variable x.

Btw, it can never decrease. It's a known physical fact. (Even Stephen Hawking accepted the mistake when he once said it is destroyed in black holes.)
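The relation H(x) = E(I(x)) is easy to check numerically: entropy is the expected value of the self-information of each outcome. A minimal sketch with a made-up three-outcome distribution:

```python
import math

# H(X) = E[I(X)] with self-information I(x) = -log2 p(x).
p = {'a': 0.5, 'b': 0.25, 'c': 0.25}

def I(x):
    return -math.log2(p[x])            # self-information of outcome x, in bits

H = sum(p[x] * I(x) for x in p)        # expected value of I(X)
print(H)                               # 1.5 bits
```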
 
manishbit said:
juanrga said:
First you would define information, which is not a physical property.

Some guys define information as minus entropy. Then, using the second law, it is proved that information decreases or remains constant for isolated systems.

Information is not negative entropy; it's more like H(x) = E(I(x)), where H, E, and I are the entropy, expected value, and information, respectively, of a random variable x.

Btw, it can never decrease. It's a known physical fact. (Even Stephen Hawking accepted the mistake when he once said it is destroyed in black holes.)

As said, it depends on how you define information, because information is not a physical property. A very common definition is «information = minus entropy»:
http://webdocs.cs.uAlberta.ca/~aixplore/learning/DecisionTrees/InterArticle/4-DecisionTree.html

but, of course, there are many more definitions.

For the above common definition, information decreases when entropy increases. This is a well-known fact.

Hawking's notorious mistake about information in black holes did not concern the definition of information that he used (which decreases when entropy increases), but his attempt to obtain a non-unitary evolution from the unitary laws of theories such as GR and QM. Indeed, in rigor, his definition (like yours) would remain constant under the unitary laws of QM and GR.

You continue without giving us a definition of information, but from your expressions I assume that you mean something like I(x) = -log p(x). This definition of information increases or remains constant because H ≥ 0.
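The point about unitarity can be illustrated directly: the von Neumann entropy S(ρ) = -Tr(ρ log2 ρ) of a density matrix is unchanged under ρ → UρU†. A sketch with a made-up 2x2 mixed state and an arbitrary rotation standing in for the unitary:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerically-zero eigenvalues
    return float(-np.sum(evals * np.log2(evals)))

rho = np.array([[0.7, 0.0],
                [0.0, 0.3]])              # an invented mixed state

theta = 0.4                               # any rotation is unitary
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

rho2 = U @ rho @ U.conj().T               # unitary evolution of the state
print(von_neumann_entropy(rho), von_neumann_entropy(rho2))  # equal
```

Unitary conjugation preserves the eigenvalues of ρ, so any entropy (or entropy-based "information") defined through them stays constant.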
 
manishbit said:
Hello,
Can you prove that information is either increasing or remaining constant with time?
Thanks.

What I understand by 'amount of information in the universe' is everything that has happened since the conception of the universe and everything that currently exists, which can be summed up as 'what currently exists'.
Information cannot 'die', but it can transform into other information. One can argue that new information is continuously created with time, proving information is 'increasing'.
But if you look deeper, the systems (or objects) that create new information had the POTENTIAL to create that new information BURIED in them. In other words, it is actually a transformation of one piece of information into another.
My conclusion? The total information in the universe remains the SAME.


Thanks [Bohm2] for the link. It's an interesting read. I'd be happy to argue with anyone who thinks the Universe is NOT a computer.
Computational capacity of the universe
http://arxiv.org/PS_cache/quant-ph/p.../0110141v1.pdf
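The "transformation, not creation" argument amounts to saying the dynamics is a bijection on states: every later state has exactly one predecessor, so no states merge or vanish, and the number of distinguishable states (hence log2 of the count, in bits) never changes. A toy sketch, with an invented 3-state rule:

```python
import math

# An invented bijective dynamics on three states: A -> B -> C -> A.
step = {'A': 'B', 'B': 'C', 'C': 'A'}

states = {'A', 'B', 'C'}
later = {step[s] for s in states}
assert later == states                    # the map just permutes the states

# log2(3) ~ 1.585 bits of state-counting information, before and after.
print(math.log2(len(states)), "bits either way")
```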
 
Now I am wondering why people are asking me to define information.
I assume they might have seen different definitions in different fields. But are they really different? How can "information" be different from "information" just because it's part of a different scientific subject?
Science should be able to define information in one precise way, so that even an 8-year-old can understand it in a crystal-clear way.

So, can somebody define information for me from a scientific point of view?

Thanks.

P.S.: I hate it when people say things like "'forces' in chemistry are different from 'forces' in physics, you know".
 
Stephen Hawking and Kip Thorne had a bet going with John Preskill regarding whether or not information could be lost from the universe forever as it enters a black hole. Eventually, Stephen Hawking threw in the towel on "the black hole information paradox." Kip Thorne remains unconvinced that it has been proven that a black hole does not permanently remove information from the universe.

So, to address your question, I don't think that anyone has mathematically proven (beyond a shadow of a doubt) that information is not decreasing (even locally, i.e. in the vicinity of a black hole)... or as you phrased it... that "information is either increasing or remaining the same."
 
juanrga said:
You continue without giving us a definition of information, but from your expressions I assume that you mean something as I(x)=-log p(x). This definition of information increases or remains constant because H≥0.

My mistake. H≥0 does not prove anything about information I(x) as defined above.

I have been thinking about this and it can be proved that the positive definite character of the H function is compatible with I(x) increasing, decreasing, or remaining constant.
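That compatibility is easy to exhibit numerically: with invented two-outcome distributions, I(x) = -log2 p(x) for a fixed outcome can rise or fall from one distribution to the next, while H stays non-negative throughout:

```python
import math

def H(p):
    """Shannon entropy in bits; always >= 0."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def I(p, x):
    """Self-information of outcome x under distribution p, in bits."""
    return -math.log2(p[x])

p1 = {'x': 0.5,  'y': 0.5}    # I(x) = 1 bit
p2 = {'x': 0.25, 'y': 0.75}   # p(x) fell  -> I(x) rose to 2 bits
p3 = {'x': 0.8,  'y': 0.2}    # p(x) rose  -> I(x) fell below 1 bit

for p in (p1, p2, p3):
    assert H(p) >= 0           # H >= 0 holds in every case regardless
    print(f"p(x) = {p['x']}, I(x) = {I(p, 'x'):.3f} bits")
```

So the sign constraint on H says nothing about the direction of change of I(x) itself, which is the correction being made above.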
 
manishbit said:
Now I am wondering why people are asking me to define information.
I assume they might have seen different definitions in different fields. But are they really different? How can "information" be different from "information" just because it's part of a different scientific subject?
Science should be able to define information in one precise way, so that even an 8-year-old can understand it in a crystal-clear way.

So, can somebody define information for me from a scientific point of view?

Thanks.

P.S.: I hate it when people say things like "'forces' in chemistry are different from 'forces' in physics, you know".

As has been said to you, information is not a physical property.

In the International System of Units you can find units for physical properties such as energy, mass, length, time, electric charge, heat flux density, force, temperature, chemical composition, and so on. But there is no SI unit for «information», because there is no such physical property.

Information is a human concept. You can believe that something is information and another can believe the contrary.
 
