Prove that information is either increasing or remaining constant with time.

In summary: First one must define information, which is not a physical property. Some define information as minus entropy; then, using the second law, it follows that such information decreases or remains constant for isolated systems. Others object that information is not negative entropy but satisfies H(x) = E(I(x)), where H, E, and I are the entropy, expected value, and information of a random variable x, and argue that it can never decrease (even Stephen Hawking conceded the mistake of his claim that information is destroyed in black holes).
  • #1
manishbit
Hello,
Can you prove that information is either increasing or remaining constant with time?
Thanks.
 
  • #2
In what context do you mean this?

What do YOU think?
 
  • #3
I'm not sure about PROVE... but
You might find Charles Seife's DECODING THE UNIVERSE an interesting read.
Here are a few insights from pages 71-87:

The central idea in Shannon's information theory is entropy... entropy is a measure of information... in a sense thermodynamics is just a special case of information theory... Shannon's measure of randomness is precisely the same function as Boltzmann entropy... Nature, it seems, attempts to dissipate stored information just as it attempts to increase entropy; the two ideas are exactly the same...

For a view of bits versus entropy, see Landauer's principle
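As a rough numerical illustration of Landauer's principle (the constants are standard; the room-temperature figure is my own example, not from Seife's book):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Landauer's principle: erasing one bit of information must dissipate
# at least k_B * T * ln(2) of energy as heat
E_bit = k_B * T * math.log(2)
print(f"Minimum energy to erase one bit at {T} K: {E_bit:.3e} J")
```

At room temperature this works out to roughly 3e-21 J per erased bit, which is why the thermodynamic cost of computation is negligible for today's hardware but still sets a hard floor in principle.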
 
  • #4
1) Information as in physics (using which you could reconstruct a just-destroyed building).
2) Since entropy is a measure of information, can we say "as entropy increases, information increases too"? If so, then information must be governed by the laws of the universe, so information in our universe may behave differently from information in another universe?
 
  • #5
manishbit said:
Hello,
Can you prove that information is either increasing or remaining constant with time?
Thanks.

First you would define information, which is not a physical property.

Some guys define information as minus entropy. Then, using the second law, it is proved that information decreases or remains constant for isolated systems.
 
  • #6
Maybe you've already read this but I thought I'd post it:

Merely by existing, all physical systems register information. And by evolving dynamically in time, they transform and process that information. The laws of physics determine the amount of information that a physical system can register (number of bits) and the number of elementary logic operations that a system can perform (number of ops). The universe is a physical system. This paper quantifies the amount of information that the universe can register and the number of elementary operations that it can have performed over its history. The universe can have performed no more than 10^120 ops on 10^90 bits.

Computational capacity of the universe
http://arxiv.org/PS_cache/quant-ph/pdf/0110/0110141v1.pdf
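As a sanity check on the paper's headline number, the op count follows from the Margolus–Levitin bound (at most 2E/πħ elementary operations per second for a system of energy E). A back-of-the-envelope sketch with rough cosmological inputs (the density, age, and horizon estimate below are my own illustrative values, not taken from Lloyd's paper):

```python
import math

hbar = 1.0546e-34   # reduced Planck constant, J*s
c = 2.998e8         # speed of light, m/s

# rough illustrative cosmological inputs (assumptions for this sketch)
t_universe = 4.3e17   # age of the universe, s
rho = 9.5e-27         # roughly the critical mass density, kg/m^3
R = c * t_universe    # crude horizon radius, m

# mass-energy contained within the horizon, E = (rho * V) * c^2
E = rho * (4 / 3) * math.pi * R**3 * c**2

# Margolus-Levitin bound: total ops <= 2 * E * t / (pi * hbar)
ops = 2 * E * t_universe / (math.pi * hbar)
print(f"~{ops:.1e} ops")
```

With these crude inputs the bound lands within an order of magnitude or two of the paper's 10^120 figure, which is all this kind of estimate can be expected to do.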
 
  • #7
juanrga said:
First you would define information, which is not a physical property.

Some guys define information as minus entropy. Then, using the second law, it is proved that information decreases or remains constant for isolated systems.
Information is not negative entropy; it's like H(x) = E(I(x)),
where H, E, I are the entropy, expected value, and information, respectively, of a random variable x.

Btw, it can never decrease. It's a known physical fact. (Even Stephen Hawking accepted the mistake when he once said it is destroyed in black holes.)
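A quick numerical check of H(x) = E(I(x)) on a small example distribution (the distribution itself is made up for illustration):

```python
import math

# an example distribution over three outcomes
p = {'a': 0.5, 'b': 0.25, 'c': 0.25}

# I(x) = -log2 p(x): the "surprisal" (information) of each outcome, in bits
I = {x: -math.log2(px) for x, px in p.items()}

# H = E[I(x)]: entropy is the expected surprisal over the distribution
H = sum(px * I[x] for x, px in p.items())
print(H)  # 1.5 bits
```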
 
  • #8
manishbit said:
juanrga said:
First you would define information, which is not a physical property.

Some guys define information as minus entropy. Then, using the second law, it is proved that information decreases or remains constant for isolated systems.

Information is not negative entropy; it's like H(x) = E(I(x)),
where H, E, I are the entropy, expected value, and information, respectively, of a random variable x.

Btw, it can never decrease. It's a known physical fact. (Even Stephen Hawking accepted the mistake when he once said it is destroyed in black holes.)

As said, it depends how you define information, because information is not a physical property. A very common definition is «information = minus entropy»:
http://webdocs.cs.uAlberta.ca/~aixplore/learning/DecisionTrees/InterArticle/4-DecisionTree.html

but, of course, there are many more definitions.

For the above common definition, information decreases when entropy increases. This is a well-known fact.

Hawking's notorious mistake about information in black holes did not refer to the definition of information he used (which decreases when entropy increases), but to his attempt to obtain a non-unitary evolution from the unitary laws of theories such as GR and QM. Indeed, in rigor his definition (like yours) would remain constant under the unitary laws of QM and GR.

You continue without giving us a definition of information, but from your expressions I assume that you mean something like I(x) = -log p(x). This definition of information increases or remains constant because H ≥ 0.
 
  • #9
manishbit said:
Hello,
Can you prove that information is either increasing or remaining constant with time?
Thanks.

What I understand by 'amount of information in the universe' is everything that has happened since the conception of the universe and everything that currently exists, which can be summed up as 'what currently exists'.
Information cannot 'die', but it can transform into other information. One could argue that new information is continuously created with time, proving that information is 'increasing'.
But if you look deeper, the systems (or objects) that create new information had the POTENTIAL to create that new information BURIED in them. In other words, it is actually a transformation of one information into another.
My conclusion? The total information in the universe remains the SAME.


Thanks [Bohm2] for the link. It's an interesting read. I'd be happy to argue with anyone who thinks the Universe is NOT a computer.
Computational capacity of the universe
http://arxiv.org/PS_cache/quant-ph/p.../0110141v1.pdf
 
  • #10
Now I am wondering why people are asking me to define information.
I assume they might have seen different definitions in different fields. But are they really different? How can "information" be different from "information" just because it's part of a different scientific subject?
Science should be able to define information in one precise way, so that even an 8-year-old can understand it in a crystal-clear way.

So, can somebody define information for me from a scientific point of view?

Thanks.

P.S.: I hate when people say "'forces' in chemistry is different than 'forces' in physics, you know."
 
  • #11
Stephen Hawking and Kip Thorne had a bet going with John Preskill regarding whether or not information could be lost from the universe forever as it enters a black hole. Eventually, Stephen Hawking threw in the towel on "the black hole information paradox." Kip Thorne remains unconvinced that it has been proven that a black hole does not permanently remove information from the universe.

So, to address your question, I don't think that anyone has mathematically proven (beyond a shadow of a doubt) that information is not decreasing (even locally, i.e. in the vicinity of a black hole)... or as you phrased it... that "information is either increasing or remaining the same."
 
  • #12
juanrga said:
You continue without giving us a definition of information, but from your expressions I assume that you mean something like I(x) = -log p(x). This definition of information increases or remains constant because H ≥ 0.

My mistake. H ≥ 0 does not prove anything about the information I(x) as defined above.

I have been thinking about this, and it can be shown that the nonnegativity of the H function is compatible with I(x) increasing, decreasing, or remaining constant.
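A minimal sketch of that point: with I(x) = -log p(x), the information assigned to a given outcome can go up or down as the distribution changes, while the entropy H stays nonnegative throughout (the two distributions below are arbitrary examples):

```python
import math

def surprisal(p):
    """I(x) = -log2 p(x), in bits, for an outcome of probability p."""
    return -math.log2(p)

def entropy(dist):
    """H = sum p * I(p); always >= 0 for a probability distribution."""
    return sum(p * surprisal(p) for p in dist if p > 0)

p_before = [0.9, 0.1]  # outcome 0 likely, outcome 1 unlikely
p_after = [0.5, 0.5]   # the distribution has changed

# I for outcome 0 increases (~0.152 -> 1.0 bits),
# while I for outcome 1 decreases (~3.32 -> 1.0 bits)
i0_before, i0_after = surprisal(p_before[0]), surprisal(p_after[0])
i1_before, i1_after = surprisal(p_before[1]), surprisal(p_after[1])

# yet H >= 0 holds in both cases, so H >= 0 alone
# constrains neither direction of change in I(x)
assert entropy(p_before) >= 0 and entropy(p_after) >= 0
```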
 
  • #13
manishbit said:
Now I am wondering why people are asking me to define information.
I assume they might have seen different definitions in different fields. But are they really different? How can "information" be different from "information" just because it's part of a different scientific subject?
Science should be able to define information in one precise way, so that even an 8-year-old can understand it in a crystal-clear way.

So, can somebody define information for me from a scientific point of view?

Thanks.

P.S.: I hate when people say "'forces' in chemistry is different than 'forces' in physics, you know."

As has been said to you, information is not a physical property.

In the International System of Units you can find units for physical properties such as energy, mass, length, time, electric charge, heat flux density, force, temperature, chemical composition, and so on. But there is no SI unit for «information», because there is no such physical property.

Information is a human concept. You can believe that something is information and others can believe the contrary.
 

1. What is the definition of information in this context?

In this context, information refers to any data, knowledge, or facts that can be communicated or received by an individual or system.

2. How can we measure the increase or constancy of information over time?

One way to measure the increase or constancy of information is by looking at the amount of data that is being generated and collected over time. Another way is to analyze the complexity and diversity of information being produced.

3. What factors contribute to the increase of information over time?

The increase of information over time can be attributed to advancements in technology, the growth of human knowledge and understanding, and the accumulation of historical data.

4. How does the rate of information increase vary across different fields or industries?

The rate of information increase can vary greatly across different fields or industries, depending on the amount of research and development being conducted in each field, the availability of technology, and the demand for new information in that particular industry.

5. Can information ever decrease or remain constant over time?

In rare cases, information can decrease over time due to data loss or destruction, but overall, the trend is for information to either increase or remain constant as new information is constantly being generated and added to existing knowledge.
