Does Entropy Mean Less Information or More Complexity?

SUMMARY

The discussion centers on the relationship between entropy, information, and complexity, particularly in the context of thermodynamics. Participants argue that as entropy increases, systems transition from ordered states, which possess structured information, to disordered states with less discernible information. A glass, representing an ordered state, contains specific information about its structure, while a broken glass, despite its chaotic appearance, may hold more complex information due to its fragmented nature. The conversation highlights the distinction between the information required to describe an object and the inherent information contained within the object itself.

PREREQUISITES
  • Basic understanding of thermodynamics concepts, particularly entropy.
  • Familiarity with the distinction between data and information.
  • Knowledge of probability theory as it relates to states of order and disorder.
  • Conceptual grasp of information theory and its application to physical systems.
NEXT STEPS
  • Research the Second Law of Thermodynamics and its implications for entropy.
  • Explore information theory, focusing on the definitions of data versus information.
  • Study the relationship between entropy and probability in physical systems.
  • Investigate examples of complex systems to understand how information is structured within them.
USEFUL FOR

Students and professionals in physics, thermodynamics, and information theory, as well as anyone interested in the philosophical implications of entropy and information in physical systems.

dhruv.tara

Homework Statement


I haven't studied thermodynamics; we make only very brief use of thermo in my power plants course.

I have read that entropy always increases: the universe tends to go from an orderly state to a disorderly state. My teacher related this to probability, saying that nature tends to go from a less probable state to a more probable state.
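The link between "more probable" and "higher entropy" can be made concrete with a standard toy model (not from the thread itself): tossing N coins. A macrostate is the number of heads; its multiplicity W counts the microstates that realize it, and the Boltzmann-style entropy is S = ln W (setting k = 1). The mixed macrostate has vastly more microstates than the all-tails one, which is exactly why nature "prefers" it.

```python
# Toy model: macrostate = number of heads among N coins.
# Multiplicity W(k) = C(N, k) counts the microstates realizing it;
# entropy S = ln W (Boltzmann's constant set to 1 for illustration).
import math

N = 100  # number of coins (illustrative choice)
multiplicity = {k: math.comb(N, k) for k in range(N + 1)}

ordered = multiplicity[0]       # all tails: exactly 1 microstate
mixed = multiplicity[N // 2]    # half heads: the most microstates

print(ordered, mixed)                       # 1 vs roughly 1e29
print(math.log(mixed))                      # S = ln W, about 66.8
```

A random toss is overwhelmingly likely to land near the half-heads macrostate simply because that macrostate owns almost all the microstates, which is the probabilistic reading of the Second Law.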

All these statements were fine.

Then they related this to the information contained within the system. I was told that nature tends to go from states with more information to states with less information: things tend to lose information.

E.g. a glass has information associated with it: its structure and other properties.
A broken glass is completely random; not much information is associated with it.

I cannot convince myself of the above. Can't we say the opposite?
A glass has much less information associated with it, or most of its properties are redundant: given a few parameters, it is possible to give a complete description of the object.
A broken glass has too much information: we can't classify it or find parameters (or we need far too many parameters) to describe it, rather than just calling it random.

My guess is that it can have something to do with the difference between data and information, but that's only a guess...

Thanks... Any help is appreciated.
 
I think you might be mixing up the "information" needed to describe an object with the information the object itself possesses. Take some more time to think about the glass and the broken shards of glass and consider which has more structured information.
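One way to get a feel for "information needed to describe an object" is description length in the Kolmogorov-complexity sense. As a rough sketch (my own illustration, with compressed size as a crude stand-in for description length): an ordered pattern admits a short description, while a scrambled version of the very same data does not.

```python
# Rough sketch: compressed size as a crude proxy for description length.
# The intact data is one repeating motif; the "broken" data is the same
# bytes shuffled, so any short description of the pattern is destroyed.
import random
import zlib

intact = b"glass" * 1000            # ordered: a single repeating motif
pieces = bytearray(intact)
random.Random(0).shuffle(pieces)    # "broken": same parts, no pattern
broken = bytes(pieces)

print(len(zlib.compress(intact)))   # small: the pattern compresses well
print(len(zlib.compress(broken)))   # much larger: little structure left
```

In this reading, the shuffled data needs a much longer description, which matches the original poster's intuition, while the reply's point is that this descriptive burden is not the same thing as the structured information the intact glass carries.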
 