Entropy: Thermodynamics & Information Theory Explained

SUMMARY

The discussion clarifies the relationship between entropy in thermodynamics and entropy in information theory, emphasizing that the two concepts are fundamentally linked. The equations I = log2(N) for information and S = k log(N) for thermodynamic entropy illustrate that the amount of information required to describe a system is proportional to its entropy. The conversation also highlights common misconceptions surrounding the term "entropy" in everyday language, which can lead to confusion. The principle that every state in thermodynamics is equally likely is a fundamental assumption of statistical physics; no experimental evidence contradicts it.
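In symbols, the link is just a change of logarithm base (a minimal sketch, writing S for thermodynamic entropy and taking the natural logarithm in Boltzmann's formula):

\[
I = \log_2 N, \qquad S = k \ln N = (k \ln 2)\,\log_2 N = (k \ln 2)\, I ,
\]

so for a system of N equally likely states the thermodynamic entropy is the information content rescaled by the constant factor k ln 2 (about 9.57 × 10⁻²⁴ J/K per bit).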

PREREQUISITES
  • Understanding of basic thermodynamics principles
  • Familiarity with information theory concepts
  • Knowledge of statistical physics fundamentals
  • Ability to interpret mathematical equations related to entropy
NEXT STEPS
  • Study the derivation of the Boltzmann entropy formula
  • Explore the implications of statistical mechanics on thermodynamic systems
  • Learn about the applications of entropy in information theory
  • Investigate common misconceptions about entropy in popular science literature
USEFUL FOR

Students and professionals in physics, particularly those interested in thermodynamics and information theory, as well as educators seeking to clarify the concept of entropy for their audiences.

Jam Smith
I was reading some articles related to entropy, and I learned that the term “entropy” shows up in both thermodynamics and information theory.

Now my question is:
What is the relationship between entropy in the information-theory sense and entropy in the thermodynamic sense?

I need some clear and short guidance.
 
DrClaude said:

I can’t think of a good way to convey intuitively that thermodynamic entropy and information-theoretic entropy are essentially the same thing; I find that both are hard to describe.

The amount of information it takes to describe something is proportional to its entropy. Once you have the equations (“I = log2(N)” for information and “S = k log(N)” for thermodynamic entropy), this is pretty obvious. However, the way the word “entropy” is used in common speech is a little misleading.

Right?
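As a quick numerical check of that proportionality (a minimal sketch, assuming the Boltzmann form with the natural logarithm, S = k ln N, and the SI value of Boltzmann's constant):

import math

k = 1.380649e-23  # Boltzmann constant, J/K

for N in (2, 1024, 10**6):
    I = math.log2(N)     # bits needed to single out one of N equally likely states
    S = k * math.log(N)  # Boltzmann entropy of N equally likely microstates, J/K
    print(f"N = {N:>7}: I = {I:6.2f} bits, S = {S:.3e} J/K, S/I = {S/I:.3e} J/K per bit")

print(f"k*ln(2) = {k * math.log(2):.3e} J/K per bit")

The ratio S/I comes out the same for every N: it is just the constant factor k ln 2 that relates the two formulas.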
 
Jam Smith said:
However, the way the word “entropy” is used in common speech is a little misleading.
Right?

The way the word [insert any physics terminology here] is used in common speech is a little misleading. This should not be surprising and it is very common.

However, you did ask this in a physics forum, and we seldom answer such questions using such misleading/mistaken concepts. What you will get is the standard physics answer. Isn't this what you were looking for?

Zz.
 
ZapperZ said:
The way the word [insert any physics terminology here] is used in common speech is a little misleading. This should not be surprising and it is very common.

However, you did ask this in a physics forum, and we seldom answer such questions using such misleading/mistaken concepts. What you will get is the standard physics answer. Isn't this what you were looking for?

Zz.

You misunderstood me. I was talking about the statement that in thermodynamics every state is as likely to come up as any other, whereas in information theory the states need not be equally likely.

Why is every state in thermodynamics as likely to come up as any other? Can this be proved?
I am still not clear on this.
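Here is a minimal sketch of what the equal-likelihood assumption means, assuming the standard fundamental postulate of statistical mechanics: every microstate of an isolated system consistent with the macroscopic constraints is taken to be equally probable. It is a postulate rather than a theorem, but it still yields definite predictions, because macrostates are weighted by the number of microstates they contain, and that count is enormously peaked. For example, with 100 two-state subsystems:

from math import comb

N = 100            # number of two-state subsystems (e.g. coins or spin-1/2 particles)
total = 2 ** N     # total number of microstates, each assumed equally likely

for n_up in (0, 10, 25, 50):
    W = comb(N, n_up)   # microstates with exactly n_up subsystems "up"
    print(f"n_up = {n_up:3d}: W = {W:.3e}, probability = {W / total:.3e}")

All 2^100 microstates carry the same probability, yet the macrostate with 50 subsystems "up" is roughly 10^29 times more likely than the one with none "up", which is why the equal-likelihood assumption still reproduces definite macroscopic behaviour.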
 
