Entropy: Thermodynamics & Information Theory Explained


Discussion Overview

The discussion revolves around the relationship between entropy in thermodynamics and information theory. Participants explore the conceptual similarities and differences between these two contexts, seeking clarity on how entropy is defined and understood in each field.

Discussion Character

  • Exploratory
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants note that entropy appears in both thermodynamics and information theory, prompting questions about their relationship.
  • One participant suggests that the amount of information needed to describe a system is proportional to its entropy, referencing equations from both fields.
  • Another participant expresses concern that the common usage of the term "entropy" can be misleading, which is a common issue with scientific terminology.
  • A participant points out that in thermodynamics, every state is assumed to be equally likely, which is a fundamental assumption of statistical physics, and questions whether this can be proven.
  • It is mentioned that while the assumption of equal likelihood cannot be formally proven, no experimental evidence contradicts it.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the relationship between entropy in the two contexts, and there are competing views regarding the implications of the assumptions made in thermodynamics.

Contextual Notes

The discussion includes references to external links for further reading, indicating that participants are drawing on previous discussions and resources to inform their understanding.

Jam Smith
I was reading some articles on entropy and came to know that the term “entropy” shows up both in thermodynamics and information theory.

Now my question is :
What’s the relationship between entropy in the information-theory sense and the thermodynamics sense?

I would appreciate some clear, short guidance.
 
DrClaude said:

I can’t think of a good way to convey intuitively that entropy in thermodynamics and entropy in information theory are essentially the same thing. Admittedly, I find both hard to describe.

The amount of information it takes to describe something is proportional to its entropy. Once you have the equations (“I = log2(N)” and “S = k_B ln(N)”) this is pretty obvious. However, the way the word “entropy” is used in common speech is a little misleading.

Right?
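As a quick illustration of that proportionality (a minimal Python sketch with a hypothetical microstate count, not taken from the thread): for a system with N equally likely microstates, the bit count I = log2(N) and the Boltzmann entropy S = k_B ln(N) differ only by the constant factor k_B ln 2.

```python
import math

# Toy system with N equally likely microstates (hypothetical value)
N = 1024

# Information-theory side: bits needed to single out one microstate
I_bits = math.log2(N)  # I = log2(N)

# Thermodynamics side: Boltzmann entropy in J/K
k_B = 1.380649e-23  # Boltzmann constant (exact SI value)
S = k_B * math.log(N)  # S = k_B * ln(N)

# The two are proportional: S = (k_B * ln 2) * I
assert math.isclose(S, k_B * math.log(2) * I_bits)
print(f"I = {I_bits} bits, S = {S:.3e} J/K")
```

So doubling the number of microstates adds exactly one bit of information and exactly k_B ln 2 of thermodynamic entropy.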
 
Jam Smith said:
However, the way the word “entropy” is used in common speech is a little misleading.
Right?

The way the word [insert any physics terminology here] is used in common speech is a little misleading. This should not be surprising and it is very common.

However, you did ask this in a physics forum, and we seldom answer such questions using such misleading/mistaken concepts. What you will get is the standard physics answer. Isn't this what you were looking for?

Zz.
 
Likes: DrClaude
ZapperZ said:
The way the word [insert any physics terminology here] is used in common speech is a little misleading. This should not be surprising and it is very common.

However, you did ask this in a physics forum, and we seldom answer such questions using such misleading/mistaken concepts. What you will get is the standard physics answer. Isn't this what you were looking for?

Zz.

You misunderstood me. I was referring to this assumption: in thermodynamics, every state is taken to be as likely to come up as any other, whereas in information theory the probabilities of outcomes can differ.

Why, in thermodynamics, is every state as likely to come up as any other? Can this be proved?
I am still not clear on this.
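One way to see why the equal-likelihood (equal a priori probability) postulate is a natural starting point, sketched in Python with arbitrary example numbers (not from the thread): among all probability distributions over N states, the uniform one maximizes the Shannon entropy, so assuming equal likelihood is the least-biased choice when nothing else is known about the system.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

N = 4
uniform = [1 / N] * N             # equal a priori probabilities
skewed = [0.7, 0.1, 0.1, 0.1]     # an arbitrary non-uniform example

H_uniform = shannon_entropy(uniform)  # equals log2(N) = 2 bits
H_skewed = shannon_entropy(skewed)

# The uniform distribution has the larger entropy; the same comparison
# holds against any non-uniform distribution over the same N states.
assert math.isclose(H_uniform, math.log2(N))
assert H_uniform > H_skewed
```

This is a plausibility argument (the maximum-entropy viewpoint), not a proof of the postulate; as noted above, the postulate itself is taken as an assumption that experiment has so far not contradicted.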
 
Likes: Jam Smith
