Entropy: Thermodynamics & Information Theory Explained

AI Thread Summary
Entropy appears in both thermodynamics and information theory, and the relationship between the two is rooted in the idea that the amount of information needed to describe a system is proportional to its entropy. The equations I = log₂(N) from information theory and S = k ln(N) from thermodynamics illustrate this connection. Thermodynamics assumes that every microstate of a system is equally likely, a fundamental postulate of statistical physics that has not been contradicted experimentally. Everyday uses of the word "entropy" can cause confusion, which discussions in physics forums aim to clear up with standard physics explanations. Understanding these principles clarifies entropy's role in both fields.
Jam Smith
I was reading some articles related to entropy, and I came to learn that the term "entropy" shows up in both thermodynamics and information theory.

Now my question is :
What’s the relationship between entropy in the information-theory sense and the thermodynamics sense?

I need some clear and short guidance.
 
DrClaude said:

I can't think of a good way to show intuitively that thermodynamic entropy and information-theoretic entropy are essentially the same thing. Frankly, I find both hard to describe.

The amount of information it takes to describe something is proportional to its entropy. Once you have the equations I = log₂(N) (information) and S = k ln(N) (thermodynamic entropy, for N equally likely microstates), this is pretty obvious. However, the way the word "entropy" is used in common speech is a little misleading.

Right?
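The proportionality above can be checked numerically. This is a minimal sketch (not from the thread): for a uniform distribution over N states, the Shannon entropy in bits equals log₂(N), and the Boltzmann entropy k ln(N) differs from it only by the constant factor k ln 2.

```python
import math

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

K_B = 1.380649e-23  # Boltzmann constant, J/K

N = 8
uniform = [1.0 / N] * N  # every state equally likely

H = shannon_entropy_bits(uniform)  # equals log2(N) = 3 bits
S = K_B * math.log(N)              # Boltzmann entropy for N equally likely microstates

# The two entropies differ only by the constant factor k_B * ln(2):
assert math.isclose(H, math.log2(N))
assert math.isclose(S, K_B * math.log(2) * H)
```

So the two "entropies" measure the same quantity in different units: bits versus joules per kelvin.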
 
Jam Smith said:
However, the way the word "entropy" is used in common speech is a little misleading.
Right?

The way the word [insert any physics terminology here] is used in common speech is a little misleading. This should not be surprising and it is very common.

However, you did ask this in a physics forum, and we seldom answer such questions using such misleading/mistaken concepts. What you will get is the standard physics answer. Isn't this what you were looking for?

Zz.
 
ZapperZ said:
The way the word [insert any physics terminology here] is used in common speech is a little misleading. This should not be surprising and it is very common.

However, you did ask this in a physics forum, and we seldom answer such questions using such misleading/mistaken concepts. What you will get is the standard physics answer. Isn't this what you were looking for?

Zz.

You misunderstood me. I was talking about the assumption that, in thermodynamics, every state is as likely to come up as any other.

Why is every state in thermodynamics assumed to be as likely as any other? Can this be proved?
I am still not clear.
 