- #1


The term “Entropy” shows up both in thermodynamics and information theory.

Now my question is:

What’s the relationship between entropy in the information-theory sense and the thermodynamics sense?

I need some short, clear guidance.


- Thread starter Jam Smith


- #2

DrClaude

Mentor


Start with Wikipedia: https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory

- #3


Start with Wikipedia: https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory

I can’t think of a good way to show intuitively that thermodynamic entropy and information-theoretic entropy are essentially the same thing. I find that both are hard to describe.

The amount of information it takes to describe something is proportional to its entropy. Once you have the equations (“I = log2(N)” and “S = k ln(N)”, for N equally likely states) this is pretty obvious. However, the way the word “entropy” is used in common speech is a little misleading.

Right?
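The proportionality is easy to check numerically. A minimal sketch (my own illustration, not from the thread), assuming N equally likely states, where the information content is I = log2(N) bits and the Boltzmann entropy is S = k ln(N):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

for N in (2, 1024, 10**6):
    I_bits = math.log2(N)   # bits needed to single out one of N states
    S = k_B * math.log(N)   # Boltzmann entropy for N equiprobable states
    ratio = S / I_bits      # the same for every N: k_B * ln(2)
    print(f"N={N}: I={I_bits} bits, S={S:.3e} J/K, S/I={ratio:.6e}")
```

The ratio S/I comes out to k_B ln 2 for every N, so the two entropies differ only by a constant factor, i.e. by a choice of units.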

- #4


However, the way the word “entropy” is used in common speech is a little misleading.

Right?

The way the word [insert any physics terminology here] is used in common speech is a little misleading. This should not be surprising and it is very common.

However, you did ask this in a physics forum, and we seldom answer such questions using such misleading/mistaken concepts. What you will get is the standard physics answer. Isn't this what you were looking for?

Zz.

- #5


The way the word [insert any physics terminology here] is used in common speech is a little misleading. This should not be surprising and it is very common.

However, you did ask this in a physics forum, and we seldom answer such questions using such misleading/mistaken concepts. What you will get is the standard physics answer. Isn't this what you were looking for?

Zz.

You misunderstood me. I was talking about this:

In thermodynamics every state is as likely to come up as any other. In information theory, …

Why, in thermodynamics, is every state as likely to come up as any other? Can this be proved?

I am still not clear.

- #6

DrClaude

Mentor


Why in thermodynamics every state is as likely to come up as any other? Can this be proved?

It's a fundamental assumption of statistical physics. It cannot be proven, but it appears to be valid (no experimental evidence has shown otherwise).

We had a discussion about this in a recent thread: https://www.physicsforums.com/threads/boltzmann-with-degenerate-levels.902321/
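As a side note (my own numeric sketch, not part of the reply above): the equal-probability postulate is also the maximum-entropy choice, since among all distributions over N states the uniform one maximizes the Shannon entropy:

```python
import math

def shannon_entropy(p):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

N = 4
uniform = [1 / N] * N            # equal a priori probabilities
skewed = [0.7, 0.1, 0.1, 0.1]    # any non-uniform alternative

H_uniform = shannon_entropy(uniform)  # log2(4) = 2 bits, the maximum
H_skewed = shannon_entropy(skewed)    # strictly less than 2 bits
```

Any deviation from uniformity lowers the entropy, which is one way to see why “all microstates equally likely” and “maximum entropy” pick out the same distribution.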

- #7


It's a fundamental assumption of statistical physics. It cannot be proven, but it appears to be valid (no experimental evidence has shown otherwise).

We had a discussion about this in a recent thread: https://www.physicsforums.com/threads/boltzmann-with-degenerate-levels.902321/

Thanks for providing such fundamental information. I have gone through both links you provided, and they helped me clear up my doubts.
