# How to visualize entropy in thermodynamics

• vijay123
In summary, entropy is a measure of disorder in a system. For heat Q transferred reversibly at temperature T, the entropy change is ΔS = Q/T, but understanding this practically can be challenging, especially when solving calculus-based entropy problems. The statistical approach uses S = (Boltzmann's constant) ln W, where W is the number of microstates compatible with the system's macrostate. Equating (Boltzmann's constant) ln W with Q/T gives an estimate of W, although this is an astronomically large number. The thermodynamic definition of entropy, dS = dQ/T, can also be used to understand the concept, together with the definition of temperature, 1/(Boltzmann's constant × T) = ∂ ln Ω/∂U.
vijay123
Entropy is the measure of disorder, but I just cannot visualize it from a practical standpoint. I mean, the disorder taking place with so many particles when heated, and yet all I know is that S = Q/T. Can anyone explain this concept better to me?

One more question: how does one prove using statistical mechanics that
S = (Boltzmann's constant) ln W, where W is the number of macrostates?

I was trying to do calculus-based entropy problems. Can I do them with reference to S = (Boltzmann's constant) ln W, by counting the number of macrostates available?

And is it possible to find the number of macrostates by equating (Boltzmann's constant) ln W = Q/T? Is this a very large number?
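To get a feel for how large W is, here is a quick numeric sketch in Python (the specific numbers, 1 J of heat absorbed reversibly at 300 K, are purely illustrative):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
q, T = 1.0, 300.0    # hypothetical: 1 J of heat, absorbed reversibly at 300 K

delta_S = q / T                 # entropy change, J/K
ln_W_ratio = delta_S / k_B      # increase in ln(W), from dS = k_B d(ln W)

print(f"dS = {delta_S:.3e} J/K")
print(f"ln(W_final / W_initial) = {ln_W_ratio:.3e}")
# ln W grows by roughly 2.4e20, so W itself is multiplied by e^(2.4e20):
# far too large to ever write out, which is why one works with ln W.
```

So yes: even a modest, everyday heat transfer corresponds to an unimaginably large multiplication of the number of accessible states.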

vijay123 said:
One more question: how does one prove using statistical mechanics that
S = (Boltzmann's constant) ln W, where W is the number of macrostates?
The entropy of a composite system is given by:
$$S^* = S_A + S_{A'} \quad (1)$$
The number of states accessible to A* is:
$$W^* = W_A W_{A'} \quad (2)$$

Here S*, S_A, and S_{A'} are the entropies of the composite system A*, and of A and A' respectively (A* is composed of A and A').
The entropy is a "state" function.
For (1) and (2) to hold simultaneously, what must the nature of this function be?
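The argument being hinted at can be sketched explicitly (a standard derivation, with $k$ denoting Boltzmann's constant):

$$\text{If } S = f(W), \text{ then (1) and (2) require } f(W_A W_{A'}) = f(W_A) + f(W_{A'}).$$
$$\text{The continuous solutions of this functional equation are logarithms, } f(W) = k \ln W, \text{ so}$$
$$S^* = k\ln(W_A W_{A'}) = k\ln W_A + k\ln W_{A'} = S_A + S_{A'}.$$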

W in S=klnW is the number of microstates.

I was introduced to entropy through S = k ln W as the definition, so for me it is not something to prove. What definition of S do you use?

vijay123 said:
Entropy is the measure of disorder, but I just cannot visualize it from a practical standpoint. I mean, the disorder taking place with so many particles when heated, and yet all I know is that S = Q/T. Can anyone explain this concept better to me?

One more question: how does one prove using statistical mechanics that
S = (Boltzmann's constant) ln W, where W is the number of macrostates?

I was trying to do calculus-based entropy problems. Can I do them with reference to S = (Boltzmann's constant) ln W, by counting the number of macrostates available?

And is it possible to find the number of macrostates by equating (Boltzmann's constant) ln W = Q/T? Is this a very large number?
Use the thermodynamic definition of entropy and forget about the concept of disorder: $$dS = \frac{dQ}{T}$$

AM

You'll need to use the definition of temperature $$\frac{1}{k_{B}T} = \frac{\partial\ln\Omega}{\partial U}$$ as well.

...though you might want to show where that definition comes from; otherwise the derivation is almost too easy...
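For the record, here is a sketch of how the two definitions connect (for a process at fixed volume, so that no work is done and $dU = dQ$):

$$dS = k_B\, d(\ln\Omega) = k_B \frac{\partial \ln\Omega}{\partial U}\, dU = \frac{dU}{T} = \frac{dQ}{T}$$

So the statistical definition $S = k_B \ln\Omega$, combined with the definition of temperature above, reproduces the thermodynamic relation $dS = dQ/T$.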

## 1. What is entropy in thermodynamics?

Entropy is a measure of the disorder or randomness in a system. In thermodynamics, it is a measure of the unavailable energy in a closed system.

## 2. How can I visualize entropy?

Entropy can be visualized as the level of disorder or randomness in a system. For example, a neat and organized room has low entropy, while a messy and cluttered room has high entropy.
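A toy model makes this concrete. The sketch below (an illustration, not a physical system) treats N coins as a two-state system: an "ordered" macrostate like all-heads has exactly one microstate, while a "disordered" half-and-half macrostate has enormously many, and S = k ln W tracks exactly that:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(N, n_heads):
    """S = k_B ln W for N coins with n_heads heads (toy two-state system)."""
    W = math.comb(N, n_heads)   # microstates compatible with this macrostate
    return k_B * math.log(W)

N = 100
# Perfectly ordered macrostate (all heads): only one microstate, so S = 0.
print(entropy(N, 0))            # 0.0
# Maximally mixed macrostate: the most microstates, hence the highest entropy.
print(entropy(N, N // 2))
```

The "messy room" intuition is just this counting: there are vastly more ways to be disordered than to be ordered.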

## 3. How does entropy change in a system?

Entropy tends to increase in an isolated system over time, as energy disperses and the system becomes more disordered. The entropy of a non-isolated system can decrease, but only at the expense of a larger entropy increase in its surroundings, so the total entropy still grows.
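A classic example of entropy increase without any heat flow is the free expansion of an ideal gas into a vacuum. A minimal sketch (using the standard result ΔS = nR ln(V₂/V₁); the helper name is ours):

```python
import math

R = 8.314  # ideal gas constant, J/(mol K)

def delta_S_free_expansion(n_mol, V_ratio):
    """Entropy change for an ideal gas expanding freely into vacuum.
    No heat flows and no work is done, yet dS = n R ln(V2/V1) > 0."""
    return n_mol * R * math.log(V_ratio)

# One mole doubling its volume: dS = R ln 2, about +5.76 J/K.
print(delta_S_free_expansion(1.0, 2.0))
```

This is why dS = dQ/T must be evaluated along a *reversible* path: here Q = 0, yet the entropy still rises.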

## 4. What is the relationship between entropy and temperature?

For a given system, there is a direct relationship between entropy and temperature: as heat is added and the temperature rises, the entropy rises too (dS = dQ/T). This is because at higher temperatures, molecules have more energy and can move more freely, increasing the system's disorder.
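This relationship can be made quantitative by integrating dS = dQ/T. A short sketch for heating a substance of constant specific heat (numbers are illustrative; c ≈ 4186 J/(kg·K) is the familiar value for liquid water):

```python
import math

def delta_S_heating(m_kg, c, T1, T2):
    """Integrate dS = dQ/T with dQ = m c dT (constant specific heat c):
    dS = m c ln(T2/T1). Temperatures must be in kelvin."""
    return m_kg * c * math.log(T2 / T1)

# Heating 1 kg of water from 300 K to 350 K:
print(delta_S_heating(1.0, 4186.0, 300.0, 350.0))  # roughly 645 J/K
```

Note that the logarithm appears because T sits in the denominator: the same dQ adds less entropy at higher temperature.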

## 5. How is entropy used in thermodynamics?

Entropy is an important concept in thermodynamics as it helps us understand how energy flows and changes in a system. It is used in calculations to determine the efficiency of processes and to predict the direction of spontaneous changes in a system.
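As an example of such an efficiency calculation, the Carnot limit follows directly from requiring that total entropy not decrease (a textbook result; the function name is ours):

```python
def carnot_efficiency(T_hot, T_cold):
    """Maximum efficiency of a heat engine between two reservoirs (kelvin).
    Follows from requiring total entropy change >= 0: Q_c/T_c >= Q_h/T_h,
    which gives eta_max = 1 - T_cold/T_hot."""
    return 1.0 - T_cold / T_hot

print(carnot_efficiency(600.0, 300.0))  # 0.5
```

No real engine operating between these temperatures can beat this bound; that is the entropy-based prediction of the direction of spontaneous change in action.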
