Monsterboy said: I have often heard people say "entropy depends on the observer."
Lord Jestocost said: Why should entropy depend on the observer?
"The entropy of a substance, its entropy change from 0 K to any T, is a measure of the energy that can be dispersed within the substance at T: integration from 0 K to T of ∫C_{p}/T dT (+ q/T for any phase change)." (Frank L. Lambert, "Entropy Is Simple, Qualitatively", J. Chem. Educ., 2002, 79 (10), p 1241)
Lord Jestocost said: Entropy is linked to energy through its original definition by Clausius, dS = δQ/T, where "δ" connotes a very small change.
Lord Jestocost said: The question is: Does entropy depend on the observer?
When transferring a system from state 0 to state 1 (both characterized by a set of selected macroscopic observables), you can in principle use any reversible process connecting the two states to define the entropy in state 1:
S_{1} = S_{0} + ∫δQ_{rev}/T (integration from 0 to 1)
The "subjective" part is merely the definition of the macroscopic observables you want to keep track of for the given system (temperature, pressure, volume, number of particles etc.).
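As an illustration of the Clausius definition above (a sketch, not from the thread): for reversible heating at constant pressure, δQ_rev = C_p dT, so assuming a temperature-independent C_p the integral reduces to ΔS = C_p ln(T₁/T₀). The function name and the example values for liquid water are chosen just for this demonstration.

```python
import math

def delta_S_heating(Cp, T0, T1):
    """Entropy change for reversible heating at constant pressure.

    Evaluates S1 - S0 = ∫ δQ_rev/T = ∫ Cp dT/T = Cp ln(T1/T0),
    assuming Cp is constant over [T0, T1].
    """
    return Cp * math.log(T1 / T0)

# Heating 1 mol of liquid water (Cp ≈ 75.3 J/(mol·K)) from 273.15 K to 373.15 K:
dS = delta_S_heating(75.3, 273.15, 373.15)
print(round(dS, 1))  # ≈ 23.5 J/(mol·K)
```

Note that the answer depends only on the end states, not on the particular reversible path — which is what makes the definition of S₁ consistent.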
Entropy is a measure of disorder or randomness in a system. It is commonly used in physics, information theory, and statistics to describe the uncertainty or lack of information about a system.
Entropy can be seen as a measure of ignorance because it quantifies how much we do not know about the system's exact state, given the macroscopic variables we track. High entropy means many microstates are compatible with what we know (great uncertainty); low entropy means only a few are (near-complete knowledge).
The formula for calculating entropy depends on the context. In information theory, the Shannon entropy of a probability distribution is H = −Σ p_i log p_i, the negative sum of the probabilities of all possible outcomes multiplied by their logarithms. In statistical thermodynamics, it is given by Boltzmann's formula S = k_B ln W, where W is the number of microstates consistent with the macrostate.
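A minimal sketch of the Shannon formula (the function name is my own): a fair coin is maximally uncertain at 1 bit, a biased coin carries less uncertainty, and a certain outcome carries none.

```python
import math

def shannon_entropy(probs):
    """H = -Σ p_i log2(p_i), in bits; outcomes with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit (maximum ignorance)
print(shannon_entropy([0.9, 0.1]))  # biased coin: ≈ 0.469 bits
print(shannon_entropy([1.0]))       # certain outcome: 0.0 bits (no ignorance)
```

This makes the "measure of ignorance" reading concrete: entropy is largest exactly when every outcome is equally plausible to the observer.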
In certain contexts, the entropy relevant to a task can be reduced by introducing additional information or constraints. In information theory, for example, source coding does not reduce the entropy of the source itself; rather, it removes redundancy so that the average code length approaches the entropy, which is the fundamental limit on lossless compression. In thermodynamics, by contrast, the second law states that the entropy of an isolated system never decreases.
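To make the coding point concrete, here is a sketch (the `huffman_lengths` helper is hypothetical, written just for this example): a Huffman code gives shorter codewords to more probable symbols, and its average codeword length is bounded below by the Shannon entropy. For dyadic probabilities (powers of 1/2) the two coincide exactly.

```python
import heapq
import math

def huffman_lengths(freqs):
    """Return {symbol: codeword length} for a Huffman code over freqs."""
    # Heap entries: (frequency, unique tiebreaker, {symbol: depth so far}).
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)
        f2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees pushes every symbol in them one level deeper.
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

freqs = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}
lengths = huffman_lengths(freqs)
avg_len = sum(freqs[s] * lengths[s] for s in freqs)
H = -sum(p * math.log2(p) for p in freqs.values())
print(avg_len, H)  # 1.75 1.75 — the code meets the entropy bound exactly
```

The coding shrank the message from 2 bits/symbol (fixed-length) to 1.75 bits/symbol, but the source's entropy was 1.75 bits all along; no code can do better.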
Entropy has various applications in different fields such as physics, chemistry, biology, information theory, and statistics. In physics, it is used to describe the randomness and disorder in a system. In information theory, it is used to measure the uncertainty of data. In biology, it is used to describe the complexity and diversity of ecosystems. In statistics, it is used to measure the amount of variability in a dataset.