Entropy is a scientific concept, as well as a measurable physical property, most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 under the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862 Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of an isolated system left to spontaneous evolution cannot decrease with time, as such a system always arrives at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of the individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, which has become one of the defining universal constants for the modern International System of Units (SI).
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts, measuring microscopic uncertainty and multiplicity, to the problem of random losses of information in telecommunication signals. At John von Neumann's suggestion, Shannon named this quantity of missing information entropy, in a manner analogous to its use in statistical mechanics, and thereby gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
For a heat engine we define a heat source from which heat is transferred to the system, and we say that heat source has a temperature ##T_h##. When we define a Carnot heat engine, the first process is an isothermal expansion, and we say heat has to enter the system during this process, and here...
According to the Everett, or many-worlds, interpretation of quantum mechanics, with each decision an observer makes the world splits into two parallel universes. Say an observer at some point in spacetime performs the Schrödinger's cat experiment: in one branch of the universe the cat...
For a): ##\Delta S=\mp \int_{T_i}^{T_0}\frac{C(T)}{T}dT## and ##\Delta S_{th}=\int_{T_i}^{T_0}\frac{dQ}{T_0}##, so ##\Delta S_{univ}=\Delta S_{th}+\Delta S##.
What is ##dQ## equal to? I don't know how to answer question b).
Thank you for your help.
If you take a system with fixed entropy S0 and let it evolve, it reaches equilibrium. Let Ueq be the energy of the system at equilibrium.
Now take the same system with fixed energy U=Ueq (S is no longer fixed): how do you know that the equilibrium reached is the same as before, that is, with...
Hi.
Processes involving a friction force whose direction somehow depends on the direction of the velocity, such as ##\vec{F}=-\mu\cdot\vec{v}##, aren't symmetric with respect to time reversal. If you play the process backwards, this force would accelerate the object instead.
On the other hand, friction dissipates...
Problem Statement: 1 kg of water at 273 K is brought into contact with a heat reservoir at 373 K. When the water has reached 373 K, what is the entropy change of the water, of the heat reservoir, and of the universe?
Relevant Equations: ##dS = nC_p\frac{dT}{T} - nR\frac{dP}{P}##
##dS = nC_v\frac{dT}{T} + nR\frac{dV}{V}##
I am...
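A quick Python sketch of the standard calculation for this setup, assuming a constant specific heat of water c ≈ 4186 J/(kg·K) (the value is not given in the excerpt):

```python
import math

# Entropy changes for 1 kg of water heated from 273 K to 373 K by a 373 K reservoir.
# Assumption: constant specific heat c = 4186 J/(kg·K), not stated in the problem.
m = 1.0        # kg
c = 4186.0     # J/(kg·K), assumed
T_i, T_f = 273.0, 373.0

# Water: integrate dS = m*c*dT/T along a reversible heating path.
dS_water = m * c * math.log(T_f / T_i)

# Reservoir: loses Q = m*c*(T_f - T_i) at the fixed temperature 373 K.
dS_reservoir = -m * c * (T_f - T_i) / T_f

dS_universe = dS_water + dS_reservoir
print(dS_water, dS_reservoir, dS_universe)  # ≈ +1306.5, -1122.3, +184.3 J/K
```

The universe's entropy change comes out positive, as it must for this irreversible heat transfer across a finite temperature difference.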
In a recent study (https://phys.org/news/2018-08-flaw-emergent-gravity.html) an important flaw in Emergent/Entropic Gravity has been discovered: holographic screens cannot behave according to thermodynamics...
But then, doesn't this also invalidate...
The multiplicity of states for a particle in a box is proportional to the product of the volume of the box and the surface area of momentum space.
$$ \Omega = V_{volume}V_{momentum}$$
The surface area in momentum space is given by the equation:
$$p^{2}_{x}+ {p}^{2}_{y}+{p}^{2}_{z} =...
Although I've read many papers that propose a relation between action and entropy, I've been told that there is no generally accepted relation in physics.
But how/why are these concepts unrelated?
What about Nobel laureate Frank Wilczek? He proposes that entropy and action are closely related...
Hi,
consider an adiabatic irreversible process carrying a thermodynamic system from initial state A to final state B: this process is accompanied by a positive change in system entropy (call it ##S_g##). Then consider a reversible process between the same initial and final system state. Such...
I tried following:
$$ dS_{\text{total}} = \left|\frac{dQ}{T_c}\right| - \left|\frac{dQ}{T_h}\right| $$
where ## T_h ## is the temperature of the hot water and ## T_c ## is the temperature of the cold water. The specific heat capacity of water wasn't provided in the assignment, so I used the value c = 4190 J/(kg·K).
$$ dS_{\text{total}} =...
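A minimal sketch of this kind of calculation in Python; since the original problem statement is truncated, the masses and temperatures below are hypothetical (1 kg each at 90 °C and 10 °C):

```python
import math

# Total entropy change when mixing equal masses of hot and cold water.
# Masses and temperatures are assumed; only c = 4190 J/(kg·K) comes from the post.
m = 1.0                      # kg of each portion, assumed
c = 4190.0                   # J/(kg·K)
T_h, T_c = 363.15, 283.15    # 90 °C and 10 °C, assumed
T_f = (T_h + T_c) / 2        # equal masses -> arithmetic-mean final temperature

# Each portion is taken along a reversible heating/cooling path: dS = m*c*ln(T_f/T_i).
dS_hot  = m * c * math.log(T_f / T_h)   # negative: hot water loses entropy
dS_cold = m * c * math.log(T_f / T_c)   # positive: cold water gains more
dS_total = dS_hot + dS_cold
print(dS_total)  # ≈ +64.7 J/K, positive as the second law requires
```

The key point is the minus structure: the hot water's entropy decrease is smaller in magnitude than the cold water's increase, so the total is positive.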
Help!
Hi, I need help with this:
In the second law of thermodynamics, we have the entropy "S".
We have dS ∝ dQ,
so we can write dS = λ·dQ,
where λ = λ(T, ...).
I have to demonstrate that
λ = 1/T, where T = temperature.
Thanks for the advice and help!
Hey guys, so I am reading this book and on pages 89-90, the author says:
"Increasing temperature corresponds to a decreasing slope on the entropy vs. energy graph"; then a sample graph is provided, and both in that graph and in the numerical analysis given on page 87 the slope is observed to be an...
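The book's claim can be checked numerically with a toy model. Here is a minimal Einstein-solid sketch (my own illustration, not from the book): since 1/T = dS/dU and S(U) is concave, the slope of S vs. U falls as energy, and hence temperature, grows.

```python
from math import comb, log

# Einstein solid: N oscillators sharing q energy quanta; multiplicity = C(q+N-1, q).
# Entropy in units of k_B, energy in units of hbar*omega (toy model choices).
N = 50
S = [log(comb(q + N - 1, q)) for q in range(10, 200)]

# Discrete slope dS/dU; it decreases monotonically, i.e. 1/T shrinks
# as U grows, which is exactly "higher temperature <-> smaller slope".
slopes = [S[i + 1] - S[i] for i in range(len(S) - 1)]
print(all(slopes[i] > slopes[i + 1] for i in range(len(slopes) - 1)))  # True: S(U) is concave
```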
I don't really understand why the entropy of the universe must always increase. I know that only reversible processes keep entropy constant, but why do real processes always increase the entropy of the universe?
Sorry for my bad English, I am not from the USA or UK.
Homework Statement
A gas sample containing 3.00 moles of helium gas undergoes a state change from 30 degrees Celsius and 25.0 L to 45 degrees Celsius and 15.0 L. What is the entropy change for the gas (ideal gas)? For He, Cp = 20.8 J/(K·mol).
Homework Equations
ΔS = nCv*ln(Tf/Ti) + nR*ln(Vf/Vi) =...
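Plugging the given numbers into that relation, with Cv = Cp − R for an ideal gas, gives a quick numerical check:

```python
import math

# ΔS for 3.00 mol of He: 30 °C, 25.0 L  ->  45 °C, 15.0 L (ideal gas).
n = 3.00
R = 8.314          # J/(K·mol)
Cp = 20.8          # J/(K·mol), given
Cv = Cp - R        # ideal-gas relation Cv = Cp - R
T_i, T_f = 303.15, 318.15   # K
V_i, V_f = 25.0, 15.0       # L (only the ratio matters)

dS = n * Cv * math.log(T_f / T_i) + n * R * math.log(V_f / V_i)
print(dS)  # ≈ -10.9 J/K: the compression term outweighs the heating term
```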
Hi everyone, I have a few questions I'd like to ask regarding what I have read/heard about these two definitions of entropy. I also believe that I have some misconceptions about entropy and as such I'll write out what I know while asking the questions in the hope someone can correct me. Thanks...
There are two aspects of uncertainty
(a) how far different from the situation where all possibilities are of equal probability
(b) how spread out the values are.
In discussions about (Shannon) entropy and information, the first aspect is emphasized, whereas in discussions about the standard...
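The distinction between the two aspects can be made concrete with a small sketch (my own example): Shannon entropy depends only on the probabilities, not on how far apart the values lie, while the standard deviation does the opposite.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits; uses only the probabilities, ignores the values."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def std_dev(values, probs):
    """Standard deviation; uses the values, so it measures spread."""
    mean = sum(v * p for v, p in zip(values, probs))
    return math.sqrt(sum(p * (v - mean) ** 2 for v, p in zip(values, probs)))

# Two equally likely outcomes, far apart: low entropy (1 bit), large spread.
h_two = shannon_entropy([0.5, 0.5])           # 1.0 bit
s_two = std_dev([0, 100], [0.5, 0.5])         # 50.0

# Four equally likely outcomes, close together: higher entropy, small spread.
h_four = shannon_entropy([0.25] * 4)          # 2.0 bits
s_four = std_dev([0, 1, 2, 3], [0.25] * 4)    # ≈ 1.12
print(h_two, s_two, h_four, s_four)
```

So the first pair of measures tracks aspect (a), equiprobability, while the second tracks aspect (b), the spread of the values.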
1. Homework Statement
If a rigid adiabatic container has a fan inside that provides 15000 J of work to an ideal gas inside the container,
would the change in entropy be the same as if 15000 J of heat were provided to the same rigid container (removing the insulation)?
2. Relevant...
My background is that I'm an applied mathematician and engineer, self-taught in GR and QFT. It's an old idea, in some dozen or so SciFi books. But I'm looking for a mathematical framework for handling it. The second law of thermodynamics, that entropy always increases in a closed system, can be...
Imagine there is an radiation concentrator (winston cone) surrounded with extremely many layers of foil for radiation insulation, except at the smaller opening. Every part of the setup is initially in thermal equilibrium with the surroundings. The amount of thermal radiation flowing through the...
From a heuristic standpoint it makes sense that when a system goes from being periodic to chaotic, the occupied volume of the phase space increases (while not violating liouville theorem). Since the volume of phase space is proportional if not equal to the entropy, shouldn’t entropy always...
When selecting rare entropy sources for a TRNG, and one can see similarities through an applied hidden Markov model, will it still be good entropy?
(The structure is the same, even though the type of source input is different.)
If we have two sequences s1 and s2, both of N coin tosses, is the entropy of getting two sequences that are exactly the same lower than that of sequences that differ by x individual tosses? Is the entropy of getting sequences s1 and s2 that differ by N/2 tosses the highest...
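A counting sketch of the intuition behind the question: for a fixed s1, the number of sequences s2 differing from it in exactly k positions is C(N, k), which peaks at k = N/2, so "differing in about half the tosses" is the highest-multiplicity case.

```python
from math import comb

# For a fixed sequence s1 of N coin tosses, the number of sequences s2
# that differ from s1 in exactly k positions is the binomial C(N, k).
N = 20
counts = [comb(N, k) for k in range(N + 1)]

print(counts[0])                                    # 1: only s2 = s1 matches exactly
print(max(range(N + 1), key=lambda k: counts[k]))   # 10 = N/2 maximizes the count
print(sum(counts) == 2 ** N)                        # True: all 2^N sequences covered
```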
Hello;
If a system receives a thermal energy Q, can it keep its entropy constant (that is, with equal value before it receives the energy) without wasting the energy received?
I just read a book by nuclear physicist Carlo Rovelli on the subject of "Time" and he says that 'entropy' is the only non-reversible process in the basic equations of physics, and he believes time and entropy are related (if I understand him correctly). So this started me thinking on entropy...
Homework Statement
Derive an expression for the change of temperature of a solid material that is compressed adiabatically and reversibly, in terms of physical quantities.
(The second part of this problem is: The pressure on a block of iron is increased by 1000 atm adiabatically and...
Homework Statement
In a monatomic crystalline solid each atom can occupy either a regular lattice site or an interstitial site. The energy of an atom at an interstitial site exceeds the energy of an atom at a lattice site by an amount ε. Assume that the number of interstitial sites equals the...
In a free expansion, I know that we cannot use the equation dS=dQ/T...(1). Instead we use dS>dQ/T...(2).
The question is: why can we use ΔS=ncᵥln(T_f/T_i)+nRln(V_f/V_i), which is derived from equation (1), to calculate the entropy change? Shouldn't it be an inequality too?
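The resolution is that entropy is a state function: the formula is evaluated along a reversible path between the same two endpoints, not along the irreversible free expansion itself. A minimal sketch for an isothermal free expansion that doubles the volume:

```python
import math

# Free expansion of n mol of ideal gas: Q = 0 and T is unchanged, yet ΔS > 0.
# ΔS is computed along a *reversible* isothermal path connecting the same
# initial and final states, because entropy is a state function.
n = 1.0
R = 8.314                # J/(K·mol)
V_i, V_f = 1.0, 2.0      # volume doubles (only the ratio matters)

dS_gas = n * R * math.log(V_f / V_i)   # ≈ +5.76 J/K
dQ_over_T = 0.0                        # the actual process exchanges no heat

print(dS_gas)               # ≈ 5.76
print(dS_gas > dQ_over_T)   # True: the Clausius inequality dS > dQ/T holds
```

So the inequality (2) applies to the actual irreversible path, while the equality (1) is used on a replacement reversible path that shares the endpoints.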
Hello,
I am trying to figure out where my reasoning falls apart in this thought experiment:
To determine if a process "A" is reversible (or at the very least internally reversible), I try to picture a reversible process "B" that involves only heat transfer and links the same two endpoints that...
In Brazil Nut effect /Granular convection the large grains move upward and the smaller ones go downward. This sorting is supposed to reduce the multiplicity of this system. But according to the second law of thermodynamics, entropy and multiplicity of the system should increase.
I am looking...
It is sometimes said that a system is "unlikely" to return to the pattern it came from. For instance, if we have a vat with blue gas molecules and white gas molecules separated by a slit, and we remove the slit, the blue and white molecules will mingle, unlikely to return to their separated...
Jacob Bekenstein asserts that the entropy of a black hole is proportional to its area rather than its volume. Wow.
After watching Leonard Susskind's video 'The World as a Hologram', it seems to me that he's implying that we are all black hole stuff. Perhaps we (our galaxies and their black...
The resolution for Maxwell's demon paradox is that the demon has limited memory and the demon will eventually run out of information storage space and must begin to erase the information it has previously gathered. Erasing information is a thermodynamically irreversible process that increases...
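The cost of that erasure is quantified by the Landauer bound; a quick sketch of the numbers at room temperature:

```python
import math

# Landauer limit: minimum heat dissipated to erase one bit at temperature T.
k_B = 1.380649e-23    # J/K (exact SI value of the Boltzmann constant)
T = 300.0             # K, room temperature

E_bit = k_B * T * math.log(2)
print(E_bit)  # ≈ 2.87e-21 J per erased bit

# Matching minimum entropy increase of the environment: k_B * ln 2 per bit.
dS_bit = k_B * math.log(2)
print(dS_bit)  # ≈ 9.57e-24 J/K
```

This is why the demon's bookkeeping cannot beat the second law: each erased bit dumps at least k_B·T·ln 2 of heat into the surroundings.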
Homework Statement
I'm attempting to calculate the translational entropy for N2 and I get a value of 207.8 J/(K·mol). The tabulated value is given as 150.4 and I am stumped as to the discrepancy.
T = 298.15 K and P = 0.99 atm and V = 24.8 L
R = 8.314 J/(K·mol)
Homework Equations
Strans =...
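For comparison, a sketch of the standard Sackur-Tetrode calculation under the stated conditions reproduces the tabulated value, which suggests the 207.8 figure comes from a units slip somewhere (e.g. molar mass in grams rather than kilograms):

```python
import math

# Sackur-Tetrode translational entropy of N2 at T = 298.15 K, P = 0.99 atm.
k = 1.380649e-23        # J/K
h = 6.62607015e-34      # J·s
N_A = 6.02214076e23     # 1/mol
R = k * N_A             # ≈ 8.314 J/(K·mol)

m = 28.0134e-3 / N_A    # mass of one N2 molecule in kg (molar mass in kg/mol)
T = 298.15
P = 0.99 * 101325.0     # Pa

# S_trans/R = ln[(2*pi*m*k*T/h^2)^(3/2) * (k*T/P)] + 5/2
q_over_N = (2 * math.pi * m * k * T / h ** 2) ** 1.5 * (k * T / P)
S_trans = R * (math.log(q_over_N) + 2.5)
print(S_trans)  # ≈ 150.4 J/(K·mol), matching the tabulated value
```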
Making use of the partition function, it is straight forward to show that the entropy of a single quantum harmonic oscillator is:
$$\sigma_{1} = \frac{\hbar\omega/\tau}{\exp(\hbar\omega/\tau) - 1} - \log[1 - \exp(-\hbar\omega/\tau)]$$
However, if we look at the partition function for a single...
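As a sanity check on the closed form, it can be compared numerically against S = −∂F/∂τ with F = −τ ln Z, in units where ħω = 1 and k_B = 1 (my own choice of units for the sketch):

```python
import math

# Single quantum harmonic oscillator, units with hbar*omega = 1 and k_B = 1,
# so tau is the temperature. Z = 1/(1 - exp(-1/tau)), F = -tau * ln Z.
def entropy_closed_form(tau):
    x = 1.0 / tau
    return x / (math.exp(x) - 1.0) - math.log(1.0 - math.exp(-x))

def free_energy(tau):
    # F = -tau * ln Z = tau * ln(1 - exp(-1/tau))
    return tau * math.log(1.0 - math.exp(-1.0 / tau))

def entropy_numeric(tau, h=1e-6):
    # S = -dF/dtau, central finite difference
    return -(free_energy(tau + h) - free_energy(tau - h)) / (2 * h)

tau = 0.7
print(entropy_closed_form(tau), entropy_numeric(tau))  # the two agree
```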
Homework Statement
I'm asked to compute the molar entropy of oxygen gas @ 298.15 K & 1 bar given:
molecular mass of 5.312×10−26 kg, Θvib = 2256 K, Θrot = 2.07 K, σ = 2, and ge1 = 3. I'm currently stuck on the vibrational entropy calculation.
Homework Equations
S = NkT ∂/∂T {ln q} + Nk...
I have my first question. It's about entropy in the Carnot cycle and I'll try to be direct.
The equal sign in the Carnot cycle efficiency equation is related to the fact that the total entropy doesn't change over the whole cycle (which in turn is related to the fact that the heat exchanges occur...
Homework Statement
We put 1 kg of iron at a temperature of 100 Celsius into a container with 1 kg of ice at 0 Celsius. What is the state of the system after reaching equilibrium? Calculate the change of entropy.
The latent heat of fusion of ice (c_L) is 330 kJ/kg, the specific heat capacity of iron (c_I)...
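A sketch of the calculation in Python; the iron's specific heat is truncated in the excerpt, so a typical value c_iron ≈ 450 J/(kg·K) is assumed here (substitute the value from the actual problem):

```python
import math

# 1 kg iron at 100 °C dropped into a container with 1 kg of ice at 0 °C.
m_iron, m_ice = 1.0, 1.0
L = 330e3            # J/kg, latent heat of fusion of ice (given)
c_iron = 450.0       # J/(kg·K), ASSUMED (value truncated in the problem excerpt)
T_hot, T_melt = 373.15, 273.15

# Heat released by the iron cooling all the way down to 0 °C:
Q_iron = m_iron * c_iron * (T_hot - T_melt)   # 45 kJ
melted = Q_iron / L                           # ≈ 0.136 kg of ice melts
assert melted < m_ice                         # not all ice melts -> equilibrium at 0 °C

# Entropy changes:
dS_iron = m_iron * c_iron * math.log(T_melt / T_hot)   # ≈ -140.4 J/K
dS_ice = Q_iron / T_melt                               # ≈ +164.7 J/K (melting at 273.15 K)
print(dS_iron + dS_ice)  # ≈ +24.3 J/K > 0
```

With these numbers the iron cannot melt all the ice, so the final state is an ice-water mixture at 0 °C, and the total entropy change is positive.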
Homework Statement
Calculate the change in molar entropy of steam heated from 100 to 120 °C at constant volume in units J/K/mol (assume ideal gas behaviour).
Homework Equations
ΔS = n Cv ln(T_f/T_i)
T: absolute temperature
The Attempt at a Solution
100 C = 373.15 K
120 C = 393.15 K
dS = nCvln...
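A sketch of the final step; since Cv for steam is not given in the excerpt, the ideal-gas estimate Cv ≈ 3R (nonlinear triatomic molecule, vibrations frozen) is assumed here (use the course's value if one is supplied):

```python
import math

# Molar entropy change of steam heated 373.15 K -> 393.15 K at constant volume,
# assuming ideal-gas behaviour. Cv is ASSUMED as 3R (translation + rotation of a
# nonlinear molecule, vibrational modes frozen out).
R = 8.314
Cv = 3 * R                      # ≈ 24.9 J/(K·mol), assumed
T_i, T_f = 373.15, 393.15

dS = Cv * math.log(T_f / T_i)   # per mole, constant volume: dS = Cv dT/T
print(dS)  # ≈ 1.3 J/(K·mol)
```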
In deriving the Carnot Efficiency, the assumption is made that theoretically most efficient engine will generate no net entropy, meaning that the entropy that enters the system during heat absorption must equal the entropy that leaves the engine during heat rejection. Why is the case? Why would...
Suppose you have an experiment that measures the property of an atom as a whole, maybe you can put it through a double-slit or measure its spin, whatever. Presumably that will collapse the wavefunction that you used to describe the atom in that experiment. Would this entail that in the process...
Homework Statement
The change in entropy is zero for:
A. reversible adiabatic processes
B. reversible isothermal processes
C. reversible processes during which no work is done
D. reversible isobaric processes
E. all adiabatic processes
Homework Equations
## dS = \frac{dQ}{T} ##
The Attempt...
I'm looking for a book to help me understand a project I'm working on measuring the magnetocaloric effect. I'd like to understand a bit more about the link between magnetism and entropy. I'm a third year bachelor student so I've studied no quantum mechanics (yet), but I'm not against doing so if...
If the universe keeps expanding and eventually ends in a "big freeze" or heat death, does this contradict the third law of thermodynamics?
The third law of thermodynamics states that a crystal at absolute zero has zero entropy. Since the entropy of the universe can never decrease, as the age...
Why can entropy sometimes remain constant as temperature increases, and vice versa? Entropy implies transfer of heat, and heat must increase with temperature. I am unable to understand this intuitively.
Why is the entropy lost by the hot water less than the entropy gained by the cold water? From the perspective of energy, why is it better to take water and heat it to a given temperature than to mix hot water and cold water to reach that same temperature?
My question is regarding a few descriptions of Entropy. I'm actually unsure if my understanding of each version of entropy is correct, so I'm looking for a two birds in one stone answer of fixing my misunderstanding of each and then hopefully linking them together.
1) A measure of the tendency...
Every year since the 90's I come back to some of my pet topics in physics, like statistical physics.
This time it was the reading of a Wikipedia article on entropy that surprised me.
The derivation of the second law from the Gibbs entropy was unknown to me.
I didn't know how heat, how change of...
Hi,
I am quite confused about the following question:
I think scientists believe the last scattering surface was a dense plasma at a temperature of 3000 K. If today's universe is much cooler and less dense than the last scattering surface, how can anyone say entropy has increased with time? Isn't the universe...