Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of an isolated system left to spontaneous evolution cannot decrease with time, as it always arrives at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of the individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concepts of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
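As a quick illustration of Boltzmann's logarithmic law (my own sketch, not from the text above, using a toy system of independent two-state particles):

```python
import math

# Boltzmann constant (SI, exact since the 2019 redefinition of the units)
K_B = 1.380649e-23  # J/K

def boltzmann_entropy(W: float) -> float:
    """Boltzmann's formula S = k_B * ln(W) for W microstates."""
    return K_B * math.log(W)

# Toy system: N independent two-state particles has W = 2**N microstates,
# so the entropy grows linearly with N: S = k_B * N * ln(2).
N = 100
S = boltzmann_entropy(2**N)
print(S)
```

The point of the sketch is only the scaling: multiplying the number of microstates multiplies nothing, it *adds* to the entropy, which is what makes entropy extensive.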
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. At John von Neumann's suggestion, Shannon named this quantity of missing information entropy, in analogy with its use in statistical mechanics, and thereby gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
Hi, as in a previous thread I would like to better understand Feynman's analysis of the Brownian ratchet as described here:
https://www.feynmanlectures.caltech.edu/I_46.html
https://en.wikipedia.org/wiki/Brownian_ratchet
Consider the case in which the two boxes (i.e. heat baths) are at the same...
Hi, suppose we have a resistor at a given temperature T connected through a diode to a battery cell.
The voltage across the resistor due to thermal noise should charge the cell, converting thermal energy into chemical energy without limit.
Does the above process violate the second law of...
I'd like to check if my reasoning is right here and that the numerical factors in the final result are correct. The disks occupy an effective area ##A = (A_{\mathrm{box}}-2r)^2##, excluding the region of width ##r## at the boundary. The area available to the ##n##th disk is then ##A_n = A - 4\pi...
So I had to find change in entropy of system in reversible isothermal process.
$$T\Delta S_{sys}=Q\implies \Delta S_{sys}=nR\ln\left(\frac{V_2}{V_1}\right)$$
This was good because for isothermal process ##\Delta U=0\implies Q=W##
Then I read this
Throughout an entire reversible process, the...
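As a quick numerical check of the formula in that thread (my own sketch, not part of the original post):

```python
import math

R = 8.314462618  # molar gas constant, J/(mol K)

def delta_S_isothermal(n: float, V1: float, V2: float) -> float:
    """Entropy change of n moles of ideal gas in a reversible isothermal
    expansion from V1 to V2. Since Delta U = 0, Q = W = nRT ln(V2/V1),
    and Delta S = Q/T = nR ln(V2/V1)."""
    return n * R * math.log(V2 / V1)

# Doubling the volume of 1 mol gives about +5.76 J/K:
print(delta_S_isothermal(1.0, 1.0, 2.0))
```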
We know that there is no law of conservation for the entropy. It is quite the contrary: If we have a closed system without exchange of heat the entropy cannot get less. It will reach the max. If we have not a closed system but a stream of entropy only into a system, the entropy will increase...
I've never had any physics class before, so please bear with me on my lack of understanding.
I've been thinking about gravity and its relation to entropy lately and was wondering if my thinking is correct.
Entropy seems to be an opposing force to gravity, where gravity is creating gradients...
I'm studying how to compute excess entropy in molecular dynamics (MD). I've found it is needed to compute the two-body correlation function (neglecting high-order terms), the details can be found, for example, in this article.
So the definition of correlation function (CF for short) is
##C(t...
Entropy reduction or quantum phenomena can occur microscopically, but entropy reduction is absolutely impossible by chance, and if a macroscopic object's wave function collapses due to measurement, does that mean that the macroscopic object will never be able to cause quantum phenomena? Even in...
Q: Why is the entropy value of this steady-flow open system not equal to zero?
My idea is as represented by the following equation.
$$
\frac{dS_{sys}}{dt}=0,\,\,\,\,dt\ne 0
$$
$$
\therefore dS_{sys}=0\quad\therefore \Delta S_{sys}=\Delta S_{air}=0
$$
$$
\therefore...
As far as I know, entropy could be reversed by the Poincaré recurrence theorem if it had a finite horizon given by some amount of vacuum energy causing an accelerating expansion.
However, I found this lecture by Leonard Susskind where he tells of a way through which the vacuum could decay into...
In the far future there will be most likely a point where a maximal state of entropy will be reached in the universe and after the last black hole evaporates there could be no more structures and no more work could be done.
According to the Poincaré recurrence theorem for a closed universe...
If we have a kilogram of something that is 100 million degrees Celsius, and can controllably use this heat somehow, we can sustain life, grow crops, drive steam engines, and with these we could build a whole city like New York; we can create a lot of mass with very low entropy, things that are very...
My studies relate to construction engineering and environmental improvements, and I have a passion for combinatorics and the exact sciences. I'm always in touch with the novel things that pop up in science-related media. I don't like it when people make politics out of science findings.
I'm the...
Is entropy real? It seems like it's not real because it depends on how you group microstates together into a macrostate, and the way you group them can be arbitrary. For example (at 13:04 of the video below), there are 91,520 microstates in the macrostate “9 in left; 1 in right” but 627,264...
In a discussion about the (change in the) Helmholtz potential being interpretable as the maximum available amount of work for a system in contact with a thermal reservoir (i.e. the free energy), Callen seems to insist this fact is true only for reversible processes. Why should this be? I...
In his classic textbook, Callen remarks that
I have labelled the claims (1) and (2). I am not sure about either. For the first, I have tried to proceed as follows (all equations are from Callen's second edition and all 0 subscripts are with respect to some reference state of an ideal gas):
I...
In Chapter 5 of his famous textbook on thermodynamics, Callen argues for the "equivalence" of the maximum entropy (Max-Ent) principle and the minimum energy (Min-En) principles. I quote from Callen first:
As far as I know (though Callen never makes this explicit in what, I think, represents...
I am continuing to try to understand maximum work reversible processes (and a subset thereof -- Carnot cycles) better. I am here curious about the following system.
My question is about how I can know/prove that there exists a way to take the gas (the primary subsystem) reversibly with respect...
This question was, effectively, asked here (please refer to that question for additional context); however, I don't think the given answer is correct (or at least complete) despite my having added a bounty and having had a productive discussion with the answerer there. In particular, I don't...
Hello everyone,
I am seeking some clarification regarding a question related to thermodynamics and statistical mechanics. My understanding is that when we combine two identical boxes with the same ideal gas by removing the wall between them, the resulting system's entropy stays the same...
The Bekenstein bound places an upper limit on the amount of entropy that a given volume of space may contain.
This limit was described by Jacob Bekenstein, who tied it quite closely to the black hole event horizon.
Put simply, black holes hold the maximum entropy allowed for their volume. If you...
What does entropy mean in the following sentence? Does it mean the same as the term "information content" before it? Is entropy a more technical term than information content?
He remembered taking a class in information theory as a third-year student in college. The professor had put up two...
The starting point is the identity
$$\left(\frac{\partial u}{\partial T}\right)_n = T\left(\frac{\partial s}{\partial T}\right)_n.$$
I then try to proceed as follows:
Integrating both with respect to ##T## after dividing through by ##T##, we find
$$ \int_0^T \left(\frac{\partial s}{\partial...
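A sketch of where that integration leads (my own illustration, with a hypothetical Debye-like heat capacity, not the thread's actual derivation): dividing by ##T## and integrating gives ##s(T)=\int_0^T \frac{c(T')}{T'}\,dT'##, which only converges at the lower limit if ##c \to 0## as ##T \to 0##.

```python
# Hypothetical low-temperature heat capacity c(T) = a*T**3 (Debye-like),
# for which s(T) = integral_0^T c(T')/T' dT' = a*T**3/3 is finite.
a = 1.0e-3   # arbitrary units
T = 10.0

# Midpoint-rule numerical integration of c(T')/T' = a*T'**2 from 0 to T
n_steps = 100_000
dT = T / n_steps
s_numeric = sum(a * ((i + 0.5) * dT) ** 2 * dT for i in range(n_steps))

s_exact = a * T**3 / 3
print(s_numeric, s_exact)  # the two agree closely
```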
Dear everyone, I wish to discuss in this thread a classic/semi-classic interpretation on the origin of Bekenstein-Hawking entropy and the related resolution to Hawking's information missing puzzle, which were published in Nucl.Phys.B977 (2022) 115722 and Nucl.Phys.B990 (2023) 116171 after...
Hi everyone!
It's about the following task:
Calculate the molar entropy of H2O(g) at 25°C and 1 bar.
θrot = 40.1 K, 20.9 K, 13.4 K
θvib = 5360 K, 5160 K, 2290 K
g0,el = 1
Note for translational part: ln(x!) ≈ x ln x − x (Stirling's approximation)
Can you explain to me how to calculate this problem?
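One way to attack this task (my own sketch, assuming the standard ideal-gas / rigid-rotor / harmonic-oscillator partition functions and a symmetry number σ = 2 for H2O, which the task itself does not state):

```python
import math

# SI constants
k  = 1.380649e-23      # J/K
h  = 6.62607015e-34    # J s
NA = 6.02214076e23     # 1/mol
R  = k * NA            # J/(mol K)

T, p = 298.15, 1.0e5   # 25 C, 1 bar
m = 18.015e-3 / NA     # mass of one H2O molecule, kg

# Translational part: Sackur-Tetrode equation
lam = h / math.sqrt(2 * math.pi * m * k * T)      # thermal de Broglie wavelength
S_trans = R * (math.log(k * T / (p * lam**3)) + 2.5)

# Rotational part: nonlinear rigid rotor with symmetry number sigma = 2
th_rot = (40.1, 20.9, 13.4)  # K
sigma = 2
q_rot = (math.sqrt(math.pi) / sigma) * math.sqrt(T**3 / (th_rot[0] * th_rot[1] * th_rot[2]))
S_rot = R * (math.log(q_rot) + 1.5)

# Vibrational part: three harmonic modes, nearly frozen out at 298 K
S_vib = 0.0
for th in (5360.0, 5160.0, 2290.0):
    x = th / T
    S_vib += R * (x / (math.exp(x) - 1) - math.log(1 - math.exp(-x)))

S_total = S_trans + S_rot + S_vib
print(S_trans, S_rot, S_vib, S_total)  # J/(mol K)
```

The translational term dominates, the rotational term is the next largest, and the vibrational term is tiny because all three θvib are far above 298 K; the total lands close to the tabulated value of about 188.8 J/(mol K).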
I would like to calculate the entropy or enthalpies (steam, specific and internal energy) using the SRK (Soave–Redlich–Kwong) equation, the Wilson approximation, and (if necessary) the Antoine equation and the Clausius–Clapeyron equation for a mixture of 0.199 mol/l nitrogen and 0.811 mol/l carbon...
Hello,
is someone able to explain why these two are wrong. I am not sure how to figure out the enthalpy direction as the reaction is not changing state of matter, nor is it changing temperature.
(Please solve without calculating anything)
Thank you
The FAQ by @bcrowell cites an explanation by physics netizen John Baez as to how entropy rises when a star loses heat and contracts. However, the linked explanation falls short of describing the key role that gravity must be playing. The FAQ by @bcrowell discusses why a low-entropy state of...
For this,
I don't understand how we can apply the change in entropy equation for each solid since the ##\frac{dT}{dt}## for each solid will be non-zero until the solids reach thermal equilibrium. My textbook says that the ##\Delta S## for a system undergoing a reversible process at constant...
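The standard resolution (my own sketch, with hypothetical masses, heat capacities, and temperatures) is that entropy is a state function, so each solid's ##\Delta S## can be computed along an *imagined* reversible path between the same end states, even though the actual heat exchange is irreversible:

```python
import math

# Two solids exchanging heat until thermal equilibrium (hypothetical values).
m1, c1, T1 = 1.0, 900.0, 400.0   # kg, J/(kg K), K
m2, c2, T2 = 2.0, 450.0, 300.0

# Final temperature from energy conservation (no heat lost to surroundings)
Tf = (m1 * c1 * T1 + m2 * c2 * T2) / (m1 * c1 + m2 * c2)

# Entropy change of each solid, from integrating dS = m c dT / T
# along a reversible path between the same initial and final states.
dS1 = m1 * c1 * math.log(Tf / T1)   # hot solid: negative
dS2 = m2 * c2 * math.log(Tf / T2)   # cold solid: positive, and larger
print(Tf, dS1 + dS2)
```

The hot body loses entropy and the cold body gains more than that, so the total is positive, as the second law requires for the irreversible process.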
For this,
Why do they write the change in entropy equation as ##\Delta S = \frac{Q}{T}##? Would it not be better to write it as ##\Delta S = \frac{\Delta Q}{T}##, since it is clear that we are only concerned with the transfer of heat in our system while it remains at constant temperature as all...
In the book "Cycles of Time" by Roger Penrose, there is a part of the explanation of entropy that I don't understand.
There are 10^24 balls, half of which are red and the other half blue.
The model is to arrange the balls in a cube with 10^8 balls on each edge.
It also divides the cube into...
In classical statistical physics, entropy can be defined either as Boltzmann entropy or Gibbs entropy. In quantum statistical physics we have von Neumann entropy, which is a quantum analog of Gibbs entropy. Is there a quantum analog of Boltzmann entropy?
I came across the following statement from the book Physics for Engineering and Science (Schaum's Outline Series).
I cannot seem to find a satisfactory answer to the questions.
Is the statement in the above screenshot talking about entropy change the statement of the Second Law of Thermodynamics, or is...
Entropy question.
Take a finite number of identical atoms in a specific volume of space at a moment of time.
Run two thought experiments on this system (both scenarios time-independent):
1: Expand the volume of space of the system instantaneously by a factor of 10. The fixed number of atoms...
Unfortunately, I have problems with the following task
For task 1, I proceeded as follows. Since the four bases have the same probability, this is ##P=\frac{1}{4}## I then simply used this probability in the formula for the Shannon entropy...
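That approach checks out numerically; here is a small sketch of it (my own code, not from the thread) for four equally likely bases:

```python
import math

def shannon_entropy(probs) -> float:
    """Shannon entropy H = -sum p log2(p), in bits; zero-probability
    outcomes contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equally likely bases A, C, G, T: H = log2(4) = 2 bits per base.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))
```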
Two systems A & B (with orthonormal basis ##\{|a\rangle\}## and ##\{|b\rangle\}##) are uncorrelated, so the combined density operator ##\rho_{AB} = \rho_A \otimes \rho_B##. Assume the combined system is in a pure state ##\rho_{AB} = |\psi \rangle \langle \psi |## where ##|\psi \rangle =...
Hi,
Unfortunately I am not getting anywhere with task three; I don't know exactly what to show.
Shall I now show that from ##S(T,V,N)##, using a Legendre transform, I then get ##S(E,V,N)## and thus obtain the Sackur–Tetrode equation?
Hi,
Unfortunately, I have problems with the task 4
In task 3 I got the following
$$ T_f=T_ie^{\Delta S_i - c_i} $$
Then I proceeded as follows
$$ \Delta S = \Delta S_1 + \Delta S_2 $$
$$ \Delta S =c_1\ln\left(\frac{T_ie^{\Delta S_i - c_i}}{T_1}\right)+c_2\ln\left(\frac{T_f}{T_2}\right)$$
$$ \Delta S...
For now it is only about task 1.
If the task states that:
You can approximate that their dynamics in water resembles that of an ideal gas.
Does it then mean that I can take glucose as the ideal gas and then simply calculate the entropy for the ideal gas?
For a freely expanding ideal gas (an irreversible transformation), the change in entropy is the same as in a reversible transformation with the same initial and final states. I don't quite understand why this is true, since Clausius' theorem only has this corollary when the two transformations are...
Summary: doesn't this decrease entropy?
Cellulose is known for its hydrophilic quality, which can be explained from the polarity of its hydroxyl groups.
We all know water can overcome the force of gravity through a piece of paper placed in the water.
Correct me if I'm wrong but this is a...
If you were to condense all the energy in the universe into a point, wouldn't the temperature be very high, yet the entropy be very low? Also if you were to spread out all of the energy in the universe, wouldn't the temperature be near zero and the entropy be very high? And this makes entropy...
Summary: Trying to understand the relationship between gravity, thermodynamics and entropy, thank you.
Gravity can take a diffuse cloud of gas filling a given volume of space at equilibrium density and temperature, and turn it into a burning star surrounded by empty space. Does this mean that...
Christoph Schiller, "From maximum force to physics in 9 lines -- and implications for quantum gravity" arXiv:2208.01038 (July 31, 2022).
This paper asserts that nine propositions can be used to derive the Standard Model and GR and can point the way to quantum gravity, although he cheats a bit...
Quantum gates must be reversible.
The usual justification for this is that in QM the time evolution of a system is a unitary operator which, by linear algebra, is reversible (invertible).
But I am trying to get a better intuition of this, so I came up with the following explanation:
In order to...
Now, it's been said that the majority of the entropy in the universe resides within the cumulative entropy of black holes inside the universe. How do they know that?
Now, I'm not so interested in how they determine the black hole's entropy, I know there's a relatively simple formula for that...
Boltzmann entropy definition is given by: $$ S = k_B \ln W $$ where ##W## is the weight of the configuration which has the maximum number of microstates.
This equation is used everywhere in statistical thermodynamics and I saw it in the derivation of Gibbs entropy. However, I can't find the...
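One piece of the link the thread is after (my own sketch, working in units of ##k_B##): for a *uniform* distribution over ##W## microstates, the Gibbs form ##-\sum_i p_i \ln p_i## reduces exactly to Boltzmann's ##\ln W##.

```python
import math

def gibbs_entropy(probs) -> float:
    """Gibbs entropy -sum p ln(p), in units of k_B."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Uniform distribution over W equally likely microstates:
W = 1000
uniform = [1.0 / W] * W
print(gibbs_entropy(uniform), math.log(W))  # the two agree
```

For non-uniform distributions the Gibbs entropy is strictly smaller than ##\ln W##, which is one way to see that the uniform (microcanonical) distribution maximizes the entropy.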
There are two common definitions of entropy: one in terms of the degree of randomness, and one in terms of the energy that is not available to do work. What is the relationship between them?
If the Universe could somehow reach a state of infinite entropy (or at least a state of extremely high entropy), would all fundamental symmetries of the physical laws (gauge symmetries, Lorentz symmetry, CPT symmetry, symmetries linked to conservation principles...etc) fail to hold or be...