Entropy is a scientific concept, as well as a measurable physical property that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolph Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, aside from the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of isolated systems left to spontaneous evolution cannot decrease with time, as they always arrive at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. At John von Neumann's suggestion, Shannon named this quantity of missing information entropy, by analogy with its use in statistical mechanics, and thereby gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
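Boltzmann's logarithmic law and Shannon's measure of uncertainty share the same functional form; a minimal sketch of the Shannon entropy (an illustrative example, not taken from the text above):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of uncertainty; a certain outcome carries none.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([1.0]))        # 0.0
```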
Hey guys! This is a problem from Callen's Thermodynamics textbook and I'm stuck on it.
My goal was to get an expression for the entropy ##S## which depends on ##T##, so I can move into the ##T-S## plane to do my calculations:
I started by expressing the fundamental equation as a function of...
In classical statistics, we derived the partition function of an ideal gas. Then using the MB statistics and the definition of the partition function, we wrote:
$$S = k_B \ln Z_N + \beta k_B E$$, where ##Z_N## is the N-particle partition function. Here ##Z_N = Z^N##.
This led to the Gibbs paradox...
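The paradox can be made concrete numerically: with ##Z_N = Z^N## the resulting entropy is not extensive, while dividing by ##N!## (correct Boltzmann counting) repairs it. A toy sketch keeping only the volume dependence of ##Z## (illustrative units with ##k_B = 1##, not the full Sackur-Tetrode expression):

```python
import math

# For one particle Z is proportional to V, so ln Z^N = N ln V
# (all V-independent terms dropped; illustrative units, k_B = 1).

def S_distinguishable(N, V):
    return N * math.log(V)                      # S = k ln Z^N

def S_indistinguishable(N, V):
    # Correct Boltzmann counting: Z_N = Z^N / N!
    return N * math.log(V) - math.lgamma(N + 1)

# Double the system (N -> 2N, V -> 2V); an extensive S should double.
N, V = 1000, 1000.0
gap_bad = S_distinguishable(2 * N, 2 * V) - 2 * S_distinguishable(N, V)
gap_good = S_indistinguishable(2 * N, 2 * V) - 2 * S_indistinguishable(N, V)
print(gap_bad)    # ~ 2N ln 2: the spurious mixing entropy
print(gap_good)   # small, negligible per particle
```

The first gap grows with N (the paradox); the second is only a Stirling-correction remnant that vanishes per particle in the thermodynamic limit.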
One of the most fundamental equations in chemical thermodynamics states: $$ \Delta_rH_m^⦵ = \Delta_rG_m^⦵ + T \Delta_rS_m^⦵ $$
If we look at this equation in the context of the net chemical reaction in an electrolytic or galvanic cell, it is usually interpreted as follows: the enthalpy of reaction denotes...
Hey guys! I'm currently struggling with a specific thermodynamics problem.
I'm given the entropy of a system (where ##A## is a constant with appropriate physical units): $$S(U,V,N)=A(UVN)^{1/3}$$ I'm asked to calculate the specific heat capacity at constant pressure ##C_p## and at constant volume...
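For the ##C_V## part, one route is to invert ##1/T = \partial S/\partial U = (A/3)(VN)^{1/3}U^{-2/3}## to get ##U(T) = (AT/3)^{3/2}(VN)^{1/2}##, so ##C_V = \partial U/\partial T = (3/2)\,U/T##. A numerical cross-check of that step (the constants A, V, N, T are arbitrary assumed values; this is not the full problem, which also asks for ##C_p##):

```python
# Check C_V for S(U,V,N) = A (U V N)^{1/3}:
# 1/T = dS/dU = (A/3)(VN)^{1/3} U^{-2/3}  =>  U(T) = (A T / 3)^{3/2} (V N)^{1/2}

A = 2.0            # assumed constant
V, N = 3.0, 5.0    # assumed values

def U_of_T(T):
    return (A * T / 3.0) ** 1.5 * (V * N) ** 0.5

T = 7.0
h = 1e-6
C_V_numeric = (U_of_T(T + h) - U_of_T(T - h)) / (2 * h)   # dU/dT at fixed V, N
C_V_closed = 1.5 * U_of_T(T) / T                          # C_V = (3/2) U / T
print(abs(C_V_numeric - C_V_closed) / C_V_closed < 1e-6)  # True
```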
So what I did was find the change in Q per min.
Mass melted per min * latent heat = Q per min = 11.5 kg/min * 3.4*10^5 J/kg = 3910000 J/min
Now the equilibrium temperature is 100 degrees Celsius, or 373.15 K.
If I do 3910000 J/min / 373.15 K, I get 10478 J/(K*min).
This...
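The arithmetic in the steps above can be double-checked with a short script (values as given above):

```python
m_dot = 11.5       # kg/min of mass melted
L_heat = 3.4e5     # J/kg latent heat
T_eq = 373.15      # K (100 degrees Celsius)

Q_dot = m_dot * L_heat    # heat per minute, J/min
S_dot = Q_dot / T_eq      # entropy rate, J/(K*min)
print(Q_dot)              # 3910000.0
print(round(S_dot))       # 10478
```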
In the industry, coal and other fuels are typically represented by
- their C, H, O, N, S elemental composition for the combustible part
- the composition of the ashes (SiO2, Al2O3, Fe2O3, ...)
- the Lower Heating Value (LHV), the heat that can be extracted from the combustion products
With...
Context
Boltzmann first defined his entropy as S = k log(W). This seems to be pretty consistently taught. However, the exact definitions of S & W seem to vary slightly.
Some say S is the entropy of a macrostate, while others describe it as the entropy for the system. Where the definition of...
Hi all,
First post here. I'm a casual physics enthusiast, but I've been reading and thinking a lot about this topic lately.
The thing I'm most interested in is the fact that black hole formation involves the simultaneous limits of two things: time dilation and the information bound. I find it...
It looks very easy at first glance. However, the variable S appears inside the given expression itself. I have no clue how to relate the partial derivatives to the entropy and the number of particles.
Is the purpose of the 0th, 1st & 2nd Laws of Thermodynamics simply to legitimate the thermodynamic properties of Temperature, Internal Energy & Entropy, respectively?
It seems that all these laws really do is establish that these properties are valid thermodynamic state properties and the...
In our thermodynamics book, it states that a process that is internally reversible and adiabatic has to be isentropic, but that an isentropic process doesn't have to be reversible and adiabatic. I don't really understand this. I always thought isentropic and reversible meant the same thing...
I've calculated the change in the entropy of material after it comes in contact with the reservoir:
$$\Delta S_1 = C \int_{T_i+t\Delta T}^{T_i+(t+1)\Delta T} \frac{dT}{T} = C \ln{\frac{T_i+(t+1)\Delta T}{T_i+t\Delta T}}$$
Now I would like to calculate the change in the entropy of the...
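Assuming the intended setup is a sequence of M reservoirs stepping the material from ##T_i## to ##T_f## in equal increments ##\Delta T## (an assumption; the numbers below are made up), the per-step logarithms above telescope, and the reservoirs' entropy change can be summed the same way:

```python
import math

C = 1.0                 # assumed heat capacity
T_i, T_f = 300.0, 400.0 # assumed endpoint temperatures
M = 100                 # number of reservoir steps
dT = (T_f - T_i) / M

# Entropy change of the material: the per-step logs telescope to C ln(T_f/T_i).
dS_material = sum(C * math.log((T_i + (t + 1) * dT) / (T_i + t * dT))
                  for t in range(M))
print(abs(dS_material - C * math.log(T_f / T_i)) < 1e-9)   # True

# Entropy change of the reservoirs: step t's reservoir, at T_i + (t+1) dT,
# gives up heat C dT.
dS_reservoirs = sum(-C * dT / (T_i + (t + 1) * dT) for t in range(M))
print(dS_material + dS_reservoirs > 0)   # True: total entropy rises
```

The total is positive for any finite M and tends to zero as M grows, which is the usual picture of quasi-static heating becoming reversible.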
We know that
$$dU=\delta Q + \delta W$$
$$dU = TdS - pdV$$
So from this:
$$dS = \frac{1}{T}dU + \frac{1}{T}pdV \ (*)$$
For an ideal gas:
$$dU = \frac{3}{2}nR\,dT$$
Plugging that into (*) and also using ##p=\frac{nRT}{V}## we get:
$$S = \frac{3}{2}nR \int^{T_2}_{T_1} \frac{1}{T}\,dT + ...$$
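Carrying out both integrals gives the standard ##\Delta S = \frac{3}{2}nR\ln(T_2/T_1) + nR\ln(V_2/V_1)##. A quick numerical check (endpoints are assumed values, per-mole units):

```python
import math

n = 1.0               # mol (assumed)
R = 8.314             # J/(mol*K)
T1, T2 = 300.0, 600.0 # assumed temperatures, K
V1, V2 = 1.0, 2.0     # assumed volumes, m^3

# dS = (3/2) n R dT/T + n R dV/V integrates to:
dS = 1.5 * n * R * math.log(T2 / T1) + n * R * math.log(V2 / V1)
print(dS)

# Crude midpoint-rule check of the temperature term
steps = 100000
dT = (T2 - T1) / steps
num = sum(1.5 * n * R * dT / (T1 + (i + 0.5) * dT) for i in range(steps))
print(abs(num - 1.5 * n * R * math.log(T2 / T1)) < 1e-3)   # True
```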
I have been able to get everything except entropy. I know it's not zero. I know I have to find a reversible path to calculate it, but keep coming up with strange values so I don't think I'm doing it correctly.
Can I do Cp dT/T + Cv dT/T = dS? I am having trouble calculating my P2 (I know my final...
I have used the Lagrange multiplier method. I set up the Lagrangian with the constraint that ##\sum_{x} p(x) = 1##
So I have:
##L(p,\lambda) = - \sum_{x} p(x)\log_{2}p(x) - \lambda\left(\sum_{x} p(x) - 1\right)##
I am now supposed to take the partial derivatives with respect...
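Setting those derivatives to zero gives ##-\log_2 p(x) - 1/\ln 2 - \lambda = 0##, the same value of ##p(x)## for every ##x##: the uniform distribution. A numerical spot-check (the alphabet size n = 4 is an assumed example):

```python
import math
import random

def H(p):
    """Shannon entropy in bits."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# The stationarity condition forces p(x) to be constant, i.e. uniform.
n = 4
H_max = H([1.0 / n] * n)    # log2(4) = 2 bits
print(H_max)                # 2.0

# No random distribution on 4 outcomes beats the uniform one.
random.seed(1)
for _ in range(1000):
    w = [random.random() for _ in range(n)]
    total = sum(w)
    assert H([x / total for x in w]) <= H_max + 1e-12
print("ok")
```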
I understand entropy as a movement from order to less order and that the universe's entropy is increasing.
So I was wondering, life takes molecules and organizes them into organisms, so isn't life a reversal of entropy?
When trying to describe why the entropy goes up for an irreversible process, such as a gas expanding into a vacuum, it seems fairly easy at a high level. The valve between the two chambers opens, the free expansion occurs, the pressure drops in proportion to the volume change, and the temp remains...
I'm reading Brian Greene's latest book 'Until the End of Time' (I'll pause here while you finish groaning at yet another layperson reading popular physics books). In it, he describes entropy in a way I've never heard before, and it clarifies something that's always stuck in my craw about...
Hi, I'm new to PF and not really sure which forum may be the most appropriate to find people with an interest in probability and entropy. But the title of this forum looks promising. If you share an interest in this topic, I would be delighted to hear from you.
Thinking of the common-language notion of "entropy" as "uncertainty", how can running a simulation based on a probability model exhibit increasing entropy? After all, the simulation picks definite outcomes to happen, so (intuitively) there is less uncertainty about the future as definite...
Question
---
So I've done a calculation which seems to suggest the following: take the combined system of a measuring apparatus and an experimenter who "reacts" to the outcome of the measurement, versus one who does not. Then the change in entropy in these two situations is bounded by:
$$\Delta S_R...$$
I thought that would be something like, using counting similar to the Einstein solid, ##S = k\ln\frac{(q+N-1)!}{q!(N-1)!}##
where ##q## is ##h\nu##
(##\nu## is the frequency)
But the results are not similar, so I am a little stuck.
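The Einstein-solid count quoted above can be evaluated without overflowing factorials by using log-gamma; a small sketch (the sample values of q and N are made up):

```python
import math

def einstein_entropy(q, N):
    """S/k = ln(Omega), Omega = (q+N-1)! / (q! (N-1)!).
    lgamma(n+1) = ln(n!) keeps large arguments tractable."""
    return math.lgamma(q + N) - math.lgamma(q + 1) - math.lgamma(N)

# Small case against direct counting: q = 3 quanta in N = 2 oscillators
# can be split 3+0, 2+1, 1+2, 0+3, so Omega = 4.
print(round(math.exp(einstein_entropy(3, 2))))   # 4
print(einstein_entropy(100, 50))                 # S/k for a larger solid
```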
I am struggling to understand Callen's explanation of stability. I understand that S(U) must be concave, because otherwise we can show that the temperature increases as the internal energy decreases (dT/dU < 0), but I cannot understand equation (8.1), which...
Not sure whether this is the right category, but bear with me. I've seen graphics where the increase of information over evolutionary time is projected, like the one in the link below from Carl Sagan's book
https://www.researchgate.net/figure/Page-from-the-book-of-Carl-Sagan-21_fig2_322153131
Now I...
The following paper appeared earlier this year on arxiv, entitled "Islands in Schwarzschild Black Holes":
https://arxiv.org/pdf/2004.05863.pdf
First, a bit of background: this paper appears to be part of a larger research effort aimed at resolving the black hole information paradox by showing...
In special relativity, observers can disagree on the order of events - if Alice thinks events A, B and C are simultaneous, Bob can think A happened before B which happened before C, and Carlos thinks C happened before B which happened before A - provided A, B and C are not causally connected, of...
I consider a pair of two-level particles which can be up or down. This pair is described in the
tensor product space by the unit vector ##(\cos\theta\,(du + ud) + \sin\theta\,(dd + uu))/\sqrt{2}##.
I take its density matrix, trace out one of the two particles, and find the density matrix
of each one...
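Reading the truncated last term as ##dd + uu## (an assumption, but the one normalization requires), the reduced density matrix of either particle works out to ##\tfrac{1}{2}\begin{pmatrix}1 & s\\ s & 1\end{pmatrix}## with ##s = \sin 2\theta##, eigenvalues ##(1 \pm s)/2##. A sketch of the resulting entanglement entropy:

```python
import math

def entanglement_entropy(theta):
    """Von Neumann entropy of one qubit of
    (cos t (|du> + |ud>) + sin t (|dd> + |uu>)) / sqrt(2).
    Tracing out the other qubit gives rho = [[1, s], [s, 1]] / 2,
    s = sin(2 t), with eigenvalues (1 +/- s)/2."""
    s = math.sin(2 * theta)
    eigs = [(1 + s) / 2, (1 - s) / 2]
    return -sum(p * math.log(p) for p in eigs if p > 0)

print(entanglement_entropy(0.0))                # ~0.693 = ln 2, maximal
print(abs(entanglement_entropy(math.pi / 4)))   # 0.0: a product state
```

At theta = 0 the state is Bell-like (maximally entangled); at theta = pi/4 it factorizes into a product state, so the entropy vanishes.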
The conclusion of my attempt, listed below, is that entropies do exist for both gases, but I am not sure.
$$dU=TdS-pdV$$
$$dS=\frac{dU}{T}+\frac{p}{T}dV$$
Therefore, gas A:
$$S=\frac{{\Delta}U}{T}+\alpha_A(\frac{-N}{{\Delta}V})$$
Gas B...
So I've answered the first question and I got a final temp of 42.06 Celsius.
Now for this second one, I don't know why I am getting it wrong:
I'm doing 0.215*ln(315.06/291.46) + 1*ln(315.06/319.91),
but it says I am wrong. What about my process is faulty?
(not a paradox nowadays, but it was an issue for years)
https://en.m.wikipedia.org/wiki/Gibbs_paradox
It's not a question about a formula. I don't understand the physical motivation for calling Gibbs mixing a "paradox", with its discontinuity point. What bothers the physicist enough to ask for a continuous...
Maxwell's demon measures the position and velocity of a particle. How can it do that when this violates the uncertainty principle? Does that mean the uncertainty principle is unavoidable, since otherwise we would violate the second law of thermodynamics, as in the case of Maxwell's demon?
Actually I am trying to see what the first equation for the entropy means; maybe ##N_1## refers to part 1 of the system (say, the left side), or to molecules of type 1?
I am not sure about the equations I will do below; they are probably wrong, but anyway:
∂S/∂U1 = 1/T1 = 3NR/2U1
Okay, it will give...
There are five physically possible entropy functions and five which can't be real; find them all.
I could find only four of the impossible ones; what is the other?
B, H and J are not extensive:
S(λU,λV,λN) ≠ λS(U,V,N)
D:
∂S/∂U < 0
What is the other one?
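The extensivity test quoted above is easy to automate; a generic sketch (the two candidate functions are illustrative examples I made up, not the actual lettered candidates from the problem):

```python
def extensive(S, U=2.0, V=3.0, N=5.0, lam=7.0, tol=1e-9):
    """Check S(lam*U, lam*V, lam*N) == lam * S(U, V, N) at sample values."""
    return abs(S(lam * U, lam * V, lam * N) - lam * S(U, V, N)) < tol

S_good = lambda U, V, N: (U * V * N) ** (1.0 / 3.0)  # homogeneous of degree 1
S_bad = lambda U, V, N: (U * V * N) ** 0.5           # scales as lam**1.5

print(extensive(S_good), extensive(S_bad))   # True False
```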
We need to find the system's entropy variation.
I don't think I understood very well what is happening in this process; can someone help me figure out how to start?
I came upon a realization recently.
The early universe is always described to have begun in a state of extremely low entropy and it's been increasing ever since.
But the same amount of stuff exists now as back then. The only thing that's changed is how big the universe is now versus then.
So...
Sketch a qualitatively accurate graph of the entropy of a substance
as a function of temperature, at fixed pressure. Indicate where the
substance is solid, liquid, and gas. Explain each feature of the graph briefly.
What do you think about this?:
dU = -P*dV + T*dS (1)
V = C*T => dV = C*dT
Nfk*dT/2 +...
Does increasing the size of a solid body increase its entropy? I was thinking about it using the Einstein model of a solid.
S = k*lnΩ
Ω = (q+n-1)!/((q)!(n-1)!)
I am not sure how this question should be answered; I think if we talk about rigid bodies the question doesn't even make sense, but about...
Sorry for being fuzzy here, I started reading a small paper and I am a bit confused. These are some loose notes without sources or refs.
Say we start with a collection F of features we want to trim into a smaller set F' of features through information gain and entropy (where we are using the...
I read in a book "Quantum Space" by Jim Baggot, page 290, that the entropy of an object is inversely proportional to its temperature. (He was describing the temperature of a black hole. Does this statement only apply to black holes?) No doubt he is correct, but wouldn't an increase of energy...
How did you find PF?: random Brownian motion
Is randomness real or is it simply defined as such due to our inability to perceive hyper complex order? Randomness is a troublesome word. I'd feel better if I knew it was an objective phenomenon and not merely a placeholder description of...
As a systems engineer I have thought a lot about entropy in trying to get a better intuitive sense for what it is, at a more macro level than it is usually discussed. I have some ideas and am looking for a forum to present and explore them with others. I wish to discuss more from an...
I am a biology undergraduate interested in abiogenesis.
The entropic explanation for the origin of life is that life is allowed to exist because it increases universal entropy.
I am curious about how far we can take this theory down.
How can you explain the emergence of atoms and atomic...
If we know the molecular structure of a complex (organic) chemical, can we calculate the dissociation energy for each and every bond somehow? Also, can we calculate the standard entropy of the same molecule? This information would be needed to calculate the Gibbs free energy for reactions of a...
Before this problem, there was another one: "What is the change in temperature of a van der Waals gas in free expansion?"
I solved it.
It was
C_V dT = -aN^2/V^2 dV
Then, I got
T = T0 - aN^2/(2VC_V)
So I knew that the temperature is decreased by free expansion in an adiabatic process.
Then I...
I'm not sure about my proof, so please check my steps. I used log to mean the natural log (ln).
In particular, I'm not sure about "d/dt = dρ/dt d/dρ = i/ħ [ρ, H] d/dρ" in the second line. Can a matrix be differentiated with respect to another matrix? (d/dρ (dρ lnρ))
Hello everyone, I'm studying thermodynamics and I would like to better understand the meaning of entropy and how to calculate it.
I know that if A and B are two possible states of a system, the equation which defines the variation of entropy from A to B is...
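That defining relation is ##\Delta S_{A\to B} = \int_A^B \delta Q_{rev}/T##. A minimal numeric example (assuming a constant heat capacity C along a reversible heating path, with made-up values):

```python
import math

C = 10.0                # assumed constant heat capacity, J/K
T_A, T_B = 300.0, 350.0 # assumed endpoint temperatures, K

# With dQ_rev = C dT, the integral has the closed form C ln(T_B/T_A).
dS_exact = C * math.log(T_B / T_A)

# Numerical integral of dQ_rev / T along the same path
steps = 100000
dT = (T_B - T_A) / steps
dS_num = sum(C * dT / (T_A + (i + 0.5) * dT) for i in range(steps))
print(abs(dS_exact - dS_num) < 1e-6)   # True
```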