Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunication.
The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 under the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862 Clausius interpreted the concept as meaning disgregation.
A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of an isolated system left to spontaneous evolution cannot decrease with time, as it always tends toward a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, to the macroscopically observable behavior, in form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
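Boltzmann's logarithmic law can be made concrete with a toy calculation; the sketch below (a hypothetical four-particle, two-state system) is purely illustrative:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI defining value)

# Toy system: 4 distinguishable two-state particles.
# The macrostate "exactly 2 are excited" is realized by W = C(4, 2) microstates.
W = math.comb(4, 2)

# Boltzmann's logarithmic law: S = k_B ln W
S = k_B * math.log(W)
print(W, S)
```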
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. At John von Neumann's suggestion, Shannon named this measure of missing information entropy, in analogy with its use in statistical mechanics, and thereby gave birth to the field of information theory. This description has since been proposed as a universal definition of the concept of entropy.
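Shannon's measure is easy to compute directly; a minimal sketch (the function name and example distributions are illustrative choices):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum p_i log2 p_i."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of uncertainty per toss;
# a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))
```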
Homework Statement
Hey guys,
So I have this equation for the entropy of a classical harmonic oscillator:
\frac{S}{k}=N[\frac{Tf'(T)}{f(T)}-\log z]-\log (1-zf(T))
where z=e^{\frac{\mu}{kT}} is the fugacity, and f(T)=\frac{kT}{\hbar \omega}.
I have to show that, "in the limit of...
Homework Statement
Hey guys,
Here's the question. For a distinguishable set of particles, given that the single particle partition function is Z_{1}=f(T) and the N-particle partition function is related to the single particle partition function by Z_{N}=(Z_{1})^{N} find the following...
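For context, in the canonical ensemble the relation Z_{N}=(Z_{1})^{N} fixes the entropy directly; a sketch, assuming only the given Z_{1}=f(T) (the fugacity terms quoted in the previous thread belong to the grand-canonical treatment):

$$F = -kT\ln Z_{N} = -NkT\ln f(T), \qquad S = -\frac{\partial F}{\partial T} = Nk\left[\ln f(T) + \frac{T f'(T)}{f(T)}\right]$$

Note that the Tf'(T)/f(T) term matches the first term of the entropy quoted in the harmonic-oscillator thread above.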
There is a container of water (state 1) which is being stirred, and there is a temperature rise (state 2) due to the stirring. It is required to find the change in entropy of the system if the process is reversible. Since there is no heat transfer, there would be no change in entropy due to...
It is well-known that with known marginal probabilities a_{i} and
b_{j}, the joint probability distribution maximizing the entropy
H(P)=-\sum_{i=1}^{m}\sum_{j=1}^{n}p_{ij}\log{}p_{ij}
is p_{ij}=a_{i}b_{j}
For m=3 and n=3, a=(0.2,0.3,0.5), b=(0.1,0.6,0.3), for example,
\begin{equation}...
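Using the example marginals above, one can check numerically that the product distribution's entropy equals the sum of the marginal entropies (a sketch; the helper H is an illustrative name, natural logarithms assumed):

```python
import math

def H(ps):
    # Shannon entropy (natural log): H = -sum p log p
    return -sum(p * math.log(p) for p in ps if p > 0)

a = [0.2, 0.3, 0.5]
b = [0.1, 0.6, 0.3]

# Product (independence) joint distribution p_ij = a_i * b_j
p = [ai * bj for ai in a for bj in b]

# For the independence joint, H(P) = H(a) + H(b)
print(H(p), H(a) + H(b))
```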
Homework Statement
Ten kmol per hour of air is throttled from upstream conditions of 25°C
and 10 bar to a downstream pressure of 1.2 bar. Assume air to be an ideal gas with Cp= (7/2)R.
(a) What is the downstream temperature?
(b) What is the entropy change of the air in J mol-1 K-1?
(c) What...
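For an ideal gas, throttling is isenthalpic, so the temperature is unchanged and only the pressure term contributes to the entropy change; a sketch of parts (a) and (b) under that assumption:

```python
import math

R = 8.314  # J mol^-1 K^-1

T_up = 25 + 273.15        # K
P_up, P_down = 10.0, 1.2  # bar

# (a) Ideal-gas throttling is isenthalpic, and h depends only on T,
# so the downstream temperature equals the upstream temperature.
T_down = T_up

# (b) Isothermal ideal-gas entropy change per mole: dS = -R ln(P2/P1)
dS = -R * math.log(P_down / P_up)
print(T_down, dS)  # dS ≈ +17.6 J mol^-1 K^-1
```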
The 2nd law of thermodynamics states that entropy increases with time, and entropy is just a measure of how hard it is to distinguish one state from another (the information-theoretic view) or how hard it is to find order within a system (the thermodynamic view). There are many ways to view entropy...
To preface my question, I know it is related to the Gibbs paradox, but I've read the wikipedia page on it and am still confused about how to resolve the question in the particular form I state below.
Suppose a completely isolated ideal gas consisting of identical particles is confined to a...
I've always been slightly confused by the Second Law of Thermo. For example, with Maxwell's Demon, where a demon controls the partition between two gas chambers to select all the fast moving particles into one chamber, the Second Law is not violated because the demon's actions and thought...
Homework Statement
A compressor processes 1.5 kg/min of air at ambient conditions (1 bar and 20 °C). The compressed air leaves at 10 bar and 90 °C. It is estimated that the heat losses through the walls of the compressor amount to 25 kJ/min. Calculate:
a) The power of the compressor
b) The...
1. A photon that emerges when an electron jumps one orbital down -- will have a fixed energy
...i.e. the difference between the (potential) energies of the orbitals.
However a "free/unbound" photon can have any energy level.
Is that correct?
2. What is the lowest level of energy a...
Hi,
Under standard conditions, why does water have higher entropy than helium? Isn't helium a gas? I understand that water has more atoms, but it seems more ordered and is a liquid. I'm not sure how a qualitative analysis could lead to the conclusive result that water is higher in entropy...
Homework Statement
One end of a metal rod is in contact with a thermal reservoir at 695K, and the other end is in contact with a thermal reservoir at 113K. The rod and reservoirs make up an isolated system. 7190J are conducted from one end of the rod to the other uniformly (no change in...
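Assuming the question asks for the entropy change of the two reservoirs (the rod's own state, and hence its entropy, is unchanged in steady conduction), a quick sketch:

```python
Q = 7190.0      # J conducted through the rod
T_hot = 695.0   # K
T_cold = 113.0  # K

# The hot reservoir loses Q at T_hot; the cold reservoir gains Q at T_cold.
dS_hot = -Q / T_hot
dS_cold = +Q / T_cold
dS_total = dS_hot + dS_cold
print(dS_total)  # ≈ +53.3 J/K, positive as the second law requires
```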
Homework Statement
Calculate the variation of entropy in the following processes:
a) Heating of 18 kg of water from 15 to 40 °C at ambient pressure.
b) Compression of 9 kg of water from ambient pressure to 7 atm at a temperature of 15 °C.
Homework Equations
ΔS=Cp*ln(T_final/T_initial)...
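Part (a) can be sketched with that equation, assuming a constant specific heat of about 4186 J kg-1 K-1 for liquid water (a value not given in the problem):

```python
import math

m = 18.0        # kg of water
c_p = 4186.0    # J kg^-1 K^-1, assumed specific heat of liquid water
T1 = 15 + 273.15  # K
T2 = 40 + 273.15  # K

# Constant-pressure heating: dS = m c_p ln(T2/T1)
dS = m * c_p * math.log(T2 / T1)
print(dS)  # ≈ 6.3 kJ/K
```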
For a thermodynamic system there exists a function called entropy S(U,N,V) etc.
We then define for instance temperature as:
1/T = ∂S/∂U
μ = ∂S/∂N
etc.
When taking these partial derivatives, it is understood that we only take the derivative of S with respect to the explicit dependence on U, N, etc., right? Because...
If we have a qubit in a mixed state, say (1/2)|+><+| + (1/2)|-><-|, and we measure it, is the result then a pure state |+> or |->? If this is the case, then the entropy of the system decreases.
Now the question another way round is :
Suppose we measure a quantum system without gaining...
Hello, I am trying to understand a short literature article (doi: 10.1021/ja01635a030). I am not sure how much liberty I have to reproduce its contents here, and I can't explain it here because I don't understand it -- which is why I have this question.
I believe it is proposing that a...
I'm interested in the ultimate fate of the universe. And it seems that the most prevalent theory is the Big Freeze.
From what I can gather the BF is caused by dark energy making the universe expand to the point that stars can no longer form, resulting in cold dark space filling the universe...
Hello Community,
I have a question that I'm struggling to get clarification on and I would greatly appreciate your thoughts.
Big bang theories describe an extremely low thermodynamic entropy (S) state of origin (very ordered).
Question: Is the big bang considered to be a high or low Shannon...
Homework Statement
Considering entropy as a function of temperature and volume and using the Maxwell relation;
$$ \left(\frac{\partial S}{\partial V}\right)_T = \left(\frac{\partial p}{\partial T}\right)_V$$
Show that the entropy of a vessel is given by;
$$ S= R...
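One possible route, assuming the vessel contains one mole of an ideal gas with pV = RT (an assumption, since the problem statement is truncated above):

$$\left(\frac{\partial p}{\partial T}\right)_V = \frac{R}{V} \;\Rightarrow\; \left(\frac{\partial S}{\partial V}\right)_T = \frac{R}{V} \;\Rightarrow\; S = R\ln V + g(T)$$

where the integration function g(T) is fixed by the heat capacity.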
Hello everyone,
I've been reviewing some concepts on Thermodynamics and, even though I feel like I am gaining a level of comprehension about the subject that I could not have achieved before as an undergraduate, I am also running into some situations in which some thermodynamic concepts seem to...
Hi everyone!
I have a little problem for an upcoming exams, and I think I need just small hints to solve it.
My problem is that I have to write about ten/fifteen pages about SUPERDENSE CODING and QUANTUM CRYPTOGRAPHY, and my professor has taken for granted that these are strongly linked to...
Is entropy a measure of "disorder"?
In textbooks I never saw a definition of entropy given in terms of a "measure of disorder", and am wondering where this idea comes from? Clausius defined it as an "equivalent measure of change", but I do not see the relation with a concept of "order" or...
Hi All,
I am not sure if this is the right section to post this question but it does involve probability..so please redirect me if necessary.
I am currently looking at the Robinson et al. (2013) paper on rank vector entropy in MEG (doi: 10.3389/fncom.2012.00101). Due to my lack of...
Okay, I am considering a cycle where the working fluid is an ideal gas with heat capacities Cv and Cp. The cycle consists of: an isochoric pressure increase, an adiabatic expansion back to the initial pressure, and an isobaric compression back to the initial conditions.
Questions:
-
q1) I am asked to...
Normally due to H+ being the reference state in solution, all 'standard molar' state variables and 'standard value of formation' state variables are 0 for it. But H2(g) has a standard enthalpy of formation = 0 and standard molar entropy of 115 Jmol-1K-1. Then shouldn't ΔG°(298) for the reaction...
This article insinuates that the physics community had forgotten Gibbs entropy long ago and has used Boltzmann entropy since. Isn't this nonsense? For me it was always clear that Boltzmann entropy is problematic...
Greetings folks,
I'm new to this forum and to physics in general so apologies if I come off like a greenhorn or if I am posting these questions in the wrong place. I have an Arts background and have never really "gotten" science, but my interest in post-Enlightenment philosophy has led me to a...
From what I've been taught, the entropy of a system is the number of microstates its macrostate can have.
A microstate refers to the configuration of a system on a microscopic level (energy of each particle, location of each particle), a macrostate refers to the external parameters of that system...
In my thermo class, when we were formalizing the definition of temperature (\frac{1}{T}=\frac{∂S}{∂U}), we drew out all the combinations of the various slopes and concavities of the S-vs-U graphs.
http://imgur.com/cR4V8K8
The shape of this graph, I figure, should be a reflection of the...
Homework Statement
A system of N distinguishable particles is arranged such that each particle can exist in one of the two states: one has energy \epsilon_{1}, the other has energy \epsilon_{2}. The populations of these states are n_{1} and n_{2} respectively, (N = n_{1}+n_{2}). The system is...
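The multiplicity behind such a two-level system is a binomial coefficient; a sketch with illustrative numbers (N and n1 below are assumptions, not the problem's values):

```python
import math

N = 100   # illustrative particle number
n1 = 30   # population of the state with energy epsilon_1
n2 = N - n1

# Number of ways to choose which n1 of the N distinguishable
# particles occupy state 1:
W = math.comb(N, n1)

# Boltzmann entropy in units of k: S/k = ln W = ln N! - ln n1! - ln n2!
S_over_k = math.log(W)
print(S_over_k)
```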
Good night. Maybe this is a dumb question, but: what was the entropy before the Big Bang?
1. I know entropy increases or remains constant, so before the Big Bang, were the entropy, and the existence of time and space, constant?
2. Why is the entropy in our universe not constant but increasing...
To encode a symbol in binary form I need 3 bits, and I have 6 symbols.
So I need 6*3 = 18 bits to encode "We are" in binary form, as shown in http://www.shannonentropy.netmark.pl/calculate
My question: 3 bits to encode one then I have to use 16 bits, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _.
How to...
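The two counts can be compared directly; a sketch, assuming a fixed-length code over the distinct symbols of the message versus the Shannon entropy lower bound:

```python
import math
from collections import Counter

msg = "We are"
counts = Counter(msg)  # 5 distinct symbols; 'e' occurs twice
n = len(msg)           # 6 symbols in total

# Fixed-length code: ceil(log2(alphabet size)) bits per symbol
bits_per_symbol = math.ceil(math.log2(len(counts)))
fixed_total = bits_per_symbol * n

# Shannon entropy of the empirical distribution: the lower bound
# in bits/symbol achievable by an ideal code on long messages
H = -sum((c / n) * math.log2(c / n) for c in counts.values())

print(fixed_total)  # 18 bits with a 3-bit fixed-length code
print(H * n)        # entropy bound, fewer than 18 bits
```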
Homework Statement
When 1.70 × 10^5 J of heat enters a cherry pie initially at 20.0 °C, its entropy increases by 470 J/K. What is its final temperature?
Homework Equations
The Attempt at a Solution
I have no clue
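One possible route, assuming a constant heat capacity C (not stated in the problem): combining Q = C(Tf − Ti) with ΔS = C ln(Tf/Ti) eliminates C, leaving a single equation for Tf that can be solved numerically:

```python
import math

Q = 1.70e5          # J of heat added
dS = 470.0          # J/K entropy increase
Ti = 20.0 + 273.15  # K

# With constant C:  Q = C (Tf - Ti)  and  dS = C ln(Tf/Ti),
# so (Tf - Ti) / ln(Tf/Ti) = Q / dS.  Solve for Tf by bisection.
target = Q / dS

def f(Tf):
    return (Tf - Ti) / math.log(Tf / Ti) - target

lo, hi = Ti + 1e-6, 2000.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if f(lo) * f(mid) <= 0:
        hi = mid
    else:
        lo = mid

Tf = 0.5 * (lo + hi)
print(Tf)  # ≈ 440 K (about 167 °C)
```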
Homework Statement
Consider two fixed volume bricks of mass m1=2kg and m2=1kg with initial temperatures T1=400K and T2=100K. They are enclosed in a system thermally isolated from the surroundings and are made from a material with a heat capacity cv = 1kJ/kg/K.
A) In Process 1, the bricks are...
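Process 1 presumably brings the bricks into direct thermal contact; a sketch under that assumption:

```python
import math

m1, m2 = 2.0, 1.0      # kg
T1, T2 = 400.0, 100.0  # K
cv = 1000.0            # J kg^-1 K^-1

# Energy balance (equal cv cancels): Tf = (m1 T1 + m2 T2) / (m1 + m2)
Tf = (m1 * T1 + m2 * T2) / (m1 + m2)  # 300 K

# Entropy change of each brick: dS = m cv ln(Tf/Ti)
dS = m1 * cv * math.log(Tf / T1) + m2 * cv * math.log(Tf / T2)
print(Tf, dS)  # dS ≈ +523 J/K: the direct contact is irreversible
```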
I am a general chemistry student and I find thermodynamics fascinating. However, I have a hard time visualizing entropy. Can somebody please explain how an increase in entropy can make a process that is endothermic spontaneous? The typical demonstration of entropy that I have seen is on in...
So from the first law for a closed system,
dU=dQ-dW=dQ-PdV
From the second law,
dS = dQ/T + Sgen (i.e. the entropy generated, with Sgen ≥ 0)
Substituting the expression for dQ from the second law into the first law,
dU = T*dS - T*Sgen - P*dV
If S and V are constant,
dU = -T*Sgen
Hence dU ≤ 0
This is a...
Homework Statement
45 g of H2O(g) is condensed at 100 degrees C, the H2O(l) is cooled to 0 degrees C and then frozen to solid H2O. Find the change in entropy.
H2O(l): 4.2 J K-1g-1
vaporization at 100 degrees C: 2258 J g-1;
fusion at 0 degrees C: 334 J g-1
Homework Equations
dS = dq/T
The...
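With the data given, the three steps can be summed directly (temperatures in kelvin; each step rejects heat, so every contribution is negative):

```python
import math

m = 45.0        # g of H2O
c_liq = 4.2     # J K^-1 g^-1 (given)
L_vap = 2258.0  # J g^-1 at 100 C (given)
L_fus = 334.0   # J g^-1 at 0 C (given)

T_boil = 373.15  # K
T_fus = 273.15   # K

# Condensation at 100 C: heat leaves at constant T, dS = -m L_vap / T_boil
dS1 = -m * L_vap / T_boil
# Cooling the liquid from 100 C to 0 C: dS = m c ln(T_fus / T_boil)
dS2 = m * c_liq * math.log(T_fus / T_boil)
# Freezing at 0 C: dS = -m L_fus / T_fus
dS3 = -m * L_fus / T_fus

print(dS1 + dS2 + dS3)  # ≈ -386 J/K
```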
∫dQ/T ≤ ∫dQ(rev)/T *, where both integrals are evaluated between the same thermodynamic coordinates, say A and B.
- I am having trouble interpreting this inequality.
- (I understand the derivation in my textbook via the Clausius diagram, considering a reversible and an irreversible process...
I posted this question a couple days back, but it got removed because it looked like a homework question (which, I suppose, is flattering, since I came up with it on the way home from work, and I'm not even a student, let alone a teacher)...so I'm going to try to rephrase it -- but because this...
Hi all, could somebody look over my answer please? I pulled the equation I used off the internet but can't remember where, so I'm not sure what it's called.
I took a picture of my answer as I thought it would be easier to read than fiddling with symbols here.
QUESTION
ANSWER ATTEMPT...
Hi all, could somebody have a look over my answers for this question please? The value I got for the second part seems quite feeble.
Homework Statement
The Attempt at a Solution
Part a)
Key
m = mass
cs = specific heat bronze
cm = specific heat molten bronze
Tfus = melting...
I think I actually have solved it. I was right with the PV=nkT, I believe I previously messed up with the algebra.
Homework Statement
Using the same method as in the text, calculate the entropy of mixing for a system of two monatomic ideal gases, A and B, whose relative proportion is...
Fractals are just many iterations of a very basic formula, so they can be described with little information, and yet they are extremely complex given enough iterations.
Can they be described as low entropy despite their complexity?
In statistical mechanics the macrostate of a system corresponds to a whole region in the microscopic phase space of that same system; classically, that means that an infinity of microstates relate to a single macrostate. Similarly, given a Hamiltonian, a whole surface in the microscopic...
Homework Statement
So as a part of a very long question on homework for thermodynamics, I and some of my friends got into a debate about how to write an expression for entropy at a particular state.
Homework Equations
delS(state 1 to 2)= Cv*ln(T2/T1)+R*ln(V2/V1)
(from constitutive...
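That constitutive relation can be wrapped in a small helper; a sketch (molar basis, function name illustrative):

```python
import math

R = 8.314  # J mol^-1 K^-1

def delta_s(T1, T2, V1, V2, cv):
    """Molar entropy change of an ideal gas between states 1 and 2:
    dS = cv ln(T2/T1) + R ln(V2/V1)."""
    return cv * math.log(T2 / T1) + R * math.log(V2 / V1)

# Sanity check: doubling V at constant T gives dS = R ln 2
print(delta_s(300, 300, 1.0, 2.0, 1.5 * R))  # ≈ 5.76 J mol^-1 K^-1
```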
I've got a couple of questions about reversible cycles:
So if we have two gaseous systems and have a reversible cycle working between them, then the entropy generation within each gaseous system is zero, right?
Do turbines execute reversible cycles?
Thanks a lot for your help!