What is Entropy: Definition and 1000 Discussions

Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 under the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862 Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of an isolated system left to spontaneous evolution cannot decrease with time, as such systems always arrive at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as a measure of the number of possible microscopic arrangements, or states, of the individual atoms and molecules of a system that are consistent with the macroscopic condition of the system. He thereby introduced the concepts of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law with a proportionality constant, the Boltzmann constant, which has become one of the defining universal constants for the modern International System of Units (SI).
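The logarithmic law referred to above can be written explicitly, using the conventional symbols (which the excerpt itself does not spell out):

$$ S = k_B \ln W $$

where W is the number of microstates compatible with the macrostate and k_B ≈ 1.381 × 10⁻²³ J/K is the Boltzmann constant, fixed exactly in the modern SI.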
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. At John von Neumann's suggestion, Shannon named this quantity of missing information entropy, in analogy to its use in statistical mechanics, and thereby gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
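As a rough illustration of Shannon's measure (a sketch added for this index, not part of the excerpt; the function name is my own), the entropy of a discrete distribution takes only a few lines:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))  # 1.0
```

Skewed distributions score lower, which is exactly the sense in which entropy quantifies missing information.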

View More On Wikipedia.org
  1. D

    Simplifying entropy for a harmonic oscillator in the limit of large N

    Homework Statement Hey guys, So I have this equation for the entropy of a classical harmonic oscillator: \frac{S}{k}=N[\frac{Tf'(T)}{f(T)}-\log z]-\log (1-zf(T)) where z=e^{\frac{\mu}{kT}} is the fugacity, and f(T)=\frac{kT}{\hbar \omega}. I have to show that, "in the limit of...
  2. D

    Weird form of entropy using grand partition function for a system

    Homework Statement Hey guys, Here's the question. For a distinguishable set of particles, given that the single particle partition function is Z_{1}=f(T) and the N-particle partition function is related to the single particle partition function by Z_{N}=(Z_{1})^{N} find the following...
  3. P

    Entropy & Work Done: Understanding Reversible Heating/Stirring

    There is a container containing water (state 1) which is being stirred. There is a temperature rise (state 2) due to stirring. It is required to find out change in entropy of the system if the process is reversible. Since, there is no heat transfer there would be no change in entropy due to...
  4. noowutah

    Maximizing the entropy of a constrained distribution

    It is well-known that with known marginal probabilities a_{i} and b_{j} the joint probability distribution maximizing the entropy H(P)=-\sum_{i=1}^{m}\sum_{j=1}^{n}p_{ij}\log{}p_{ij} is p_{ij}=a_{i}b_{j} For m=3 and n=3, a=(0.2,0.3,0.5), b=(0.1,0.6,0.3), for example, \begin{equation}...
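A quick numerical check of the claim in this thread (a sketch, using the example marginals a and b from the post): the product distribution's entropy decomposes as H(P) = H(a) + H(b), the maximum attainable under the two marginal constraints.

```python
import math

def H(ps):
    """Shannon entropy in nats."""
    return -sum(p * math.log(p) for p in ps if p > 0)

a = [0.2, 0.3, 0.5]
b = [0.1, 0.6, 0.3]

# The independent joint distribution p_ij = a_i * b_j, flattened.
p = [ai * bj for ai in a for bj in b]

# Entropy of the product equals the sum of the marginal entropies.
print(abs(H(p) - (H(a) + H(b))) < 1e-12)  # True
```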
  5. gfd43tg

    Entropy balance for air stream

    Homework Statement Ten kmol per hour of air is throttled from upstream conditions of 25°C and 10 bar to a downstream pressure of 1.2 bar. Assume air to be an ideal gas with Cp= (7/2)R. (a)What is the downstream temperature? (b)What is the entropy change of the air in J mol-1K-1? (c)What...
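A minimal sketch of parts (a) and (b): throttling is isenthalpic, and for an ideal gas enthalpy depends only on temperature, so T is unchanged and only the pressure term of the entropy change survives.

```python
import math

R = 8.314  # J mol^-1 K^-1, gas constant

# (a) Isenthalpic throttle + ideal gas => downstream T stays at 25 C.
# (b) Per mole: dS = cp*ln(T2/T1) - R*ln(P2/P1); the temperature term vanishes.
P1, P2 = 10.0, 1.2  # bar
dS = -R * math.log(P2 / P1)  # J mol^-1 K^-1
print(round(dS, 2))  # about 17.63
```

The sign is positive, as expected for an irreversible expansion.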
  6. J

    Understanding Entropy and Fluctuation: The 2nd Law of Thermodynamics Explained

The 2nd law of thermodynamics states that entropy increases with time, and entropy is just a measure of how hard it is to distinguish one state from another (the information-theoretical view) or how hard it is to find order within a system (the thermodynamic view). There are many ways to view entropy...
  7. A

    Entropy Change of Ideal Gas Upon Inserting Wall

    To preface my question, I know it is related to the Gibbs paradox, but I've read the wikipedia page on it and am still confused about how to resolve the question in the particular form I state below. Suppose a completely isolated ideal gas consisting of identical particles is confined to a...
  8. G

    Second Law of Thermo: Is Entropy Decreasing Possible?

    I've always been slightly confused by the Second Law of Thermo. For example, with Maxwell's Demon, where a demon controls the partition between two gas chambers to select all the fast moving particles into one chamber, the Second Law is not violated because the demon's actions and thought...
  9. A

    Why is Entropy Highest at Lowest Energy?

Why is entropy highest at the lowest energy? Entropy is disorder, so low energy means low disorder. Why, then, is it highest?
  10. R

    Thermodynamics Power and Entropy

Homework Statement A compressor processes 1.5kg/min of air in ambient conditions (1 bar and 20ºC). The compressed air leaves at 10bar and 90ºC. It is estimated that the heat losses through the walls of the compressor are of 25kJ/min. Calculate: a) The power of the compressor b) The...
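A steady-flow first-law sketch for part (a), assuming cp = 1.005 kJ/kg/K for air (my assumed value, not given in the post) and neglecting kinetic and potential energy changes:

```python
# Energy balance on the compressor: shaft power in = enthalpy rise + heat lost.
m_dot = 1.5 / 60       # kg/s
cp = 1.005             # kJ/kg/K, assumed for air
q_loss = 25.0 / 60     # kW lost through the walls
dT = 90.0 - 20.0       # K

power_in = m_dot * cp * dT + q_loss  # kW supplied to the compressor
print(round(power_in, 2))  # about 2.18 kW
```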
  11. S

    Entanglement, entropy and photon energy

1. A photon that emerges when an electron jumps one orbital down -- will have a fixed energy ...i.e. the difference between the (potential) energies of the orbitals. However a "free/unbound" photon can have any energy level. Is that correct? 2. What is the lowest level of energy a...
  12. M

    Why does water have higher entropy than helium?

    Hi, Under standard conditions, why does water have higher entropy than helium? Isn't helium a gas? I understand that water has more atoms, but it seems more ordered and is a liquid. I'm not sure how a qualitative analysis could lead to the conclusive result that water is higher in entropy...
  13. C

    Change in entropy of a heat engine

    Homework Statement One end of a metal rod is in contact with a thermal reservoir at 695K, and the other end is in contact with a thermal reservoir at 113K. The rod and reservoirs make up an isolated system. 7190J are conducted from one end of the rod to the other uniformly (no change in...
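A sketch of the standard approach: once the rod reaches a steady state its own entropy is unchanged, so only the two reservoirs contribute.

```python
Q = 7190.0                     # J conducted down the rod
T_hot, T_cold = 695.0, 113.0   # K

# Hot reservoir loses Q at T_hot; cold reservoir gains Q at T_cold.
dS = Q / T_cold - Q / T_hot    # J/K, total for the isolated system
print(round(dS, 1))  # positive, as the second law requires
```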
  14. R

    Entropy Calculation in Thermodynamics Homework

    Homework Statement Calculate the variation of entropy in the following processes: a) Heating of 18 kg of water from 15 to 40ºC at ambient pressure. b) Compression of 9 kg of water from ambient pressure to 7atm at the temperature of 15ºC. Homework Equations ΔS=Cp*ln(T_final/T_initial)...
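A sketch of part (a) using the relevant equation from the post, assuming cp = 4186 J/kg/K for liquid water (my value, not given in the problem) and treating pressure effects as negligible:

```python
import math

m, cp = 18.0, 4186.0     # kg, J/kg/K (cp assumed)
T1, T2 = 288.15, 313.15  # 15 C and 40 C in kelvin

# dS = m * cp * ln(T2/T1) for heating at constant pressure.
dS = m * cp * math.log(T2 / T1)  # J/K
print(round(dS))  # on the order of 6 kJ/K
```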
  15. A

    Entropy, chemical potential, temperature

For a thermodynamic system there exists a function called entropy S(U,N,V) etc. We then define for instance temperature as: 1/T = ∂S/∂U μ = ∂S/∂N etc. When taking these partials it is understood that we only take the derivative of S wrt the explicit dependence on U, N etc. right? Because...
  16. jk22

    Measurement, information and entropy

If we have a qubit in a mixed state, let's say 1/2(|+><+|) + 1/2(|-><-|), and we measure it, is the result then a pure state |+> or |->? If this is the case, then the entropy of the system decreases. Now the question the other way round: suppose we measure a quantum system without gaining...
  17. B

    What is translational entropy?

    Hello, I am trying to understand a short literature article (doi: 10.1021/ja01635a030). I am not sure how much liberty I have to reproduce its contents here, and I can't explain it here because I don't understand it -- which is why I have this question. I believe it is proposing that a...
  18. N

    Big Freeze: Entropy or Dark Energy?

    I'm interested in the ultimate fate of the universe. And it seems that the most prevalent theory is the Big Freeze. From what I can gather the BF is caused by dark energy making the universe expand to the point that stars can no longer form, resulting in cold dark space filling the universe...
  19. M

    Shannon Entropy and Origins of the Universe

Hello Community, I have a question that I'm struggling to get clarification on and I would greatly appreciate your thoughts. Big bang theories describe an extremely low thermodynamic entropy (S) state of origin (very ordered). Question: Is the big bang considered to be a high or low Shannon...
  20. Matt atkinson

    Thermodynamics Question, entropy Problem

    Homework Statement Considering entropy as a function of temperature and volume and using the Maxwell relation; $$ \left(\frac{\partial S}{\partial V}\right)_T = \left(\frac{\partial p}{\partial T}\right)_V$$ Show that the entropy of a vessel is given by; $$ S= R...
  21. Z

    Entropy for Reversible and Irreversible Processes

    Hello everyone, I've been reviewing some concepts on Thermodynamics and, even though I feel like I am gaining a level of comprehension about the subject that I could not have achieved before as an undergraduate, I am also running into some situations in which some thermodynamic concepts seem to...
  22. I

    Quantum Entropy & Superdense Coding/Cryptography

Hi everyone! I have a little problem for an upcoming exam, and I think I need just small hints to solve it. My problem is that I have to write about ten/fifteen pages about SUPERDENSE CODING and QUANTUM CRYPTOGRAPHY, and my professor has taken for granted that these are strongly linked to...
  23. I

    Quantum Entropy and Its Connection to Superdense Coding and Quantum Cryptography

Homework Statement My problem is that I have to write about ten pages about SUPERDENSE CODING and QUANTUM CRYPTOGRAPHY, and my professor has taken for granted that these are strongly linked to quantum entropy. He never told us why! Indeed he talked about that as...
  24. A

Is entropy a measure of disorder?

In textbooks I never saw a definition of entropy given in terms of a "measure of disorder", and am wondering where this idea comes from? Clausius defined it as an "equivalent measure of change", but I do not see the relation with a concept of "order" or...
  25. S

    Rank Vector Entropy: Role of a Leaky Integrator

Hi All, I am not sure if this is the right section to post this question, but it does involve probability... so please redirect me if necessary. I am currently looking at the Robinson et al. (2013) paper on rank vector entropy in MEG (doi: 10.3389/fncom.2012.00101). Due to my lack of...
  26. binbagsss

Thermodynamics, engine cycles, entropy, concepts.

Okay, I am considering a cycle, where the working fluid is an ideal gas, with heat capacities Cv and Cp. The cycle consists of: an isochoric increase in pressure, an adiabatic expansion back to the initial pressure, and an isobaric compression back to the initial conditions. Questions: - q1) I am asked to...
  27. B

    Solution Entropy: H2(g) Standard Values

    Normally due to H+ being the reference state in solution, all 'standard molar' state variables and 'standard value of formation' state variables are 0 for it. But H2(g) has a standard enthalpy of formation = 0 and standard molar entropy of 115 Jmol-1K-1. Then shouldn't ΔG°(298) for the reaction...
  28. tom.stoer

    Boltzmann vs. Gibbs entropy, negative energy

    This article insinuates that the physics community had forgotten Gibbs entropy long ago and has used Boltzmann entropy since. Isn't this nonsense? For me it was always clear that Boltzmann entropy is problematic...
  29. F

    Does this equation exist? (Latent heat, temperature, entropy)

    Evening all. I've seen this equation written down but can not for the life of me find it on tinterweb, is it correct? L = Tconst(S2-S1)
  30. A

    Questions about entropy, normal force, and the human body

    Greetings folks, I'm new to this forum and to physics in general so apologies if I come off like a greenhorn or if I am posting these questions in the wrong place. I have an Arts background and have never really "gotten" science, but my interest in post-Enlightenment philosophy has led me to a...
  31. Y

    How can one calculate entropy? What is entropy?

From what I've been taught, the entropy of a system is the number of microstates a macrostate can have. A microstate refers to the configuration of a system on a microscopic level (energy of each particle, location of each particle), a macrostate refers to the external parameters of that system...
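A toy calculation in the spirit of this thread (my own example, not from the post): take N two-state particles, e.g. coin flips, where the macrostate "n heads" has binomial multiplicity, and apply Boltzmann's formula S = k ln(Ω).

```python
import math

k_B = 1.380649e-23  # J/K

def S(N, n):
    omega = math.comb(N, n)   # number of microstates in the macrostate "n heads"
    return k_B * math.log(omega)

# The evenly split macrostate has the most microstates, hence the most entropy;
# a macrostate realized by a single microstate has zero entropy.
print(S(100, 50) > S(100, 90))  # True
print(S(100, 0))                # 0.0
```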
  32. I

    What determines the shape of the temperature-entropy graph?

In my thermo class, when we were formalizing the definition of temperature (\frac{1}{T}=\frac{\partial S}{\partial U}), we drew out all the combinations of various slopes and concavities of the ∂S/∂U graphs. http://imgur.com/cR4V8K8 The shape of this graph, I figure, should be a reflection of the...
  33. N

    Statistical Mechanics - Change in Entropy

    Homework Statement A system of N distinguishable particles is arranged such that each particle can exist in one of the two states: one has energy \epsilon_{1}, the other has energy \epsilon_{2}. The populations of these states are n_{1} and n_{2} respectively, (N = n_{1}+n_{2}). The system is...
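A sketch of the counting behind this setup (my own check, under the post's assumptions): the multiplicity is W = N!/(n1! n2!), so S/k = ln W, and Stirling's approximation turns this into S/k ≈ N ln N - n1 ln n1 - n2 ln n2.

```python
import math

def S_exact(n1, n2):
    # S/k = ln W with W = N!/(n1! n2!) = C(N, n1)
    return math.log(math.comb(n1 + n2, n1))

def S_stirling(n1, n2):
    N = n1 + n2
    return N * math.log(N) - n1 * math.log(n1) - n2 * math.log(n2)

# The two agree to within about a percent already at a few hundred particles.
print(S_exact(500, 300), S_stirling(500, 300))
```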
  34. A

    Big Bang Entropy: Constant or Increasing?

    Good night, maybe this is a dumb question, but: how was the entropy before the Big Bang? 1. I know, entropy increases or remains constant, so, was the entropy before the Big Bang, and existence of time an space constant? 2. Why is the entropy in our universe not constant but increasing...
  35. O

    Shannon entropy - use to calculate the bit needed to encode a symbol

    To encode a symbol in binary form, I need 3 bits ,and I have 6 symbols. So I need 6*3=18 bits to encode "We are" into binary form. As shown in http://www.shannonentropy.netmark.pl/calculate My question: 3 bits to encode one then I have to use 16 bits, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _. How to...
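The fixed-length count in the post (6 symbols × 3 bits) can be compared with the Shannon lower bound computed from the message's own symbol frequencies; a sketch (my own, not from the thread):

```python
import math
from collections import Counter

msg = "We are"
n = len(msg)
counts = Counter(msg)  # 'e' appears twice; W, space, a, r once each

# Entropy of the empirical symbol distribution, in bits per symbol.
H = -sum((c / n) * math.log2(c / n) for c in counts.values())

print(round(H, 2))      # about 2.25 bits/symbol
print(round(H * n, 1))  # total-bit lower bound, vs 6*3 = 18 fixed-length bits
```

So 3 bits per symbol is enough for any alphabet of up to 8 distinct symbols, while the entropy says this particular message could in principle be squeezed below that.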
  36. S

    Cherry Pie Entropy: Find Final Temp from Heat & Entropy Increase

    Homework Statement When 1.70 ✕ 105 J of heat enters a cherry pie initially at 20.0°C, its entropy increases by 470 J/K. What is its final temperature? Homework Equations The Attempt at a Solution I have no clue
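A hedged sketch of one way in (my own modeling assumption, not stated in the problem): treat the pie as having a constant heat capacity C, so Q = C(T2 - T1) and ΔS = C ln(T2/T1); eliminating C leaves one equation for T2, solvable by bisection.

```python
import math

Q, dS, T1 = 1.70e5, 470.0, 293.15  # J, J/K, K (20.0 C)

def residual(T2):
    # C = Q / (T2 - T1); residual is C*ln(T2/T1) - dS, zero at the answer.
    return (Q / (T2 - T1)) * math.log(T2 / T1) - dS

# The residual is positive just above T1 and negative for large T2,
# so the root is bracketed and bisection converges.
lo, hi = 294.0, 800.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if residual(mid) > 0:
        lo = mid
    else:
        hi = mid

T2 = 0.5 * (lo + hi)
print(round(T2, 1))  # final temperature in kelvin
```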
  37. C

    Changes in entropy of an isolated system

    Homework Statement Consider two fixed volume bricks of mass m1=2kg and m2=1kg with initial temperatures T1=400K and T2=100K. They are enclosed in a system thermally isolated from the surroundings and are made from a material with a heat capacity cv = 1kJ/kg/K. A) In Process 1, the bricks are...
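A sketch of Process 1 with the given numbers: energy conservation fixes the common final temperature, and summing each brick's ΔS = m cv ln(Tf/Ti) gives the total entropy change.

```python
import math

m1, m2, cv = 2.0, 1.0, 1.0   # kg, kg, kJ/kg/K
T1, T2 = 400.0, 100.0        # K

# Direct thermal contact: energy conservation sets the final temperature.
Tf = (m1 * T1 + m2 * T2) / (m1 + m2)

# Total entropy change of the isolated system; positive, as the contact
# between bricks at different temperatures is irreversible.
dS = m1 * cv * math.log(Tf / T1) + m2 * cv * math.log(Tf / T2)  # kJ/K
print(Tf, round(dS, 3))  # 300.0 K and about 0.523 kJ/K
```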
  38. E

    Entropy and Endothermic Processes

I am a general chemistry student and I find thermodynamics fascinating. However, I have a hard time visualizing entropy. Can somebody please explain how an increase in entropy can make a process that is endothermic spontaneous? The typical demonstration of entropy that I have seen is one in...
  39. U

    Entropy generation in chemical reactions

So from the first law for a closed system, dU = dQ - dW = dQ - PdV. From the second law, dS = dQ/T + Sgen (i.e. the entropy generated, with Sgen ≥ 0). Putting the expression for dQ from the second law into the first law, dU = T*dS - T*Sgen - PdV. If S and V are constant, dU = -T*Sgen, and since Sgen ≥ 0, dU ≤ 0. This is a...
  40. A

    Thermodynamics Entropy Question

    Homework Statement 45g of H2O(g) are condensed at 100 degrees C, and H2O(l) is cooled to 0 degrees C and then frozen to H2O solid. Find the Change in Entropy H2O(l): 4.2 J K-1g-1 vaporization at 100 degrees C: 2258 J g-1; fusion at 0 degrees C: 334 J g Homework Equations dS=dq/TThe...
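A sketch of the three-step path using the data given in the post: each step contributes ΔS = q_rev/T (an integral of c dT/T for the cooling step).

```python
import math

m = 45.0                        # g of H2O
L_vap, L_fus = 2258.0, 334.0    # J/g (given)
c_liq = 4.2                     # J g^-1 K^-1 (given)
T_vap, T_fus = 373.15, 273.15   # K

dS = (-m * L_vap / T_vap                     # condense steam at 100 C
      + m * c_liq * math.log(T_fus / T_vap)  # cool liquid 100 C -> 0 C
      - m * L_fus / T_fus)                   # freeze at 0 C
print(round(dS, 1))  # J/K; negative, since heat leaves the water
```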
  41. binbagsss

Thermodynamics, Entropy, Clausius Inequality, Reversible and Irreversible

∫dQ/T≤∫dQ(rev)/T * , where both integrals are evaluated between the same thermodynamic coordinates, A and B, say. - I am having trouble interpreting this inequality. - (I understand the derivation in my textbook via the Clausius diagram (considering a reversible and an irreversible process...
  42. J

    What is the entropy of three cue balls in a bucket?

    I posted this question a couple days back, but it got removed because it looked like a homework question (which, I suppose, is flattering, since I came up with it on the way home from work, and I'm not even a student, let alone a teacher)...so I'm going to try to rephrase it -- but because this...
  43. F

    Calculating entropy, microstate/macrostate probabilities

Hi all, could somebody look over my answer please. I pulled the equation I used off the internet but can't remember where, so I'm not sure what it's called. I took a picture of my answer as I thought it would be easier to read than fiddling with symbols here. QUESTION ANSWER ATTEMPT...
  44. F

    Thermal physics - entropy change (latent heat, specific heat etc)

    Hi all, could somebody have a look over my answers for this question please? The value I got for the second part seems quite feeble. Homework Statement The Attempt at a Solution Part a) Key m = mass cs = specific heat bronze cm = specific heat molten bronze Tfus = melting...
  45. T

    Archived What is the entropy of mixing for a system of two monatomic ideal gases?

I think I actually have solved it. I was right with the PV=nkT, I believe I previously messed up with the algebra. Homework Statement Using the same method as in the text, calculate the entropy of mixing for a system of two monatomic ideal gases, A and B, whose relative proportion is...
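For reference alongside this thread, the textbook result for ideal-gas mixing can be sketched directly (my own example; the function name and mole-fraction parametrization are assumptions, not from the post):

```python
import math

R = 8.314  # J mol^-1 K^-1

def mixing_entropy(x):
    # Entropy of mixing per mole of an ideal A/B mixture, mole fraction x of A:
    # dS_mix = -R * (x ln x + (1-x) ln(1-x))
    return -R * (x * math.log(x) + (1 - x) * math.log(1 - x))

# Equal proportions maximize the mixing entropy, at R ln 2 per mole.
print(round(mixing_entropy(0.5), 3))  # about 5.763 J/mol/K
```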
  46. B

Fractals: complex but having minimal entropy?

    Fractals are just many iterations of a very basic formula, so they can be described with little information, and yet they are extremely complex given enough iterations. Can they be described as low entropy despite their complexity?
  47. V

    Questions about the derivation of the gibbs entropy

    In statistical mechanics the macro-state of a system corresponds to a whole region in the microscopical phase-space of that same system, classically, that means that an infinity of micro-states relate to a single macro-state. Similarly, given a hamiltonian, a whole surface in the microscopical...
  48. U

    Writing out expression for entropy

    Homework Statement So as a part of a very long question on homework for thermodynamics, I and some of my friends got into a debate about how to write an expression for entropy at a particular state. Homework Equations delS(state 1 to 2)= Cv*ln(T2/T1)+R*ln(V2/V1) (from constitutive...
  49. A

    Does the Ground State of a Bose Gas Have Zero Entropy?

Hi, can I know what are the ways to show that the ground state particles of a Bose gas possess 0 entropy?
  50. U

    Reversible cycles and entropy generation

    I've got a couple of questions about reversible cycles: So if we have two gaseous systems and have a reversible cycle working between them, then the entropy generation within each gaseous system is zero, right? Do turbines execute reversible cycles? Thanks a lot for your help!