What is Entropy: Definition and 1000 Discussions

Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of isolated systems left to spontaneous evolution cannot decrease with time, as they always arrive at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of measuring microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. On John von Neumann's suggestion, Shannon named this quantity of missing information entropy, in analogy with its use in statistical mechanics, and thereby gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
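As a concrete illustration of Shannon's measure (a minimal Python sketch added here, not part of the article text; the function name and example values are our own), the entropy of a discrete distribution with probabilities p_i is H = -Σ p_i·log2(p_i), in bits per symbol:

    # Minimal sketch: Shannon entropy of a discrete distribution, in bits.
    import math

    def shannon_entropy(probs):
        """H = -sum(p * log2(p)) over outcomes with nonzero probability."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))   # a fair coin: 1.0 bit per toss
    print(shannon_entropy([0.9, 0.1]))   # a biased coin: about 0.47 bits per toss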

View More On Wikipedia.org
  1. R

    Entropy is a measure of energy available for work?

    Entropy is a measure of energy available for work? "Entropy is a measure of energy available for work". Can someone explain this to me? Give some examples that show in what sense it is true. It has to come with a lot of caveats, provisos, etc., because it's simply not true on its face. I...
  2. A

    Thermo: Dependence of pressure on entropy at a constant temperature

    Homework Statement I was asked to prove that (dP/dS)T (subscript T, i.e., at constant temperature) equals κPV ("kappa"·P·V, or isothermal compressibility × pressure × volume). By using the Maxwell relation -(dS/dP)T = (dT/dV)P I got an answer of -1/(alpha*volume) but cannot find out how to...
  3. H

    Entropy And Energy Representation

    How do I go from the entropy of a system, S(U,V,N), to its internal energy, U(S,V,N)? For instance, for an ideal classical gas, we have S = (3/2)N*R*ln(U/N) + N*R*ln(V/N) + N*R*c, where R is the Boltzmann constant, N is the particle number, V is the volume and "c" is a constant. How can I...
  4. dexterdev

    Algorithms and Sources of entropy for PRNG and TRNG?

    Hi PF, I would like to implement different random number generators using an AVR microcontroller (both PRNG and TRNG). So I would like to get suggestions about different sources of entropy for the TRNG and algorithms for the PRNG. I also want to test the randomness. And what is chaos theory...
  5. S

    Understanding Entropy: What Is It?

    What exactly is entropy?
  6. V

    Isentropic Isolated Systems: Understanding the Entropy of the Universe

    For a system that is completely isolated from its surroundings, basic thermodynamics requires that the quasi-static heat flux dQ and the entropy change dS be related by: dQ = TdS and since the system is isolated...
  7. LarryS

    Is Entropy Reduced When Particles Combine?

    Given a system of multiple free electrons. Say 2 of the electrons accidentally collide and become joined (opposite spin) by the weak force. So, the positions of those 2 electrons are now correlated. Was the total entropy of the system reduced by those 2 electrons joining? Thank you...
  8. R

    How can we measure entropy using experiments?

    A friend asks me this. If we consider the integral ∫dQ/T, then it is technically feasible to work out expressions in terms of measurable physical quantities like temperature and specific heat, and therefore possible to work out a precise value for an entropy change. But is there a...
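    One way to make that idea concrete (a hedged sketch with made-up example values, not anything from the thread): with a roughly constant measured specific heat, ΔS = ∫dQ/T = ∫ m·c/T dT can be evaluated in closed form or as a numerical sum over small temperature steps.

        import math

        m = 1.0        # kg of water (assumed example, not from the thread)
        c = 4186.0     # J/(kg*K), treated as constant over the range
        T1, T2 = 293.15, 353.15   # heating from 20 C to 80 C

        dS_closed = m * c * math.log(T2 / T1)       # integral of m*c/T dT

        n = 1000
        dT = (T2 - T1) / n
        dS_numeric = sum(m * c / (T1 + (i + 0.5) * dT) * dT for i in range(n))

        print(dS_closed, dS_numeric)   # both about 7.8e2 J/K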
  9. R

    A demonstration of the necessarily positive change in entropy

    Homework Statement Hello everyone. My problem is as follows: in a spontaneous process, two bodies at different temperatures T1 and T2, where T1 > T2, are put together until they reach thermal equilibrium. The number of atoms or molecules of the first is N1 and N2 for the...
  10. N

    What is the entropy if you have a spray can with 30psi?

    I am asking this question because I am trying to understand what entropy is and I just can't seem to get it clear. Now I think the answer is 0.5. The pressure in the can is 30 psi; the atmospheric pressure is around 15 psi. You divide the pressure in the can by the atmospheric pressure and you...
  11. H

    Canonical ensemble, entropy of a classical gas

    Homework Statement I have the equation Z = 1/(N! h^(3N)) ∫∫ d³q_i d³p_i e^(-βH(q,p)). How can I get the entropy from this equation, assuming a classical gas of N identical, noninteracting atoms inside a volume V in equilibrium at T, where it has an internal degree of freedom with energies 0 and ε? What...
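    A hedged note on the standard route (not the full solution to this homework): once Z is known, F = -kB*T*ln(Z) and S = -(∂F/∂T) at fixed V, N. A toy numerical sketch for just the internal two-level factor, with an assumed level spacing:

        import math

        kB = 1.380649e-23   # J/K
        eps = 1.0e-21       # J, assumed example spacing between the levels 0 and epsilon

        def F(T):
            Z_int = 1.0 + math.exp(-eps / (kB * T))   # internal partition function only
            return -kB * T * math.log(Z_int)

        T, h = 300.0, 1e-3
        S_int = -(F(T + h) - F(T - h)) / (2 * h)      # numerical -(dF/dT)
        print(S_int)   # internal-degree-of-freedom entropy per particle, in J/K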
  12. R

    How Do We Experimentally Reduce Residual Entropy?

    Homework Statement The problem that I was given as homework is the question "how do we reduce residual entropy experimentally?" Homework Equations Residual entropy occurs when the calculated molar entropy is greater than the measured value, thus S bar calc - S bar exp = residual...
  13. G

    How are order and disorder defined for entropy?

    Entropy is the measure of order and disorder. But who decides what is order and what is disorder? Isn't it a relative or subjective thing? How can it be defined in general, or can it be defined only for thermodynamic systems?
  14. A

    Entropy: Heat addition to surroundings.

    If 12007 kJ of heat is lost to the surroundings with an ambient temperature of 25 degrees centigrade during a cooling process, and the ambient temperature of the surroundings is unaffected by the heat addition, what is the entropy change of the surroundings? If Δs=∫δQ/T, then Δs=ΔQ/T=12007...
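    A hedged check of the arithmetic being set up here: the surroundings receive the heat at an effectively constant 25 °C, so their entropy change is just Q/T.

        Q = 12007e3          # J received by the surroundings
        T = 25.0 + 273.15    # K, ambient temperature
        print(Q / T)         # about 4.03e4 J/K, i.e. roughly 40.3 kJ/K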
  15. G

    Does a definition of entropy exist for complex systems?

    Common extensive quantities such as mass, charge, and volume can be defined for general systems. I can imagine that we can measure and define them without any problem for any kind of complex system as well. However, I do not know the general definition of the entropy, only the thermodynamic...
  16. srfriggen

    Does entropy in a closed system always increase OR remain constant (

    Does entropy in a closed system always increase OR remain constant (in equilibrium)? I have a friend arguing it is ALWAYS increasing. His latest argument was, "if no energy enters or leaves an isolated system, the availability of the remaining energy decreases."
  17. O

    Entropy of a histogram with different bin sizes

    I'm wondering if there's an expression/correction for finding the entropy of a density using histograms of different bin sizes. I know that as the number of bins increases, entropy will also increase (given a sufficient number of data points), but is there an expression relating the two? All I...
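    One common correction (a hedged sketch with made-up data, not necessarily the expression the poster is after): the plug-in entropy of a histogram estimates the differential entropy of the density only after adding the log of the bin width, h ≈ -Σ p_i·ln(p_i) + ln(w), which makes the estimate roughly independent of the bin count.

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.normal(size=100_000)        # true differential entropy ~ 1.419 nats

        for bins in (20, 50, 200):
            counts, edges = np.histogram(x, bins=bins)
            p = counts / counts.sum()
            w = edges[1] - edges[0]
            h = -(p[p > 0] * np.log(p[p > 0])).sum() + np.log(w)
            print(bins, round(h, 3))        # roughly the same value for each bin count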
  18. 0

    The direction of entropy in the universe

    I'm a little confused about entropy and its increase along the arrow of time. My perception of entropy is that it is a measure of order in a system and that high entropy represents disorder, the final result given sufficient time. From what I have read, the universe began with low and...
  19. N

    Why is the standard entropy of aqueous ions negative?

    Why is the standard entropy of aqueous ions negative? I thought it could be no less than 0, which represents a perfect crystal at 0 K. Is it negative so that calculations can be performed properly? Or is it because ionic solutes actually have less entropy than a perfect crystal?
  20. tom.stoer

    (gravitational) energy and entropy in an expanding universe

    Reading popular books (written by Hawking, Penrose, Greene, Linde, Guth and certainly many more) one finds numerous statements like entropy was low after the big bang ... Weyl-curvature hypothesis ... entropy increases with time ... black holes violate unitarity and therefore entropy or phase...
  21. D

    About markov source entropy

    Greetings, I want to ask whether I have understood this subject correctly. Let's say we have an order-1 binary source. H(a) = -Paa*log(Paa) - Pab*log(Pab) bit/symbol. From what I understand this is the average information of a symbol generated after an "a", like aa or ab. Is that right?
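    A hedged sketch of that reading (with made-up transition probabilities): H(a) is the entropy of the next symbol given that the previous one was "a", and weighting the per-state entropies by the stationary probabilities gives the entropy rate of the order-1 source.

        import math

        P = {('a', 'a'): 0.9, ('a', 'b'): 0.1,
             ('b', 'a'): 0.4, ('b', 'b'): 0.6}   # assumed example values

        def H_given(state):
            return -sum(P[(state, s)] * math.log2(P[(state, s)]) for s in 'ab')

        # stationary distribution of the two-state chain
        pi_a = P[('b', 'a')] / (P[('a', 'b')] + P[('b', 'a')])
        pi_b = 1.0 - pi_a

        rate = pi_a * H_given('a') + pi_b * H_given('b')
        print(H_given('a'), rate)   # bits per symbol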
  22. P

    What Really Is Entropy?

    I just heard about entropy from another thread, so I had to go and google it, I lightly skimmed this wiki page...
  23. J

    A gas in a mini universe reaches maximum entropy

    Assuming a mini-universe with the same laws as our current one. A gas within that universe reaches a state of maximum entropy. Would it remain in that state of maximum entropy once it is reached? Maybe the question does not make much sense. In that case, forgive my ignorance. edit: the...
  24. J

    Entropy of molten lead freezing

    Entropy of molten lead "freezing". Lead melts at 327.5 °C. The latent heat of melting of lead is 24.1 J/g, and the heat capacity of solid lead is 0.14 J/g·°C. You take 100 grams of molten lead at a temperature of 327.5 °C and pour it on the sidewalk. The lead freezes and then comes into thermal...
  25. S

    Change of Entropy Calculation - How to do this without Final Pressure Volume?

    Homework Statement I am having trouble working out the change in Entropy. The question is as follows: A mass of 1 kg of air is initially at 4.8 bar and 150 degC and it is enclosed in a rigid container. The air is heated until the temperature is 200 degC. Calculate the change in entropy...
  26. S

    Universe entropy variation of one body and a reservoir

    Homework Statement One body of constant-pressure heat capacity C_P at temperature Ti is placed in contact with a thermal reservoir at a higher temperature Tf. Pressure is kept constant until the body achieves equilibrium with the reservoir. a) Show that the variation in the entropy of the...
  27. E

    Statistical physics: counting states, entropy and temperature

    Hi everyone, I've hit a bit of a snag with part c of this problem (can't figure out how to invert a function T(ν)), so I'm starting to question whether I have the previous parts correct. Homework Statement Consider a system of N identical but distinguishable particles, each of which has a...
  28. K

    Find entropy change for free expansion of ideal gas

    entropy = joules/kelvin. Suppose 1 liter of ideal gas is allowed to freely expand into a 2 liter volume in an isolated system. The energy in the system would remain the same, and the temperature in the system would remain the same; therefore, if entropy = joules/kelvin, the entropy would remain the same...
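    A hedged counterpoint to the reasoning above (standard result, with the amount of gas assumed): entropy is not simply "energy over temperature" of the current state; for a free expansion the temperature is unchanged but the entropy still rises by n·R·ln(V2/V1).

        import math

        n = 1.0 / 22.4        # mol, roughly 1 liter of ideal gas at STP (assumed)
        R = 8.314             # J/(mol*K)
        V1, V2 = 1.0, 2.0     # liters

        print(n * R * math.log(V2 / V1))   # about +0.26 J/K, an increase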
  29. C

    Find the entropy for the process

    Homework Statement Supercooled water is water that is liquid and yet BENEATH the freezing point. a) A sample of 131 g of supercooled liquid water freezes to solid ice at a temperature of -8.00 °C. Using the following: Cp,ice = 38.09 J/molK, Cp,liquid = 74.539 J/molK, fusH° (at T=0 °...
  30. S

    Dirac delta function / Gibbs entropy

    Homework Statement This is an issue I'm having with understanding a section of maths rather than a coursework question. I have reached a stage involving the density function on the full phase space, ρ(p,x): ρ(p,x) = (1/Ω(E)) δ(ε(p,x) - E), where ε(p,x) is the...
  31. M

    General Chemistry - Heat, Work, Enthalpy, Entropy Question

    1. Assume that one mole of H2O(g) condenses to H2O(l) at 1.00 atm and 95 °C. Calculate q, w, ΔH, ΔS of the system, and ΔS of the surroundings. BUT I AM NOT ASKING HOW TO CALCULATE THESE VALUES, SEE LAST SENTENCE OF POST. Homework Equations q = nCΔT, ΔH = n(Cp)ΔT, W = -PΔV, ΔS = [q_reversible] / T, Cp...
  32. M

    Understanding Dissipation and Entropy in Newtonian Dampeners

    Hello there, I have a question on the dissipation and entropy. Let us consider a Newtonian dampener with viscosity coefficient η, pulled at a fixed rate e', immersed in an infinite bath at temperature T. The mechanical work input in time dt is then dW = ηe'*e'dt, and is all dissipated...
  33. B

    Conceptualizing entropy change of surroundings

    I asked a question on this forum a few days ago about the entropy change of the surroundings, and am grateful for the insight provided. However, something faulty in my conceptualization is preventing me from solving this problem. Let's say you have a set of processes shown in the following...
  34. P

    What does the Hydrophobic Effect have to do with entropy?

    Hello, I am taking a biochemistry course right now, and I am so confused by this 'hydrophobic effect' and how it relates to entropy. Hydrophobic effect: the exclusion of hydrophobic groups or molecules by water. (I get this part!) This appears to depend on the increase in entropy of solvent...
  35. A

    Change in Entropy for Mixing of Two Liquids

    Homework Statement You mix 1 liter of water at 20°C with 1 liter of water at 80°C. Set up an equation to find the change in entropy for one of the volumes of liquid in terms of initial temperature (T1) and the temperature after the two volumes of water mixed (T2) Homework Equations...
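    A hedged sketch of the setup being asked for (property values assumed): for either liter of water, ΔS = m·c·ln(T2/T1) with T2 the common final temperature, 50 °C for equal volumes; the hot water loses less entropy than the cold water gains.

        import math

        m, c = 1.0, 4186.0                 # kg and J/(kg*K), assumed for water
        T_cold, T_hot = 293.15, 353.15     # 20 C and 80 C
        T_f = (T_cold + T_hot) / 2         # 323.15 K for equal masses

        dS_cold = m * c * math.log(T_f / T_cold)   # positive
        dS_hot  = m * c * math.log(T_f / T_hot)    # negative, smaller in size
        print(dS_cold, dS_hot, dS_cold + dS_hot)   # net change is positive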
  36. M

    Is increase in entropy synonymous with the flow of time?

    Hello, I was just reading how the second law of thermodynamics is the only principle of physics that is irreversible, and that fact appears to have a correlation with how we view time and how time can only move in one direction, the same as the direction of increasing entropy. So, are entropy and time...
  37. B

    Entropy change of the surroundings during irreversible process

    According to my textbook, during an irreversible process, the entropy change of the surroundings is given by \frac{q}{T} where q is the heat transferred to the surroundings during the process. Why are we allowed to use this equation, considering that this equation only holds for reversible...
  38. H

    Finite number of degrees of freedom for entropy available in the universe (?)

    The spectrum of the Cosmic Microwave Background radiation - the flash of the Big Bang, aligns almost precisely with the shape of the Black Body radiation curve. This means that the CMB radiation came from a state that was in thermal equilibrium. Since thermal equilibrium is a state of maximum...
  39. B

    Entropy change in a reversible adiabatic process

    For a reversible process, I imagine it is correct to say that dS = dq/T, where all quantities refer to system quantities (not the surroundings). However, for an adiabatic process, dq = 0. Thus, should it be the case that for an adiabatic reversible process, dS =...
  40. B

    Thermal Physics Introduction- solving for entropy and temperature change

    Homework Statement A container is divided into two parts by a thermally conducting wall. There are N atoms of a monatomic ideal gas on the left side, 2N on the right. The gas on the left is initially at absolute temperature 200K, the gas on the right at 500K. a. After thermal equilibrium...
  41. J

    Solving the Entropy Puzzle: Blundell and Blundell Q&A

    Hello, This is a question I've been working on out of Blundell and Blundell, http://imageshack.us/a/img560/3342/entwopy.jpg The red box contains my answers to the question, which I am pretty sure are OK. I am having trouble with the very last part of the question. By the logic of the...
  42. O

    Entropy of isothermal process reversible\irreversible

    We were shown in class how to get these entropies. For a reversible isothermal process: ΔT = 0, thus ΔE = 0, thus Q = -W. ΔS(sys) = Qrev/T = nRln(V1/V2), and ΔS(surr) = -nRln(V1/V2) because the surroundings did the opposite work. For an irreversible isothermal expansion into vacuum: ΔT = 0, thus ΔE = 0. No work is done by...
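    A hedged numerical summary of the two cases for an expansion (example values assumed; ΔS_sys = n·R·ln(V2/V1) in both, since entropy is a state function):

        import math

        n, R = 1.0, 8.314
        V1, V2, T = 1.0, 2.0, 300.0

        dS_sys = n * R * math.log(V2 / V1)
        Q_rev = T * dS_sys       # heat drawn from the reservoir on the reversible path

        dS_surr_rev = -Q_rev / T   # reversible: reservoir loses exactly what the gas gains
        dS_surr_irr = 0.0          # free expansion into vacuum: no heat, no work on surroundings

        print(dS_sys + dS_surr_rev)   # 0 for the reversible path
        print(dS_sys + dS_surr_irr)   # > 0 for the irreversible path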
  43. S

    What is the definition of entropy in SM?

    We physicists must be careful to ensure that theories begin with correct principles. One basic principle is that all quantities must be capable of being observed or measured. If a theory uses a quantity that cannot be observed, then it is not a physics theory, but a hypothesis or a...
  44. L

    Black hole temperature derived from entropy (heat from black hole?)

    Hey, The entropy of a black hole is S = kB·4πGM²/(ħc). S = Q/T, so T = Q/S = Q·ħc/(4πGM²·kB). The temperature derived from Hawking radiation is T = c³ħ/(8πGM·kB), which implies Q = (1/2)Mc². Is this true? I have found online that the heat should equal...
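    A hedged check of the algebra in the question (using sympy; whether Q should be identified with T·S at all is a separate physics question): the product of the quoted S and T is indeed (1/2)Mc², while integrating dQ = T dS as the mass grows from 0 to M gives Mc², because T changes as the hole grows.

        import sympy as sp

        M, m, G, c, hbar, kB = sp.symbols('M m G c hbar k_B', positive=True)

        S = kB * 4 * sp.pi * G * m**2 / (hbar * c)    # quoted entropy at mass m
        T = hbar * c**3 / (8 * sp.pi * G * m * kB)    # Hawking temperature at mass m

        print(sp.simplify((T * S).subs(m, M)))                          # M*c**2/2
        print(sp.simplify(sp.integrate(T * sp.diff(S, m), (m, 0, M))))  # M*c**2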
  45. M

    Archived Entropy Change of an Ideal Gas

    Homework Statement Using the expression for the entropy change of an ideal gas when the volume and temperature change, and TV^(γ-1) = constant, show explicitly that the change in entropy is zero for a quasi-static adiabatic expansion from state V1, T1 to state V2, T2. Homework Equations TV^(γ-1)...
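    A hedged numerical check of the result being asked for (monatomic ideal gas assumed): with ΔS = n·Cv·ln(T2/T1) + n·R·ln(V2/V1) and T2 chosen so that T·V^(γ-1) stays constant, the two terms cancel.

        import math

        n, R = 1.0, 8.314
        gamma = 5.0 / 3.0
        Cv = R / (gamma - 1.0)

        T1, V1 = 300.0, 1.0
        V2 = 3.0
        T2 = T1 * (V1 / V2) ** (gamma - 1.0)   # enforce T*V**(gamma-1) = constant

        dS = n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)
        print(dS)   # zero up to rounding error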
  46. X

    Calculating Entropy Change in Niagara Falls Waterfall

    Homework Statement Every second at Niagara Falls, some 5.0×10³ m³ of water falls a distance of 50.0 m. What is the increase in entropy per second due to the falling water? Assume that the mass of the surroundings is so great that its temperature and that of the water stay nearly constant at...
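    A hedged sketch of the estimate (the temperature is cut off in the excerpt, so a 20 °C placeholder is assumed here): the falling water's potential energy ends up as heat at nearly constant temperature, so dS/dt ≈ (mass per second · g · h)/T.

        rho, V_dot = 1000.0, 5.0e3    # kg/m^3 and m^3/s
        g, h = 9.8, 50.0              # m/s^2 and m
        T = 293.15                    # K, assumed placeholder (not given above)

        P = rho * V_dot * g * h       # J/s dissipated, about 2.45e9 W
        print(P / T)                  # about 8.4e6 J/(K*s)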
  47. G

    Entropy & Relativity: What's the Relationship?

    Since the entropy increase in a system is a function of time, it would seem that rates of entropy increase would differ for different observers. I am struggling a little here with putting this into a coherent question, as I am a layman, but the gist is: both the notion of a closed system and...
  48. J

    Entropy Change from Joule Expansion

    Assume we have an ideal gas of N particles inside a thermally isolated cylinder of volume V, and that the cylinder is equipped with a piston that can trap the air on one side. (Assume the piston occupies no volume in the cylinder.) Initially, the piston is fully withdrawn and the gas occupies a...
  49. A

    Concept of Entropy. (Question)

    I have a question about the increasing behavior of entropy. Suppose we heat a metal (take, for instance, Fe) till it radiates energy and then put it in space (no medium). So the metal radiates energy in electromagnetic waves, which decreases the entropy of the metal (due to decrease in internal...
  50. 1

    Does entropy decrease during the formation of stars and the freezing of water?

    Does entropy decrease when stars are created from dust? Does entropy decrease when water freezes?