What is Entropy: Definition and 1000 Discussions

Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, aside from the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of isolated systems left to spontaneous evolution cannot decrease with time, as they always arrive at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, to the macroscopically observable behavior, in form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. Upon John von Neumann's suggestion, Shannon named this quantity of missing information entropy, in an analogous manner to its use in statistical mechanics, and gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.

View More On Wikipedia.org
  1. Titan97

    Entropy of system and surroundings

    I have some doubts about the entropy changes of certain simple processes. Can you check if these statements are correct? This is what I know: For a reversible adiabatic process, $$\Delta Q=0$$. $$\Delta S_{system}=\frac{\Delta Q}{T}=0$$. Since the system does not alter the surroundings, ##\Delta...
  2. sara lopez

    Calculate the entropy changes of the system and the surroundings

    Homework Statement Calculate the entropy changes of the system and the surroundings if the initial and final states are the same as in part a (part a = 2.000 mol of neon (assume ideal with CV,m = 3R/2) is expanded isothermally at 298.15 K from 2.000 atm pressure to 1.000 atm pressure and is...
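A quick numeric check of this problem's reversible isothermal step, as a Python sketch (standard ideal-gas relations; the variable names are mine):

```python
import math

R = 8.314  # J/(mol K), gas constant

# Isothermal expansion of an ideal gas: Delta S_sys = n R ln(P1/P2)
n = 2.000               # mol of neon, from the problem statement
P1, P2 = 2.000, 1.000   # atm (only the ratio matters)
dS_sys = n * R * math.log(P1 / P2)

# For a *reversible* isothermal path the surroundings supply Q_rev = T dS,
# so their entropy change is equal and opposite:
dS_surr_rev = -dS_sys
```

For the reversible case the total entropy change of the universe is zero, which is the usual consistency check.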
  3. G

    Is there a way to calculate the entropy change for an irreversible process?

    If we consider a system that undergoes an irreversible process from state 1 to state 2 and a reversible process from state 2 to state 1, then through the Clausius inequality ##\int_1^2 \frac{dQ_{irrev}}{T} + \int_2^1 \frac{dQ_{rev}}{T} \le 0##, ##\int_1^2 \frac{dQ_{irrev}}{T} + s_1 - s_2 \le 0##, ##s_2 - s_1 \ge \int_1^2 \frac{dQ_{irrev}}{T}##, ##\Delta s \ge \int_1^2 \frac{dQ_{irrev}}{T}##. Does this...
  4. L

    Entropy Calc: Solving for 100 Mol Gas

    Homework Statement Two isolated containers, of volumes V1 and V2, enclose ideal single atom gas at the same pressure p. The number of particles in each container is equal, the temperature of gas in container one is T1=293K and the temperature of gas in container two is T2=308K. An equilibrium...
  5. N

    Entropy of mixing - Ideal gas. What is x?

    Homework Statement A bottle with volume v containing 1 mole of argon is next to a bottle of volume v with 1 mole of xenon. both are connected with a pipe and tap and are same temp and pressure. the tap is opened and they are allowed to mix. What is the total entropy change of the system? Once...
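For the mixing question above, each gas ends up occupying twice its original volume at constant temperature, so the standard result can be checked numerically (a sketch; assumes ideal, distinguishable gases):

```python
import math

R = 8.314  # J/(mol K)
n_ar = n_xe = 1.0  # mol of each gas, from the problem

# When the tap opens, each distinguishable species expands from V into 2V
# at constant T, so Delta S = n R ln(V_f/V_i) = n R ln 2 per species.
dS_total = (n_ar + n_xe) * R * math.log(2)
```

Note this positive entropy of mixing appears only because argon and xenon are different gases; for two samples of the same gas there is no change (the Gibbs paradox).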
  6. V

    Doubt from second law of thermodynamics

    qrev/T = ΔS. Here, what does ΔS signify? Does it mean the change in entropy of the system or of the surroundings? How are the entropies of the system, surroundings, and universe related to each other, and which entropy is used in the Gibbs free energy equation?
  7. I

    Internal energy vs. Enthalpy vs. Entropy

    What is the difference between Q=m(u2-u1) + W & Q=m(h2-h1)? Basically I am trying to figure out 2 different sets of questions and apparently using these separate equations yield different answers, and I don't know which equation to use. From my understanding, both of them are used in...
  8. weezy

    I Doubt regarding proof of Clausius Inequality.

    I have attached two images from my textbook, one of which is a diagram and the other a paragraph with which I am having problems. The last sentence mentions that, due to a violation of the 2nd law, we cannot convert all the heat to work in this thermodynamic cycle. However, what is preventing the Carnot...
  9. R

    Does Entropy Change in All Adiabatic Processes?

    For all adiabatic processes, the entropy of the system does not change (speaking in general). Is this statement correct?
  10. W

    Maximum value of Von Neumann Entropy

    Homework Statement Prove that the maximum value of the Von Neumann entropy for a completely random ensemble is ##ln(N)## for some population ##N## Homework Equations ##S = -Tr(ρ~lnρ)## ##<A> = Tr(ρA)## The Attempt at a Solution Using Lagrange multipliers and extremizing S Let ##~S =...
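The claimed maximum can be verified numerically for a small example (a sketch using NumPy; the helper function name is my own):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]   # convention: 0 ln 0 -> 0
    return float(-np.sum(evals * np.log(evals)))

N = 4
rho_mixed = np.eye(N) / N          # completely random ensemble: rho = I/N
rho_pure = np.zeros((N, N))
rho_pure[0, 0] = 1.0               # a pure state for comparison

S_mixed = von_neumann_entropy(rho_mixed)  # equals ln(N)
S_pure = von_neumann_entropy(rho_pure)    # pure state: S = 0
```

The Lagrange-multiplier extremization in the post lands on exactly this maximally mixed state.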
  11. Seanra

    I Could "reverse entropy stars" exist in our universe?

    My lecturer claimed that "reverse entropy stars" could exist in our universe. One of the examples he gave was that if you exposed some sort of detector in the direction of a hypothesized reverse entropy star, you could determine if it existed by whether it "sucked" photons out of the detector...
  12. Stephanus

    Understanding Entropy and Hawking Radiation in Black Holes

    Dear PF Forum, I'm trying to make sense of Hawking radiation in a black hole, and that leads me to entropy. I read this equation in https://en.wikipedia.org/wiki/Entropy What does that mean? S is the change of entropy. What does Qrev mean there? Is it in calories, or joules? T, I think, is in...
  13. C

    Mean field approximation and entropy

    Homework Statement Consider a D dimensional Ising model with N sites, defined by the Hamiltonian $$\mathcal H = -J \sum_{\langle i j \rangle} \sigma_i \sigma_j - h \sum_i \sigma_i$$ where the sum extends over nearest neighbours and each spin variable ##\sigma_i = \pm 1##. For a given spin...
  14. RoboNerd

    Question on entropy in adiabatic phase change

    Homework Statement Consider a closed, adiabatic system consisting of a mixture of liquid and solid substance Z at equilibrium at its melting point. Z (solid) <---------> Z (liquid) Which of the following statements is true regarding the system? A) The entropy of the system is at a maximum...
  15. H

    B Entropy is greater in galaxies or pre atomic era

    1. I understand an expanding gas has increasing entropy, and at a cosmic scale the universe is an expanding gas... sort of. 2. Back before the universe was cool enough to form atoms, it would seem to be very disordered, i.e. a high-temperature universe of a plasma made of nuclei and elementary...
  16. ChrisVer

    Python Entropy calculation atoms simulation

    I have the following program that simulates a stochastic system of 400 particles. What I want to do now is separate the 200x200 grid into 2x2 smaller grids, out of which I will calculate each probability: P_i= \frac{\sum_{\text{atoms}}}{4} and from which in each step I'll be able to determine...
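A minimal sketch of the coarse-graining step described above, assuming the intent is a Gibbs/Shannon entropy over cell-occupation probabilities (the random positions here are a stand-in for the actual simulation state, which the post does not include):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the simulation: 400 atoms on a 200x200 grid.
x = rng.integers(0, 200, size=400)
y = rng.integers(0, 200, size=400)

def coarse_grained_entropy(x, y, cell=2, size=200):
    """Shannon entropy of the occupation probabilities of cell x cell blocks."""
    counts, _, _ = np.histogram2d(x, y, bins=size // cell,
                                  range=[[0, size], [0, size]])
    p = counts.ravel() / counts.sum()
    p = p[p > 0]                   # convention: 0 ln 0 -> 0
    return float(-np.sum(p * np.log(p)))

S = coarse_grained_entropy(x, y)
```

Recomputing S at each step of the stochastic simulation should show it drifting toward its maximum as the particles spread out.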
  17. Z

    Thermodynamics: Irreversible process and entropy

    Homework Statement Hi! I'm stuck with these two questions of my thermodynamics assignment: - Give two examples of irreversible processes (initial state, process, final state) - For each of them, explain why they are irreversible on the microscopic scale. Homework Equations We are not asked...
  18. P

    Kolmogorov Entropy: Intuitive Understanding & Relation to Shannon Entropy

    Dear all, I'm trying to have an intuition of what Kolmogorov entropy for a dynamical system means. In particular, 1. What is Kolmogorov entropy trying to quantify? (What are the units of KS entropy: bits or bits/second?) 2. What is the relation of KS entropy to Shannon entropy? 3. In...
  19. Henry Stonebury

    Thermodynamics: Pressure and temperature from turbine

    Homework Statement A turbine is receiving air from a combuster inside of an aircraft engine. At the inlet of the turbine I know that T1 = 1273 K and P1 = 549 kPa, and the velocity of the air is essentially 0. The turbine is assumed to be ideal, so its efficiency is exactly 1. Also: R = 287...
  20. B

    What happens to entropy when kinetic energy increases in a system?

    Entropy is basically a measure of the number of available microstates a system can have, given a certain energy of the system. It is a measure of the uncertainty of the exact state of the system. Now, suppose we have a box with a single particle inside and with the only internal energy being the...
  21. D

    Entropy Thermodynamics: Calculate ∆S for 1 Mol of Diatomic Gas

    Homework Statement A sample consisting of 1 mol of a diatomic perfect gas with Cv,m = 3/2 R is heated from 100 ºC to 300 ºC at constant pressure. Calculate ∆S for the system. Homework Equations Cv,m = 3/2 R The Attempt at a Solution Cp,m = Cv,m + R, because we want Cp for an isobaric process, right? ∆S = Cp ln...
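Taking the problem's given value Cv,m = 3R/2 at face value (even though a diatomic gas would usually have Cv,m = 5R/2), the constant-pressure entropy change can be computed directly (a sketch):

```python
import math

R = 8.314            # J/(mol K)
n = 1.0              # mol
Cv_m = 1.5 * R       # as given in the problem statement
Cp_m = Cv_m + R      # ideal gas: Cp = Cv + R

T1 = 100 + 273.15    # K
T2 = 300 + 273.15    # K

# Constant-pressure heating: dS = n Cp dT/T  ->  Delta S = n Cp ln(T2/T1)
dS = n * Cp_m * math.log(T2 / T1)
```

Note that the ratio must use absolute temperatures in kelvin, not ºC.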
  22. Peter25samaha

    Entropy or Quantum Mechanics ?

    Is an entropy problem easier than a quantum mechanics problem? Which one is more complicated to understand and to solve?
  23. F

    B Quantum to Classical Particles: Understanding the Entropy Limit

    I have heard that identical distinguishable classical particles have different "statistics". It is the limit of the quantum case. Then we mix many parts (cells) of identical gases, and the total entropy increases. I cannot derive this limit from quantum particles to classical particles (please help...
  24. V

    Help understanding Shannon entropy

    Hi, I'm having some trouble understanding Shannon entropy and its relation to "computer" bits (zeros and ones). The Shannon entropy of a random variable X is ##H(X) = -\sum_x p(x)\log_b p(x)## (assume b = 2 so that we work in bits), and everywhere I've read says it is "the number of bits on the average required to describe the random...
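The "average number of bits" reading can be illustrated with a few distributions (a sketch; the helper function is my own):

```python
import math

def shannon_entropy(probs, base=2):
    """H(X) = -sum p log_b p; in bits when base = 2."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

H_fair = shannon_entropy([0.5, 0.5])       # fair coin: 1 bit (one yes/no question)
H_biased = shannon_entropy([0.9, 0.1])     # predictable coin: < 1 bit on average
H_uniform8 = shannon_entropy([1/8] * 8)    # 8 equal outcomes: log2(8) = 3 bits
```

The biased coin needs fewer bits on average because a good code assigns short codewords to the likely outcome, which is the operational content of Shannon's source-coding theorem.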
  25. M

    Find entropy change when two tanks equalize

    Homework Statement Two rigid, insulated tanks are connected with a pipe and valve. One tank has 0.5 kg air at 200 kPa, 300 K and the other has 0.75 kg air at 100 kPa, 400 K. The valve is opened, and the air comes to a single uniform state without any heat transfer. Find the final temperature...
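A sketch of one standard way to work this problem, assuming air behaves as an ideal gas with constant specific heats (the property values are typical textbook numbers, not from the post):

```python
import math

# Air as an ideal gas with constant specific heats (approximate values)
R = 0.287    # kJ/(kg K)
cv = 0.717   # kJ/(kg K)
cp = 1.004   # kJ/(kg K)

m1, P1, T1 = 0.5, 200.0, 300.0    # kg, kPa, K
m2, P2, T2 = 0.75, 100.0, 400.0

# Rigid + insulated: no work, no heat, so total internal energy is conserved.
Tf = (m1 * cv * T1 + m2 * cv * T2) / ((m1 + m2) * cv)

# Total volume is fixed; final pressure from the ideal-gas law.
V = m1 * R * T1 / P1 + m2 * R * T2 / P2
Pf = (m1 + m2) * R * Tf / V

# Entropy generated: sum over both masses of m [cp ln(Tf/Ti) - R ln(Pf/Pi)]
dS = (m1 * (cp * math.log(Tf / T1) - R * math.log(Pf / P1))
      + m2 * (cp * math.log(Tf / T2) - R * math.log(Pf / P2)))
```

The result dS > 0 confirms the unassisted mixing and equilibration is irreversible.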
  26. J

    I Does Anti-Information Exist in Quantum Theory?

    Is there any theory that says anti-information exists? If there is anti-matter, would that matter carry information to annihilate the regular matter's information saying its a certain type of matter and turn it into energy? Could anti-matter just be regular matter with anti-information.
  27. S

    Is Dropping a Metal Block into a Lake a Reversible Process in Thermodynamics?

    Hello all. I have a quick question about entropy... I've just been formally introduced to it. Consider the example of a metal block of mass m and heat capacity Cp at temperature T1 = 60C being dropped into a large lake of temperature T2 = 10C. $$\Delta S_{block} =...
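With hypothetical numbers for the block's mass and heat capacity (not given in the post), the sign pattern can be made concrete (a sketch):

```python
import math

# Hypothetical illustrative values (not from the post):
m = 1.0      # kg
cp = 0.45    # kJ/(kg K), roughly steel

T1 = 60 + 273.15   # block's initial temperature, K
T2 = 10 + 273.15   # lake temperature, K

# Block: integrate dS = m cp dT/T along a reversible path -> negative
dS_block = m * cp * math.log(T2 / T1)

# Lake: a reservoir at constant T2 absorbing Q = m cp (T1 - T2) -> positive
dS_lake = m * cp * (T1 - T2) / T2

dS_universe = dS_block + dS_lake   # > 0: the process is irreversible
```

The strictly positive total is the quantitative answer to the thread title: dropping the block into the lake is not a reversible process, even though each term is computed along an imagined reversible path.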
  28. E

    Entropy of a chemical reaction

    Homework Statement The entropy change of surrounding is greater than that of the system in a/an (A) exothermic process (B) endothermic process (C) both (A) and (B) are correct (D) none of these are correct Homework EquationsThe Attempt at a Solution ΔStot = ΔSsys + ΔSsurr For a spontaneous...
  29. A

    Use of Entropy for a Control Volume in Energy Balance

    Hi all, I'm having some trouble figuring out why entropy is used instead of enthalpy for an open system. From what I understand, an open system uses entropy to calculate internal energy. Since the control volume is constant (i.e. Δv = 0), wouldn't using : h = u + PΔv effectively be h = u? So...
  30. S

    Entropy vs. enthalpy in chemical reactions.

    Hello, I am learning about using the free energy change ΔG to determine if a chemical reaction will occur spontaneously: ΔG = ΔH − TΔS. Now, enthalpy change can drive a reaction which leads to a decrease in entropy (multiple reactants => single product). My Question...
  31. K

    A Does Minimal Black Hole Entropy Suggest a Fundamental Spacetime Structure?

    If we plug the Planck mass into the Bekenstein-Hawking formula for the BH entropy, we'll get S = A/4l^2 = 4πGM^2/cħ = 4π ≈ 12.56 nat for the minimal Schwarzschild black hole. If we assume that each entropy unit is a compact area on the horizon, can we consider the minimal BH a dodecahedron...
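The quoted number follows directly from evaluating the Bekenstein-Hawking formula at the Planck mass; a quick check (a sketch; the constants are standard CODATA values):

```python
import math

# SI constants (CODATA values)
G = 6.67430e-11         # m^3 kg^-1 s^-2
hbar = 1.054571817e-34  # J s
c = 2.99792458e8        # m/s

# Planck mass: M_P = sqrt(hbar c / G)
M_planck = math.sqrt(hbar * c / G)

# Bekenstein-Hawking entropy in nats: S = 4 pi G M^2 / (hbar c)
S = 4 * math.pi * G * M_planck**2 / (hbar * c)
```

Since M_P^2 = ħc/G, the constants cancel exactly and S = 4π ≈ 12.566, as stated in the post.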
  32. E

    Required heat, entropy change of object dropped in water

    Homework Statement A 1 liter container is filled with argon to a pressure of 10^5 Pa at 303 K. It is dropped into a pool at temperature 323 K. How much heat is needed to heat the gas to 323 K? What is the entropy change in the gas and the universe? Ignore the entropy change in the container. Homework Equations...
  33. J co

    General Extensivity of Entropy

    Homework Statement Which of the following are not extensive functions: ##S_1 = \frac{N}{V}\left[S_0 + C_v \ln T + R \ln V\right]## ##S_2 = N\left[S_0 + C_v \ln T + R \ln(V/N)\right]## ##S_3 = N^2\left[S_0 + C_v \ln T + R \ln(V/N)\right]## 2. Homework Equations I'm not really sure how to approach this problem. The definition that I...
  34. S

    Entropy change in a reversible isothermal process

    Why does ∆S = 0 for a reversible process, but for a reversible isothermal process, ∆S is given by nRln(Vf/Vi) (or other variations of that equation)?
  35. Philip Koeck

    Why can one calculate entropy change for thermal conduction?

    A hot object in thermal contact with a cold one will finally reach a temperature in between. Why can the entropy change of each object be calculated as if the process was reversible? Is there a reversible process with the same final and initial state and what would that be?
  36. N

    Enthalpy at the throat of the nozzle

    Hi, This is my first post in this forum :woot: I have a total enthalpy ht = 3000 kJ/kg with velocity at inlet v1 = 20 m/s, speed of sound a1 = 562.5 m/s and outlet pressure P2 = 1 bar. With the formula ##h_t = h^* + \frac{a^{*2}}{2}##, how do I calculate the actual value at the throat, ##h^*##? NOTE: (by trial and error) I can...
  37. R

    Entropy of a Gas Under Pressure: Does Temperature Trump Pressure?

    Can anyone please answer this question? I have read that, for a gas, increased temperature increases entropy and increased pressure decreases entropy, and vice versa: decreased temperature decreases entropy and decreased pressure increases entropy. Can anyone please tell me, for a gas under pressure...
  38. R

    Gibbs free energy -- mathematical expression

    I am not able to understand the mathematical expression of "change in Gibbs free energy", For a chemical reaction occurring at constant temperature and constant pressure, (ΔS)total = (ΔS)system + (ΔS)surrounding Considering that reaction is exothermic, ΔH be the heat supplied by system to...
  39. S

    Time, Spacetime & The Arrow of Entropy

    Physicists refer to "spacetime", lumping together the dimensions X, Y, Z, and T as if they were all of one and the same kind. This reductionism is the product of mathematical rigor. But in our daily lives, we don't experience T in the same way we experience X, Y, and Z. I can arbitrarily set the...
  40. O

    Irreversible adiabatic process - is the entropy change zero?

    Homework Statement A well insulated container consists of two equal volumes separated by a partition - one half an ideal gas while the other is a vacuum. The partition is removed, and the gas expands. What is the entropy change per mole? Homework Equations dS = dQrev/T S/R = Cp/R dT/T -...
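The standard resolution, sketched numerically: entropy is a state function, so even though Q = 0 for the actual irreversible expansion, ΔS is evaluated along a reversible isothermal path between the same end states (assumes an ideal gas whose volume doubles, per the problem):

```python
import math

R = 8.314  # J/(mol K)

# Free expansion into vacuum: Q = 0, W = 0, so T is unchanged and the
# volume doubles. Entropy is a state function, so compute dS along a
# reversible isothermal path: dS = dQ_rev/T = n R ln(V2/V1).
dS_per_mole = R * math.log(2)   # > 0 even though the actual dQ was zero
```

So the answer to the thread title is no: the process is adiabatic but irreversible, and the entropy change per mole is R ln 2, not zero.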
  41. entropy1

    Why is entropy not reversible?

    Is there an easy way to explain in layman's terms why entropy in an open system is not reversible?
  42. gloppypop

    Power Consumption and Entropy Generation

    Homework Statement 2-3-15 Homework Equations P = power. W = work. U = internal energy. S = entropy. t = time. Q = heat. T = temperature. F = force. d = distance. P = ΔW/Δt = ΔU/Δt ΔS = ΔQ/T dm/dt = ρ·dV/dt W = F·d ΔU = Q − W Where m is mass, V is volume, and ρ is the...
  43. C

    Conceptual explanation for ΔS=Q/T

    So I understand how to solve a problem using the equation ΔS = Q/T. But is there a conceptual explanation for why the equation works? I'm not looking for a proof; I simply want to understand why this equation should be intuitive. Thanks for your time!
  44. A

    Can the entropy be reduced in maximization algorithms?

    In maximization algorithms like those used in artificial intelligence, the posterior probability distribution is more likely to favour one or a few outcomes than the prior probability distribution is. For example, in a robot's learning of localization, the posterior probability given certain sensor...
  45. zawy

    Is the Moon responsible for reducing Earth's entropy and making life possible?

    Spontaneous negative-entropy reactions can occur when the decrease in internal energy is greater than the negative T·dS term. dG = dU − T·dS is spontaneous if dG is negative. The moon is receiving at least 1E8 J/s from the loss of rotational energy of the Earth's water and air. A lot more rotational...
  46. D

    Confusion about [itex]T[/itex] in the definition of entropy

    In the derivation of the Clausius inequality, T is the temperature of the reservoir at that point in the cycle, but in the definition of entropy it becomes the temperature of the system. This seems to work for a Carnot cycle, where the two are the same, but for other processes, such as an object...
  47. T S Bailey

    Is Shannon Entropy subjective?

    If you have multiple possible states of a system then the Shannon entropy depends upon whether the outcomes have equal probability. A predictable outcome isn't very informative after all. But this seems to rely on the predictive ability of the system making the observation/measurement. This...
  48. B

    Calculating microstates and entropy

    Homework Statement Two identical brass bars in a chamber with perfect (thermally insulating) vacuum are at respective temperature T hot>T cold. They are brought in contact together so that they touch and make perfect diathermal contact and equilibrate towards a common temperature. We want to...
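For the equilibration itself (before any microstate counting), the entropy change of the two bars can be sketched with hypothetical values (the heat capacity and temperatures are my own illustrative numbers, since the post leaves them symbolic):

```python
import math

# Hypothetical illustrative values:
C = 100.0                      # J/K, heat capacity of each bar (T-independent)
T_hot, T_cold = 400.0, 300.0   # K

# Identical bars exchanging heat only with each other:
T_f = (T_hot + T_cold) / 2

# Integrate dS = C dT/T for each bar along a reversible path:
dS = C * math.log(T_f / T_hot) + C * math.log(T_f / T_cold)
# equivalently: C ln(T_f^2 / (T_hot T_cold)), which is always > 0
```

The logarithm form makes the sign obvious: the arithmetic mean T_f exceeds the geometric mean sqrt(T_hot·T_cold), so dS > 0 for any unequal starting temperatures.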
  49. mfig

    I Why do we see the claim that an isentropic process is adiabatic and reversible?

    Why do we always see the claim that an isentropic process for a system is adiabatic and reversible? The change in entropy for a process is the sum of the entropy transfer accompanying heat and the entropy production. The entropy production term is always at least zero, and the transfer term...
  50. A

    Why define entropy with heat instead of work?

    From what I understand, in the Carnot cycle summing qi/Ti for each step results in zero, thus indicating a new state function, entropy = qrev/T. But since dE = 0 = q+w, then q = -w, and looking at the equations derived from the cycle summing wi/Ti for each step should also result in zero. So why...