Entropy Definition and Topics - 246 Discussions

Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 under the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, Clausius had in 1862 interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of an isolated system left to spontaneous evolution cannot decrease with time, as it always arrives at a state of thermodynamic equilibrium, where the entropy is highest.
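In symbols, Clausius's definition is the standard starting point (for reference): $$dS = \frac{\delta Q_{\text{rev}}}{T}$$ where ##\delta Q_{\text{rev}}## is an infinitesimal amount of heat absorbed reversibly and ##T## is the instantaneous absolute temperature.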
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of the individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law with a proportionality constant, the Boltzmann constant, which has become one of the defining universal constants for the modern International System of Units (SI).
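Boltzmann's logarithmic law, in modern notation (for reference): $$S = k_B \ln W$$ where ##W## is the number of microstates compatible with the macrostate, and ##k_B = 1.380649\times 10^{-23}\ \text{J/K}## exactly, by the 2019 SI definition.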
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of measuring microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. At John von Neumann's suggestion, Shannon named this quantity of missing information entropy, in analogy to its use in statistical mechanics, and thereby gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
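Shannon's measure, for a discrete random variable ##X## with outcome probabilities ##p_i## (for reference): $$H(X) = -\sum_i p_i \log_2 p_i \quad \text{bits}.$$ A fair coin, for example, carries ##H = 1## bit; a biased coin carries less.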

View More On Wikipedia.org
  1. Yash Agrawal

    Thermodynamics of chemical reactions

    In chemical reactions generally ΔG < 0, but if we were to consider a reversible path between pure reactants and products at 1 bar pressure, shouldn't ΔG = 0 for every reaction? And if it is due to non-PV work, I don't see any non-PV work being done in reactions happening in a closed...
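    For reference, the standard relations the question turns on: at constant ##T## and ##P## with only PV work, ##dG \le 0##, with equality for a reversible path; for a reaction mixture, $$\Delta G = \Delta G^\circ + RT\ln Q,$$ which vanishes at equilibrium (##Q = K##) even though ##\Delta G^\circ = -RT\ln K## is generally nonzero.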
  2. EFech

    Protein Folding Entropy

    I have been reading about protein thermodynamics and found different types and models for entropy calculation before and after protein folding. I understand that vibrational, conformational, and configurational entropy are some of the most studied "types" of protein folding entropy. My question is...
  3. L

    Number of moles necessary to get piston back to initial position

    a) ##T_A=\frac{p_AV_A}{nR}=300.7K, P_A V_A=kL^2=nRT_A##, ##P_B S=k\frac{L}{2}\Rightarrow P_B V_B=k(\frac{L}{2})^2 \Rightarrow P_B=\frac{kL^2}{2V_A}=\frac{P_AV_A}{2V_A}=\frac{P_A}{2}##, ##W_{spring\to gas}=\int_{L}^{L/2}kxdx=-\frac{3}{8}kL^2=-\frac{3}{8}nRT_A## ##\Rightarrow Q=L+\Delta...
  4. L

    Finding the increase in entropy of the universe in gas expansion

    a) ##P_f=\frac{nRT_f}{V_f}=\frac{nR\frac{T_i}{2}}{2V_0}=\frac{1}{4}\frac{nRT_i}{V_0}=\frac{1}{4}P_i## b) ##Q=\Delta U=nC_V \Delta T=n\frac{5}{2}R(-\frac{T_i}{2})=-\frac{5}{4}nRT_i=-\frac{5}{4}P_i V_0## (##L=0## since the gas expands in a vacuum). Now, (a) and (b) are both correct but not (c)...
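    For reference, the entropy change of an ideal gas between any two equilibrium states, which is presumably what part (c) needs: $$\Delta S = nC_V\ln\frac{T_f}{T_i} + nR\ln\frac{V_f}{V_i},$$ and the vacuum into which the gas expands contributes no entropy term of its own.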
  5. alwaystiredmechgrad

    I Defect concentration formula w/o Stirling approximation

    In many cases, the concentrations of defects or charges are large enough to justify the Stirling approximation (SA), since the numbers involved are on the order of Avogadro's number. The derivation of the well-known formula for defect concentration follows. But if ##n_v## is expected to be lower than 1, then it would be impossible to use SA...
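    A sketch of the standard derivation at issue, assuming ##N## lattice sites and a vacancy formation energy ##E_v##: the number of configurations is $$W = \frac{N!}{n_v!\,(N-n_v)!},$$ and minimizing ##F = n_v E_v - k_B T \ln W## with Stirling's approximation gives $$\frac{n_v}{N-n_v} = e^{-E_v/k_B T}, \qquad n_v \approx N e^{-E_v/k_B T} \ \text{for}\ n_v \ll N.$$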
  6. R

    I Why does entropy grow when a solar system is formed?

    On page 50 of "From Eternity to Here", Sean Carroll writes that the protostellar cloud had a lower entropy than the solar system it produced. That strikes me as odd. A solar system looks more arranged than a dust cloud. When talking about entropy, someone always mentions the milk in the coffee...
  7. Ebi Rogha

    B Is time a consequence of 2nd law of thermodynamics?

    I have heard from a knowledgeable physics professor that time exists independently and is not a consequence of the arrow of time. Could somebody explain this?
  8. C

    I Entropy and Heat Capacity Relation

    I have a simple question sort of about exact differentials and deciding which variables matter and when. I know we can write entropy ##S## as ##S(P,T)## and ##S(V,T)## to derive different relations between heat capacities ##C_V## and ##C_P##. I was wondering if it is technically correct to...
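    For reference, the relations at play: $$C_V = T\left(\frac{\partial S}{\partial T}\right)_V, \qquad C_P = T\left(\frac{\partial S}{\partial T}\right)_P,$$ obtained by expanding ##dS## in ##(V,T)## or ##(P,T)## and holding volume or pressure fixed.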
  9. A

    Calculating work and heat transfer in this Carnot process

    Hey guys! This is a problem from Callen's Thermodynamics textbook and I'm stuck on it. My goal was to get an expression for the entropy ##S## which is dependent on ##T##, so I can move into the ##T-S## plane to do my calculations. I started by expressing the fundamental equation as a function of...
  10. Kaguro

    Bose gas versus Classical gas

    In classical statistics, we derived the partition function of an ideal gas. Then using MB statistics and the definition of the partition function, we wrote: $$S = k_B\ln Z_N + \beta k_B E$$ where ##Z_N## is the N-particle partition function. Here ##Z_N=Z^N##. This led to the Gibbs paradox...
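    For reference, the resolution the thread is presumably heading toward: with correct Boltzmann counting, $$Z_N = \frac{Z^N}{N!},$$ and the extra ##-k_B\ln N! \approx -k_B(N\ln N - N)## term makes ##S## extensive, removing the Gibbs paradox; for a Bose gas this counting emerges only in the dilute (classical) limit.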
  11. A

    Calculating specific heat capacity from entropy

    Hey guys! I'm currently struggling with a specific thermodynamics problem. I'm given the entropy of a system (where ##A## is a constant with fitting physical units): $$S(U,V,N)=A(UVN)^{1/3}$$ I'm asked to calculate the specific heat capacity at constant pressure ##C_p## and at constant volume...
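    A possible first step, for orientation (a sketch, not the full solution): $$\frac{1}{T} = \left(\frac{\partial S}{\partial U}\right)_{V,N} = \frac{A}{3}\frac{(VN)^{1/3}}{U^{2/3}} \;\Rightarrow\; U = \left(\frac{AT}{3}\right)^{3/2}(VN)^{1/2},$$ after which ##C_V = (\partial U/\partial T)_{V,N}## follows directly, and ##C_P## additionally needs the equation of state from ##P/T = (\partial S/\partial V)_{U,N}##.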
  12. J

    Calculate change in entropy per minute.

    So what I did was find the change in Q per min. Mass melted per min × latent heat = Q per min = 11.5 kg/min × 3.4×10^5 J/kg = 3,910,000 J/min. Now the equilibrium temperature is 100 degrees Celsius, or 373.15 kelvin. If I do 3,910,000 J/min ÷ 373.15 K I get 10478 J/(K·min). This...
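    The arithmetic checks out: $$\frac{\dot m L_f}{T} = \frac{(11.5\ \text{kg/min})(3.4\times 10^{5}\ \text{J/kg})}{373.15\ \text{K}} \approx 1.048\times 10^{4}\ \text{J/(K·min)},$$ though this is only the entropy gained by the melting material; a full balance would also need the (negative) term for whatever supplies the heat.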
  13. timjdoom

    Definition of a system in Boltzmann entropy

    Context Boltzmann first defined his entropy as S = k log(W). This seems to be pretty consistently taught. However, the exact definitions of S & W seem to vary slightly. Some say S is the entropy of a macrostate, while others describe it as the entropy for the system. Where the definition of...
  14. bardia sepehrnia

    I How can a process be isentropic but not reversible or adiabatic?

    Our thermodynamics book states that a process that is internally reversible and adiabatic has to be isentropic, but that an isentropic process doesn't have to be reversible and adiabatic. I don't really understand this. I always thought isentropic and reversible meant the same thing...
  15. mcas

    Find change in entropy for a system with a series of reservoirs

    I've calculated the change in the entropy of material after it comes in contact with the reservoir: $$\Delta S_1 = C \int_{T_i+t\Delta T}^{T_i+(t+1)\Delta T} \frac{dT}{T} = C \ln{\frac{T_i+(t+1)\Delta T}{T_i+t\Delta T}}$$ Now I would like to calculate the change in the entropy of the...
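    A plausible completion, assuming the truncated sentence asks for the reservoir's entropy change: the reservoir at ##T_i+(t+1)\Delta T## delivers ##Q = C\Delta T## at fixed temperature, so $$\Delta S_{res} = -\frac{C\Delta T}{T_i+(t+1)\Delta T},$$ and ##\Delta S_1 + \Delta S_{res} \ge 0## for each step, approaching zero as ##\Delta T \to 0##.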
  16. mcas

    Find the change in entropy for an ideal gas undergoing a reversible process

    We know that $$dU=\delta Q + \delta W$$ $$dU = TdS - pdV$$ So from this: $$dS = \frac{1}{T}dU + \frac{1}{T}pdV \ (*)$$ For an ideal gas: $$dU = \frac{3}{2}nkdT$$ Plugging that into (*) and also from ##p=\frac{nRT}{V}## we get: $$S = \frac{3}{2}nk \int^{T_2}_{T_1} \frac{1}{T}dT +...
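    Carrying the truncated integral through gives the standard result $$\Delta S = \frac{3}{2}nk\ln\frac{T_2}{T_1} + nR\ln\frac{V_2}{V_1},$$ where the poster's ##nk## and ##nR## should be the same constant (##Nk_B = nR##), i.e. ##\Delta S = nR\left(\frac{3}{2}\ln\frac{T_2}{T_1} + \ln\frac{V_2}{V_1}\right)## for a monatomic ideal gas.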
  17. E

    Stability and concavity of the entropy function

    I am struggling to understand Callen's explanation of stability. I understand that S(U) must be concave (negative second derivative), because otherwise we can show that the temperature would increase as the internal energy decreases (dT/dU<0), but I cannot understand equation (8.1), which...
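    For reference, Callen's global stability condition, which is plausibly what eq. (8.1) expresses, is concavity stated without derivatives: $$S(U+\Delta U,V,N) + S(U-\Delta U,V,N) \le 2S(U,V,N) \quad \text{for all } \Delta U,$$ i.e., redistributing energy between two identical subsystems cannot raise the total entropy.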
  18. Amaterasu21

    I How does the relativity of simultaneity square with thermodynamics?

    In special relativity, observers can disagree on the order of events - if Alice thinks events A, B and C are simultaneous, Bob can think A happened before B which happened before C, and Carlos thinks C happened before B which happened before A - provided A, B and C are not causally connected, of...
  19. iVenky

    I Maxwell's Demon and the Uncertainty Principle

    Maxwell's demon measures the position and velocity of the particle. How can it do that without violating the uncertainty principle? Does that mean the uncertainty principle is unavoidable, since otherwise we would violate the second law of thermodynamics, as in the case of Maxwell's demon?
  20. blazh femur

    Is randomness real or the inability to perceive hyper complex order?

    How did you find PF?: random Brownian motion. Is randomness real, or is it simply defined as such due to our inability to perceive hyper-complex order? Randomness is a troublesome word. I'd feel better if I knew it was an objective phenomenon and not merely a placeholder description of...
  21. micklat

    I Explaining atoms and bonding using entropy

    I am a biology undergraduate interested in abiogenesis. The entropic explanation for the origin of life is that life is allowed to exist because it increases universal entropy. I am curious about how far down we can take this explanation. How can you explain the emergence of atoms and atomic...
  22. A

    Is entropy decreased in the free expansion of a van der Waals gas?

    Previous to this problem, there was another one: "What is the change in temperature of a van der Waals gas in free expansion?" I solved it: it was ##C_V\,dT = -\frac{aN^2}{V^2}\,dV##, and then I got ##T=T_0-\frac{aN^2}{2VC_V}##. So I knew that the temperature is decreased by free expansion in an adiabatic process. Then I...
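    For comparison, a sketch of the standard treatment: total internal energy is conserved in free expansion, and for a (monatomic) van der Waals gas ##U = \frac{3}{2}Nk_B T - \frac{aN^2}{V}##, so $$C_V\,dT = -\frac{aN^2}{V^2}\,dV \;\Rightarrow\; T_f = T_i + \frac{aN^2}{C_V}\left(\frac{1}{V_f}-\frac{1}{V_i}\right) < T_i,$$ yet the entropy still increases, since $$\Delta S = C_V\ln\frac{T_f}{T_i} + Nk_B\ln\frac{V_f-Nb}{V_i-Nb} > 0$$ must hold for a spontaneous process in an isolated system.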
  23. A

    Von Neumann Entropy time derivative(evolution)

    I'm not sure about my proof, so please check my steps. I used log as the natural log (ln). Especially, I'm not sure about ##\frac{d}{dt}=\frac{d\rho}{dt}\frac{d}{d\rho}=\frac{i}{\hbar}[\rho, H]\frac{d}{d\rho}## in the second line. And can a matrix be differentiated with respect to another matrix? ##\left(\frac{d}{d\rho}(\rho \ln\rho)\right)##
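    For reference, a sketch of the result the proof should reach: with ##\dot\rho = \frac{i}{\hbar}[\rho,H]## and ##S = -\mathrm{Tr}(\rho\ln\rho)##, $$\frac{dS}{dt} = -\mathrm{Tr}(\dot\rho\ln\rho) - \mathrm{Tr}(\dot\rho) = -\frac{i}{\hbar}\mathrm{Tr}\big([\rho,H]\ln\rho\big) = -\frac{i}{\hbar}\mathrm{Tr}\big(H[\ln\rho,\rho]\big) = 0,$$ using ##\mathrm{Tr}(\dot\rho) = 0##, cyclicity of the trace, and ##[\ln\rho,\rho] = 0##: the von Neumann entropy is constant under unitary evolution.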
  24. P

    Entropy and the Helmholtz Free Energy of a Mass-Piston System

    Attempt at a Solution: Heat absorbed by the system. By the first law of thermodynamics, dU = dQ + dW. The system is of fixed volume and therefore mechanically isolated, so dW = 0. Therefore dQ = dU. The change of energy of the system equals the change of energy of the gas plus the change of energy...
  25. forkosh

    I Von Neumann entropy for "similar" pvm observables

    The von Neumann entropy for an observable can be written ##s=-\sum\lambda\log\lambda##, where the ##\lambda##'s are its eigenvalues. So suppose you have two different pvm observables, say ##A## and ##B##, that both represent the same resolution of the identity, but simply have different...
  26. T

    What is the change in entropy for a colloid settling out of solution?

    If it occurs spontaneously, then it must increase entropy, but the number of possible microstates decreases. So what else is occurring to increase the entropy?
  27. Saptarshi Sarkar

    Meaning of thermodynamic probability

    I was studying statistical mechanics when I came to know about the Boltzmann's entropy relation, ##S = k_B\ln Ω##. The book mentions ##Ω## as the 'thermodynamic probability'. But, even after reading, I can't understand what it means. I know that in a set of ##Ω_0## different accessible states...
  28. Y

    Entropy change when melting ice then refreezing the water

    ##dm\,L_f = Q##, ##\Delta T=\frac{T(v_l-v_s)\Delta P}{L}##, ##\frac{dm\,L_f}{T_0} = dS_2##, ##\frac{dm\,L_f}{T_1} = dS_1##
  29. Rahulx084

    Thermodynamics -- Temperature of a Heat Source?

    In a heat engine we define a heat source from which heat is transferred to the system, and we say that the heat source has a temperature ##T_h##. When we define a Carnot heat engine, the first process we have is an isothermal expansion, and we say heat has to come into the system through this process, and here...
  30. Hawzhin Blanca

    A Many worlds, observer and Entropy

    According to the Everett interpretation, or many-worlds interpretation of quantum mechanics, with each decision an observer makes the world splits into two parallel universes. Let's say an observer at some point in spacetime performs the Schrödinger's cat experiment: in one branch of the universe the cat...