Entropy and enthalpy difference

In summary, entropy is a measure of the number of possible microscopic states that a system can take on, with the most likely macrostate being the one with the highest entropy. It is related to the second law of thermodynamics, which states that entropy tends to increase in a closed system. The concepts of disorder and energy dispersal are often used to explain entropy, but they can be flawed and subjective. It is best to think of entropy in terms of microstates and how they change as a system undergoes processes such as heating, cooling, and work.
  • #1
Mr_Bojingles
I decided to try to learn what entropy is today, and I swear to god I've been sitting here for 4 hours and I still don't have the foggiest idea of what the hell it is. It's driving me insane. I can't think anymore because of the stress that's building up from the fact that I just can't comprehend the concept.

What I've read is that it ties into the second law of thermodynamics and that, basically, it is the measure of the forces that tend to disperse molecules or heat and distribute them uniformly in a closed system. That makes perfect sense to me.

Here's where the contradictions start. Other sites say that entropy is the measure of disorder and that nature tends to go towards an unorganized, disordered state. Personally, I see the dispersion of matter or energy to achieve a uniform state as organized and ordered. There's nothing disorganized about that.

What am I missing here? I can't make any sense of the explanations on the internet. Some of them say that if you have 2 metal blocks and one is hotter than the other (let's say block 1 is hot and block 2 is cold), then if heat transfers from block 1 to block 2, the entropy of block 1 rises while the entropy of block 2 decreases. If that's the case, I have no idea what entropy is.

I'll be too pissed off to go to bed unless I understand the concept, and the way things are looking, I'm not going to be sleeping tonight. Can anyone help me understand it? If I can just figure out what entropy is, I might be calm enough to go to sleep. I'll leave enthalpy for another day.
 
  • #2
Technically,

[tex]S = k \ln \Omega[/tex]

Where k is Boltzmann's constant and [tex]\Omega[/tex] is the multiplicity, that is, the number of microscopic states the system can take on that have the same energy. It's easy to figure out what [tex]\Omega[/tex] is for simple systems, and one generally finds that the multiplicity function is sharply peaked: it's really, really unlikely to find a system far away from the state of maximum entropy. So, if a system starts in a low-entropy state (a block of ice), it will tend to go to a higher-entropy state (a puddle). The second law of thermodynamics isn't really a physical law in the sense of [tex]F = ma[/tex], but the statistics always work out such that the probability of the entropy not increasing isn't even worth considering.
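
To make "sharply peaked" concrete, here is a minimal Python sketch (my own toy model, not part of the original post): for N two-state particles, the multiplicity of the macrostate with n up-states is the binomial coefficient, and macrostates only 1% away from the peak at n = N/2 become astronomically suppressed as N grows.

[code]
from math import lgamma

# Multiplicity of n "up" particles out of N is the binomial coefficient C(N, n).
# Work with logarithms to avoid enormous integers.
def log_multiplicity(N, n):
    return lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)

for N in (100, 10_000, 1_000_000):
    peak = log_multiplicity(N, N // 2)             # most probable macrostate
    off = log_multiplicity(N, N // 2 + N // 100)   # a macrostate only 1% away
    # ln of the multiplicity ratio; it grows roughly as 2*(0.01*N)^2/N = 0.0002*N,
    # so for N ~ 10^6 the off-peak macrostate is suppressed by a factor of ~e^200.
    print(N, peak - off)
[/code]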
 
  • #3
Entropy is a difficult concept to understand, no doubt about it. It's a "something" that seems to pass from one body to another and can apparently be generated from nowhere.

Forget about the disorder and dispersal analogies; they're too flawed to work for you. The most fundamental definition of entropy that I know of is [itex]S = -k \sum p_i \ln p_i[/itex], where [itex]p_i[/itex] is the probability of the system being in microstate [itex]i[/itex] (a microstate is an assignment of a quantum state to each particle that is compatible with macroscopic observables like temperature and pressure). If all microstates are equally probable, we have the familiar [itex]S=k\ln \Omega[/itex], where [itex]\Omega[/itex] is the number of microstates. The best description that I've heard of the 2nd law is that entropy (i.e., the number of microstates) tends to increase, and that entropy is maximized when a system is at equilibrium.
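
As a quick numerical illustration of that definition (my own Python sketch, with an arbitrary microstate count), the general formula reduces to k ln Ω when every microstate is equally likely, and gives something smaller for any other distribution:

[code]
import numpy as np

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def gibbs_entropy(p):
    """S = -k * sum(p_i * ln p_i); zero-probability microstates contribute nothing."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -k_B * np.sum(p * np.log(p))

Omega = 1000
uniform = np.full(Omega, 1.0 / Omega)
print(gibbs_entropy(uniform), k_B * np.log(Omega))   # equal: the familiar S = k ln(Omega)

# Concentrating probability on a few microstates lowers the entropy:
skewed = np.array([0.5] + [0.5 / (Omega - 1)] * (Omega - 1))
print(gibbs_entropy(skewed) < k_B * np.log(Omega))   # True
[/code]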

Here is the problem with the disorder argument: disorder is subjective. Who's to say whether a messy room is randomly disordered or whether each item has been carefully positioned, with definite rules, by its owner? Additionally, disorder on the microscale and on the macroscale can be difficult to identify. Alluding to your point, a glass of crushed ice looks disordered, while a glass of water looks ordered. However, the glass of water has far more entropy.

Here is the problem with the energy dispersal argument: Consider two rings of pure material spinning in opposite directions at a very low temperature, arbitrarily close to absolute zero. The system velocity and angular momentum are zero; the only important number is the rotational speed. There are very few possible microstates (tending to one as we approach absolute zero) that are compatible with the system, because random atomic motion is nearly eliminated due to the low temperature. Each atom is pretty much limited to following its circular path with essentially no thermal vibration. The entropy is very low, almost zero.

If the rings are now brought into contact, they will eventually slow each other to a stop by friction. Now the rotational speed is zero and the material is hotter, say at some temperature [itex]T[/itex] well above absolute zero. There is now a huge number of possible microstates, because the random thermal energy could be apportioned to the particles in an enormous number of combinations without us ever knowing the difference. (It doesn't matter whether atom #45,391,567,958,... is going fast and #198,562,994,261,... is going slow or vice versa, as long as the energies add up to put the bulk material at temperature [itex]T[/itex].)

This is where I have a problem with "energy dispersal." The energy isn't more dispersed after we connect the rings. The energy didn't go anywhere; the system is closed. Nor has the energy spread out spatially. The average energy of the particles is still the same. I think the dispersal definition falls short here, while the microstates definition explains the spontaneity of the process with no problems.

So I encourage you to think in terms of microstates, as nathan12343 pointed out above. When we heat a system, its number of microstates increases. When we cool a system, its number of microstates decreases (but the energy we removed increases the number of microstates of the surroundings). When we do work on a system, there is no change in the population of microstates. The number of possible microstates in the entire universe tends to increase (this is the Second Law).

It may be useful to think of entropy as something that "flows", but you have to be careful (nothing is actually flowing). Entropy is the conjugate extensive variable to temperature, just as volume is the conjugate extensive variable to pressure. Just as two systems will exchange volume in order to equalize their pressure, two systems will "exchange" entropy to equalize their temperature. But what is really happening is that energy is being transferred, increasing the number of possible microstates in one system while decreasing the possible number in another.

Finally, you should know that entropy is conserved for reversible processes, but entropy is created whenever energy is transferred in response to a gradient in an intensive variable, like temperature or pressure. In fact, "reversible" means that energy is being transferred without any gradient in temperature, pressure, etc. This never occurs in reality, but we can come arbitrarily close, and it's a useful idealization.

Good luck with your efforts!
 
  • #4
Mr_Bojingles said:
I decided to try to learn what entropy is today, and I swear to god I've been sitting here for 4 hours and I still don't have the foggiest idea of what the hell it is. It's driving me insane. I can't think anymore because of the stress that's building up from the fact that I just can't comprehend the concept.

<snip>

I'll be too pissed off to go to bed unless I understand the concept, and the way things are looking, I'm not going to be sleeping tonight. Can anyone help me understand it? If I can just figure out what entropy is, I might be calm enough to go to sleep. I'll leave enthalpy for another day.

I'm not sure anyone understands "entropy". It's one of the more difficult concepts, tying in threads from mechanical work, statistics, information, and probably some others. Part of the difficulty is that there's not a good definition of 'energy', either.

First, understand that not all energy possessed by an object is accessible. Of the food we eat, we can only use (IIRC) about 80% of the calories.

Second, understand that the amount of energy possessed by an object is due, in part, to its configuration. A folded protein has a different energy than an unfolded protein. A ball up in the sky has a different energy than a ball on the ground.

"Entropy", as I think of it, is a measure of the amount of energy within an object (or system) that we cannot access. That's not very precise, and there's lots of mathematical derivations assigning a value to that amount of energy: in terms of statistics (kT ln(W)), in terms of information (kT ln(2)), in terms of thermodynamic concepts (dS = dQ/T), probably some others.

I really struggled with thermodynamics for a long time, trying to get an intuitive feel for all those different energies (enthalpy, Gibbs, Helmholtz, etc.), Maxwell relations, logarithmic derivatives, all the time wondering what the point was. It may help to remember that thermodynamics is one of the oldest branches of physics; it predates knowledge of atoms. The concepts used may appear foreign to us, as we have become accustomed to quantum mechanics and microscopic dynamics.

That, coupled with an embarrassing lack of a decent undergrad text, causes no end of headaches. FWIW, if you can find a copy of Truesdell's "The Tragicomical History of Thermodynamics", you may find it helpful.
 
  • #5
As Andy_Resnick pointed out, thermodynamics is one of the oldest branches of physics, so it must be possible to make sense of entropy without recourse to statistical mechanics. Not that there is anything wrong with statistical interpretations of entropy. In fact, many thermodynamic properties can be accurately estimated with the help of statistical mechanics, and the statistical interpretation adds a great deal of insight into the concept of entropy.

Have you attempted to calculate the net entropy change when the two blocks of metal are put into contact with one another? If you can calculate this, it might help in understanding the entropy concept better.

Suppose you consider just one block of metal (the system) at a temperature T2 which is then put into contact with a huge heat bath held at temperature T1 (i.e. a bath with essentially infinite heat capacity) such that T2 > T1. After a while, the block will have cooled to temperature T1 and will have given up an amount of energy Q = C(T2 - T1), where C is the heat capacity of the metal, here assumed to be constant over the temperature range considered. Since the heat bath was kept at temperature T1 the whole time, the entropy change of the bath is Q/T1. Unfortunately, we don't know the entropy change of the block yet, because the cooling process was not carried out reversibly. To find the change, the system must be restored reversibly to its original state. This can be done by placing the block in contact with successively hotter heat baths whose temperatures differ from each other by an infinitesimal amount. The heat absorbed by the block in each infinitesimal step is dQ = C dT, and the entropy change of the block is dQ/T = (C/T) dT. The total entropy change of the block is then the integral of this between the limits T1 and T2: C log(T2/T1). Now the entropy is a function of state, depending only on the temperature in this case, so the entropy change of the block in the cooling process is just the negative of this. The net change in entropy for the cooling process is therefore

[tex]\Delta S_{total} = \Delta S_{bath} + \Delta S_{block}[/tex]

[tex] = C(T_2 - T_1)/T_1 - C \log (T_2/T_1) > 0 [/tex]
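
Plugging in round numbers (my own illustrative values, not from the derivation above) confirms that this expression is indeed positive:

[code]
import math

# Illustrative values: a block with constant heat capacity C, initially at T2,
# cooled by a large bath held at T1.
C = 1.0     # J/K
T1 = 300.0  # K, bath temperature
T2 = 400.0  # K, initial block temperature

dS_bath = C * (T2 - T1) / T1        # bath absorbs Q = C*(T2 - T1) at constant T1
dS_block = -C * math.log(T2 / T1)   # block: negative of the reversible-heating integral
print(dS_bath, dS_block, dS_bath + dS_block)
# ~0.333 J/K, ~-0.288 J/K, total ~+0.046 J/K > 0, as the inequality above requires
[/code]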

I now invite you to calculate the net entropy change for the process you mentioned involving two blocks, one at T1 and one at T2, put into thermal contact with one another. HINT: Assume this process is adiabatic, i.e. there is no heat exchange with the bath during the temperature equilibration. Then restore each block separately to its original state in order to calculate the net entropy change in the process.

An excellent text is "Thermodynamics" by G.N. Lewis and M. Randall (McGraw-Hill, 1961).
 
  • #6
Mapes said:
When we do work on a system, there is no change in the population of microstates. The number of possible microstates in the entire universe tends to increase (this is the Second Law).

Why is there no entropy change when work is done? If work is done by the system, there is less energy in the system, so why doesn't the entropy change when the energy of the system changes?
 
  • #7
Let's look at a practical example: a gas in a closed container. If we allow the gas to expand reversibly and do work on the environment, its energy decreases as you said (and its temperature also decreases). If the volume had stayed constant, the entropy would have decreased too. However, the volume has increased, allowing the gas more possible microstates. This increase in entropy exactly offsets the decrease due to the loss of energy.

Mathematically,

[tex]dU=T\,dS-p\,dV[/tex]

[tex]dS=\frac{1}{T}\,dU+\frac{p}{T}\,dV=0[/tex]
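
Here is a small numerical check of that cancellation for a monatomic ideal gas (my own sketch with illustrative numbers): along a reversible adiabat, the temperature term and the volume term in dS = n Cv ln(T2/T1) + n R ln(V2/V1) are equal and opposite.

[code]
import math

R = 8.314        # J/(mol K)
n = 1.0          # mol of a monatomic ideal gas
Cv = 1.5 * R     # constant-volume heat capacity
gamma = 5.0 / 3.0

V1, T1 = 1.0e-3, 300.0                   # initial volume (m^3) and temperature (K)
V2 = 2.0e-3                              # the gas expands to twice its volume
T2 = T1 * (V1 / V2) ** (gamma - 1.0)     # reversible adiabat: T * V^(gamma-1) = const

dS_temperature = n * Cv * math.log(T2 / T1)   # negative: the gas cools as it does work
dS_volume = n * R * math.log(V2 / V1)         # positive: more spatial microstates
print(dS_temperature, dS_volume, dS_temperature + dS_volume)   # sum is zero (to rounding)
[/code]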
 
  • #8
Mapes said:
Let's look at a practical example: a gas in a closed container. If we allow the gas to expand reversibly and do work on the environment, its energy decreases as you said (and its temperature also decreases). If the volume had stayed constant, the entropy would have decreased too. However, the volume has increased, allowing the gas more possible microstates. This increase in entropy exactly offsets the decrease due to the loss of energy.

Mathematically,

[tex]dU=T\,dS-p\,dV[/tex]

[tex]dS=\frac{1}{T}\,dU+\frac{p}{T}\,dV=0[/tex]

Thanks Mapes!
 
  • #9
I do not have any quick answers, but entropy is the degradation of the quality of energy. If you have an engine that burns fuel, e.g. an automobile, energy is being degraded and there is no return to that energy level for the closed system. Ever since the universal "bang", and ever since you were born (or maybe since 10 in my son's case), you have been going downhill. As time passes, the quality of energy of the universe is decreasing. [This applies to a closed system.]
 
  • #10
I think an explanation is due as to why Boltzmann's constant 'k' should carry over into statistical mechanics.

James
 
  • #11
Briefly, entropy is proportional to the log of the number of microstates, a dimensionless number. Boltzmann's constant is the constant of proportionality that gives entropy its units (J/K) and connects it to our macroscale measurements of energy and temperature.
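
To see how a dimensionless microstate count turns into an everyday number of J/K, here is a one-line Python sketch (my own, using a hypothetical toy system of a mole of two-state particles):

[code]
from math import log

k_B = 1.380649e-23   # Boltzmann's constant, J/K
N = 6.022e23         # roughly a mole of hypothetical two-state particles
ln_Omega = N * log(2)        # Omega = 2^N, so ln(Omega) = N ln 2 (dimensionless)
print(k_B * ln_Omega)        # ~5.8 J/K: a perfectly ordinary macroscopic entropy
[/code]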
 
  • #12
"...Boltzmann's constant is the constant of proportionality that gives entropy its units (J/K) and connects it to our macroscale measurements of energy and temperature. "

It is that connection that I question. Not because I think it is wrong, but because the connection appears to be given from thermodynamics to statistical mechanics for free instead of being derived from statistical mechanics. To me that makes it appear that the equation is still a thermodynamic equation and the statistics part is added on to it, because the statistics are directly related to the reordering of energy. In other words, the success of the equation still is rooted in the thermodynamic derivation of thermodynamic entropy. This is not intended to be taken as an expert opinion.

James
 
  • #13
I tend to think about this in the way explained by Mapes and Nathan.

A qualitative explanation I like goes as follows. Macroscopic objects consist of atoms, which in turn consist of subatomic particles, etc. When we describe the world we perceive in practice, we do so in terms of macroscopic variables. We don't specify the exact physical state of objects. Even if we wanted to do that, our lack of knowledge about the exact fundamental laws of physics would mean that we couldn't do it anyway.

What makes doing physics possible at all is that one can find a closed description of a physical system in terms of only the macroscopic variables plus a handful of extra variables that describe, in a statistical way, the effects of all the degrees of freedom that are left out. In thermodynamics those extra variables are quantities like internal energy, temperature, entropy, etc.
 
  • #14
James A. Putnam said:
"...Boltzmann's constant is the constant of proportionality that gives entropy its units (J/K) and connects it to our macroscale measurements of energy and temperature. "

It is that connection that I question. Not because I think it is wrong, but because the connection appears to be given from thermodynamics to statistical mechanics for free instead of being derived from statistical mechanics. To me that makes it appear that the equation is still a thermodynamic equation and the statistics part is added on to it, because the statistics are directly related to the reordering of energy. In other words, the success of the equation still is rooted in the thermodynamic derivation of thermodynamic entropy. This is not intended to be taken as an expert opinion.

James

It is a matter of choosing your units. Thermodynamics gives you a phenomenological description that is not able to explain how all the variables are related to each other. That means that you'll end up with variables that are related at the fundamental level, but in the thermodynamic description you cannot see that relationship.

Historically, the thermodynamic relations were found empirically, and units were invented to measure quantities such as temperature. But if a more fundamental theory later arises and we can directly compare what used to be incompatible physical quantities, we end up with new constants, in this case the Boltzmann constant, which does the unit conversion from temperature to energy in the old thermodynamic units.


Compare this with special relativity. Einstein found that the inertia of an object is explained by the energy of the object. But in classical physics the two quantities, energy and mass, are unrelated, and we had already defined our supposedly incompatible units for the two quantities. Relativity tells us that declaring the two quantities to be incompatible is wrong and that in fact mass is precisely the rest energy of an object. What then happens is that the equations automatically compensate for our previous ignorance by doing the unit conversion inside the equation, i.e. we get the equation E = m c^2 instead of E = m.
 
  • #15
Does anyone know if entropy affects, or applies to, non-material objects such as information states?
 
  • #16
Of course! Look up "information entropy", 'negentropy', and Shannon's information theory.
 
  • #17
Count Iblis said:
<snip>

Historically, the thermodynamic relations were found empirically, and units were invented to measure quantities such as temperature. <snip>

That's clearly false, amply demonstrated on the other relevant thread.
 
  • #18
Andy Resnick said:
That's clearly false, amply demonstrated on the other relevant thread.

I should have written "phenomenologically" or "heuristically". Physics often proceeds in a heuristic way, and only with hindsight can you figure out how things really work. That's why, i.m.o., the historical approach to physics teaching is not so good.
 
  • #19
Of course! Look up "information entropy", 'negentropy', and Shannon's information theory.

So if I were to roll a pair of dice, would there be any implications for entropy in relation to the numbers that appeared on the faces? For instance would there be any difference in entropy if a seven came up as opposed to a twelve?
 
  • #20
Count Iblis said:
Historically, the thermodynamic relations were found empirically, and units were invented to measure quantities such as temperature. But if a more fundamental theory later arises and we can directly compare what used to be incompatible physical quantities, we end up with new constants, in this case the Boltzmann constant, which does the unit conversion from temperature to energy in the old thermodynamic units.

I am not quite clear about your explanation: Are you saying that Boltzmann's constant did result from the derivation of statistical mechanics or that it was adopted from thermodynamics and used because it was convenient to use it for unit conversion?

James
 
  • #21
James A. Putnam said:
I am not quite clear about your explanation: Are you saying that Boltzmann's constant did result from the derivation of statistical mechanics or that it was adopted from thermodynamics and used because it was convenient to use it for unit conversion?

James

In classical thermodynamics you don't have a Boltzmann constant at all. It appears the moment you relate the thermodynamic quantities to the microscopic degrees of freedom. The value of the Boltzmann constant depends on whatever units one had previously chosen for the temperature and energy, which was rather arbitrary. Those old units are still in use; we don't have the habit of changing our unit systems often.


The reason that it is small when you factor out the J/K unit is that the atomic scale is very far removed from the macro scale.
 
  • #22
Studiot said:
So if I were to roll a pair of dice, would there be any implications for entropy in relation to the numbers that appeared on the faces? For instance would there be any difference in entropy if a seven came up as opposed to a twelve?

Yes. If a twelve comes up, I can tell exactly what the microstate of the system is: both dice are in the "6" state. If a 7 comes up, the set of microstates the system can be in comprises 6 different states.
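
A tiny enumeration makes this explicit (my own Python sketch of the dice example): count the equally likely (die 1, die 2) microstates behind each sum and take k ln of the multiplicity.

[code]
from math import log
from collections import Counter

k_B = 1.380649e-23   # J/K

# All 36 equally likely microstates of two fair dice, grouped by their sum (the "macrostate").
multiplicity = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))

for total in (2, 7, 12):
    W = multiplicity[total]
    print(total, W, k_B * log(W))
# A 12 has W = 1 (the microstate is known exactly, so this contribution is zero);
# a 7 has W = 6, the largest multiplicity of any sum.
[/code]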
 
  • #23
Please go on
 
  • #24
Studiot said:
Please go on


I was assuming that the complete state of each die is specified by which side is up. Of course, if you are considering real dice, then you need to multiply the multiplicity from the uncertainty about the numbers by the number of internal states of the dice.

So, if 7 is the outcome of a throw, then that would contribute kB log(6) to the entropy, 6 being the number of microstates consistent with a sum of 7. But then, you have to realize that the definition of entropy will be sensitive to how you define the coarse-graining procedure that separates the microstates from the macrostates. A slightly different definition will change the entropy of the same system by far more than kB log(6).
 
  • #25
Studiot said:
So if I were to roll a pair of dice, would there be any implications for entropy in relation to the numbers that appeared on the faces? For instance would there be any difference in entropy if a seven came up as opposed to a twelve?

Entropy, in terms of information, is defined by the number of equivalent ways to express a message. The more ways, the higher the entropy. There are more ways to roll a 7 than a 12, so a 7 has a higher entropy associated with it.

Many compression schemes are based on entropy; Huffman coding, for example.

Information theory gets confusing sometimes, because 'information' can be interpreted as 'meaning'. Information is defined as the *opposite* of predictability: how well you can predict the next bit (for example), given that you know the value of the current bit. Hence the idea of 'negentropy'.

http://en.wikipedia.org/wiki/Information_entropy

http://en.wikipedia.org/wiki/Negentropy
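
To put the dice example in information-theoretic units, here is a short Python sketch (my own, not from the references above) of Shannon's H = -sum(p log2 p):

[code]
from math import log2
from collections import Counter

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
p_sums = [c / 36 for c in counts.values()]

print(shannon_entropy(p_sums))         # ~3.27 bits: uncertainty in the sum of two dice
print(shannon_entropy([1/6] * 6))      # ~2.58 bits: one fair die
print(shannon_entropy([1/36] * 36))    # ~5.17 bits: the full (die 1, die 2) microstate
# Knowing only the sum leaves about 5.17 - 3.27 = 1.9 bits of uncertainty about the microstate.
[/code]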
 
  • #26
Count Iblis said:
In classical thermodynamics you don't have a Boltzmann constant at all. It appears the moment you relate the thermodynamic quantities to the microscopic degrees of freedom. The value of the Boltzmann constant depends on whatever units one had previously chosen for the temperature and energy, which was rather arbitrary. Those old units are still in use; we don't have the habit of changing our unit systems often.

I see what you mean. The kinetic theory of gases involves the properties of individual molecules. I need to think about that some more. The concern that remains with me is that the units are joules/(molecule·K). I don't see why the size or scale of the units matters when considering physical meaning. The units still contain both energy and temperature. That brings me back to my question about thermodynamic entropy. Energy is changed from a state of usefulness to one of un-usefulness, but it does this because of something that involves temperature. Both of these properties, regardless of the size of the numbers representing their units, are macroscopic, thermodynamic, classical properties. Thank you for your explanation. I think this example of mixing microscopic scale with macroscopic properties is worth thinking about.

James
 
  • #27
So where does energy or temperature fit into this scheme since it takes exactly the same energy to roll my dice to any result?
 
  • #28
In short, entropy and enthalpy are COMBINATION properties...
I personally did not understand entropy :P
but enthalpy is the combination of internal energy and PV:
H = U + PV
Hope it helps =)
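
As a toy check of that definition (my own illustrative numbers, assuming a monatomic ideal gas): with U = (3/2) n R T and PV = n R T, the enthalpy comes out to H = (5/2) n R T.

[code]
R = 8.314    # J/(mol K)
n = 1.0      # mol of a monatomic ideal gas
T = 300.0    # K

U = 1.5 * n * R * T    # internal energy of a monatomic ideal gas
PV = n * R * T         # ideal gas law: P V = n R T
H = U + PV             # enthalpy by definition, H = U + PV
print(U, PV, H, H / (n * R * T))   # the last value is 2.5, i.e. H = (5/2) n R T
[/code]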
 
  • #29
Studiot said:
So where does energy or temperature fit into this scheme since it takes exactly the same energy to roll my dice to any result?

Yes, unless you let the energy depend on what side is up, this won't affect energy or temperature. So, let's assume that each dot on the dice has a small mass. The energy of a die then depends on which side is up. You can then consider a large number N of dice and then consider how the laws of thermodynamics are modified.
 
  • #30
Bearing in mind that the dice could be virtual, I have already stated that there is no energy involved in rolling them.

So please go on and put some numbers to the entropy or the entropy change, since as far as I can see this should be zero divided by the temperature in the experiment room.

Further, do the rolls have to be random, e.g. via a die? Or can I just state the number 7 or 12 or whatever?

And if I either roll the dice or state 7, does the entropy of the universe increase by some amount, due to the number I come up with, every time I do this?

One of my observations about classical entropy is that it is an extensive property (energy) divided by an intensive property (temperature). Where does that leave us, and how does the statistical interpretation cope?
 
  • #31
Count Iblis said:
Yes, unless you let the energy depend on what side is up, this won't affect energy or temperature. So, let's assume that each dot on the dice has a small mass. The energy of a die then depends on which side is up. You can then consider a large number N of dice and then consider how the laws of thermodynamics are modified.

Then the dice aren't fair. See, for example, Las Vegas dice; those are balanced.

http://www.gpigaming.com/usa_products_dice.shtml [Broken]
 
  • #32
Although it may sometimes make sense to define an information entropy for dice, it is certainly not of much help to call it the "physical" entropy of the dice. A die lying on the table will stay with the same number up forever; it will not explore the microcanonical states that are in principle accessible to it. In other words, a die is not ergodic. That's an important difference from a thermodynamic system.
 
  • #33
Yes, the "dice number degrees of freedom" will not reach thermal equilibrium. But once they are thrown and are rolling, then thermodynamics can beapplied. When they stop rolling, they will be frozen in some state.

This looks like how in the early universe the neutron/proton ratio was frozen at exp(-delta m c^2/k T) with delta m the mass difference between neutrons and protons and T the freeze-out temperature, explaining the present day ratio of hydrogen and helium.
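
As a rough order-of-magnitude sketch of that freeze-out factor (my own Python snippet; the mass difference is the standard value, while the freeze-out temperature of about 0.8 MeV is an assumed, illustrative textbook-level figure):

[code]
from math import exp

delta_m_c2 = 1.293   # MeV: neutron-proton mass-energy difference
kT_freeze = 0.8      # MeV: assumed freeze-out temperature (illustrative)

ratio = exp(-delta_m_c2 / kT_freeze)
print(ratio)   # ~0.2, i.e. roughly one neutron for every five or six protons at freeze-out
[/code]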
 
  • #34
But that doesn't answer my other questions.

In particular the one about just stating the number. Of course I could pick any number.

And doesn't that bring us straight to that famous phrase

"In the beginning was the Word"

At what point did we stop talking physics?
 
  • #35
DrDu said:
Although it may sometimes make sense to define an information entropy for dice, it is certainly not of much help to call it the "physical" entropy of the dice. A die lying on the table will stay with the same number up forever; it will not explore the microcanonical states that are in principle accessible to it. In other words, a die is not ergodic. That's an important difference from a thermodynamic system.

It's an interesting problem... if you roll dice but don't look at them, you can apply statistical methods and ergodic processes to make predictions. But once you look at the result, you can't. By the same token, while you are receiving a message from me, the entropy has a meaning. Once you have the message, the entropy is zero, because there is no uncertainty.

Sounds suspiciously like a 'measurement problem'...
 

1. What is entropy?

Entropy is a measure of the number of microscopic states available to a system, often loosely described as its disorder or randomness. It is a thermodynamic property related to the amount of a system's energy that is unavailable for doing work.

2. How is entropy related to enthalpy?

Enthalpy is a measure of the total energy of a system, including both its internal energy and the work required to change its volume. The difference between the enthalpy and the product of the temperature and entropy is known as the Gibbs free energy.

3. What is the difference between entropy and enthalpy?

The main difference is that entropy quantifies the number of accessible microstates (units of J/K), while enthalpy, H = U + PV, measures a system's total heat content (units of J). Both entropy and enthalpy are state functions.

4. How does entropy affect chemical reactions?

Entropy plays a crucial role in determining the spontaneity of a chemical reaction. A reaction is spontaneous if it increases the total entropy of the system plus its surroundings; at constant temperature and pressure this is equivalent to a decrease in the Gibbs free energy. A reaction that lowers the system's own entropy can still be spontaneous if it releases enough heat to the surroundings.

5. Can entropy be reversed?

According to the second law of thermodynamics, the total entropy of an isolated system never decreases over time. While it is possible to decrease the entropy of a particular system, doing so requires an input of energy and produces an at least equal increase in entropy elsewhere. Therefore, the overall entropy of the universe tends to increase.
