What is Entropy? Understanding the Measurement

  • Thread starter imjustcurious
  • Tags
    Entropy
In summary, the thread discusses whether entropy is just a term created by humans to account for our uncertainty about a system, or whether it reflects a physical reality in its own right.
  • #1
imjustcurious
Entropy has been described to me many different ways. Overall, I understand it as a measurement of randomness or disorder in a system. This doesn't make sense to me; it seems like a term made for humans. For example, if we knew all the information about every particle in the universe, wouldn't the universe be deterministic? We can't do that because we are human. Looked at simply, every particle carries information and isn't random. The only reason it seems random is that we can't observe it all at once.

I'm sure this is a stupid question and I have something wrong. It just isn't sitting well in my head.
 
  • #2
Entropy is a probabilistic measurement that reflects our uncertainty about a system. I'm a computer guy, far more familiar with information entropy, but I understand physics entropy is essentially the same. Suppose you have a box divided into two sides. If all the particles are on one side of the box, the entropy is minimal, but as they spread out, the entropy increases. Now suppose we know there's some particle we'll call Joe but don't know which one it is. If the entropy is minimal, we have total information about which side of the box Joe is on, but as the particles spread out and rattle around and the entropy increases, our uncertainty heads toward 50/50.

The reason the idea of entropy is powerful is that we don't HAVE to have total information on where Joe is to know something about Joe. Even not knowing which particle Joe is, we still have total certainty that Joe is on the left side of the box if all the particles are, and 80% certainty he's on the right if 80% of the particles are. When the entropy is at its maximum, we have maximum uncertainty about Joe's location; it's basically 50/50.

The beauty and power of the idea is that it lets us know something useful when there are too many details to actually know what's going on at the small level.
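If it helps to see the numbers, here's a minimal sketch of the Joe-in-the-box picture above. It's just an illustration I'm adding, not standard physics code, and the function name side_entropy_bits is made up for this example.

Code (Python):
import math

def side_entropy_bits(p_left):
    # Illustration only; name made up for this example.
    # Shannon entropy (in bits) of Joe's location when a fraction
    # p_left of the particles sit on the left side of the box.
    probs = [p_left, 1.0 - p_left]
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(side_entropy_bits(1.0))   # all particles on one side: zero uncertainty about Joe
print(side_entropy_bits(0.8))   # 80/20 split: about 0.72 bits of uncertainty
print(side_entropy_bits(0.5))   # 50/50 split: 1 full bit, maximum uncertainty

As the particles spread toward 50/50, the uncertainty about Joe climbs to its maximum, which is exactly the sense in which entropy measures what we don't know about the details.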
 
  • #4
Thank you for the explanation; it is a bit clearer. However, the main question that is boggling my mind is still unanswered. You said...

"The beauty and power of the idea is that it lets us know something useful when there are too many details to actually know what's going on at the small level."

"-it lets US know"
Us as in humans, because we can't understand all the particles' interactions at the same time. So it is a term coined by humans for practical use. But realistically there isn't entropy, because it is somehow plausible to know all the information and make the system deterministic. As humans, though, we can't.
Isn't this true?
 
  • #6
imjustcurious said:
So it is a term coined by humans for practical use.
All physics terms are terms coined by humans. Some are more practical than others.

The rest of what you said isn't true anyway. Classically, you could consider entropy to be related to ignorance of many of the fine degrees of freedom, with some underlying deterministic "truth". Unfortunately, at the level of detail that you would require, the world simply isn't classical.
 
  • #7
imjustcurious said:
Thank you for the explanation; it is a bit clearer. However, the main question that is boggling my mind is still unanswered. You said...

"The beauty and power of the idea is that it lets us know something useful when there are too many details to actually know what's going on at the small level."

"-it lets US know"
Us as in humans, because we can't understand all the particles' interactions at the same time. So it is a term coined by humans for practical use. But realistically there isn't entropy, because it is somehow plausible to know all the information and make the system deterministic. As humans, though, we can't.
Isn't this true?
Yeah, this gets into physics, which I'm just starting to learn, but it's not my forte. What I do remember reading, though, was that Niels Bohr said that quantum mechanics makes sense if we accept the idea that probability is fundamentally real. If the basic building blocks of our universe require probability to be real, then it is real, and entropy is physically real too. But I don't see why it couldn't also be used to describe systems where the details could be known but aren't.
 
  • #8
DaleSpam said:
All physics terms are terms coined by humans. Some are more practical than others.
Yes, this is true. So in the universe there isn't entropy. Nothing is random, and it is just an idea to account for OUR uncertainty about a system. So if there isn't entropy, then the universe is deterministic. The laws of physics actually exist and govern our universe. Entropy is just used to say that for this volume we are uncertain by this much, but realistically there is no uncertainty?
(Sorry if I'm just not understanding something that seems to be obvious)
 
Last edited:
  • #9
imjustcurious said:
Yes, this is true. So in the universe there isn't entropy. Nothing is random, and it is just an idea to account for OUR uncertainty about a system. So if there isn't entropy, then the universe is deterministic. The laws of physics actually exist and govern our universe. Entropy is just used to say that for this volume we are uncertain by this much, but realistically there is no uncertainty?
(Sorry if I'm just not understanding something that seems to be obvious)
The question you have to ask is: deterministic for whom? Not us. For an all-knowing being? We were talking about something related in the computer science forum; my post here mentions a situation we can create where a computer program always comes to the right answer in finite time but can never KNOW that it came to the right answer, or a logical paradox results that brings down everything. That same paradox would exist for an all-knowing observer of the system, but not for an observer with limited knowledge. So there are problems with the idea of an all-knowing observer.
 
  • #10
imjustcurious said:
Yes, this is true. So in the universe there isn't entropy. Nothing is random, and it is just an idea to account for OUR uncertainty about a system. So if there isn't entropy, then the universe is deterministic. The laws of physics actually exist and govern our universe. Entropy is just used to say that for this volume we are uncertain by this much, but realistically there is no uncertainty?
(Sorry if I'm just not understanding something that seems to be obvious)
We don't know if the universe is deterministic or random. However, this view that the universe is deterministic and that the entropy is due to our ignorance is an acceptable one.

http://necsi.edu/projects/baranger/cce.pdf
Chaos, Complexity, and Entropy
A physics talk for non-physicists
 
  • #11
imjustcurious said:
So in the universe there isn't entropy. Nothing is random
I didn't say any of that. I was just pointing out that your objection is entirely uninformative since it can be applied to all physics terms.
 
  • #12
Fooality said:
The question you have to ask is deterministic for who? Not us. For an all-knowing being?
I'm saying, isn't it possible that the universe, for any being, is deterministic? But what can be determined from the information depends on the being observing it. For example, we could know that the universe is deterministic, but we just can't determine what will happen because we aren't able to.
 
  • #13
DaleSpam said:
I didn't say any of that. I was just pointing out that your objection is entirely uninformative since it can be applied to all physics terms.
Sorry, I phrased it badly, which made it sound like you said that. I was just stating what I thought: that since entropy is due to our "ignorance", there is no randomness in the universe and it is deterministic.
 
  • #14
atyy said:
We don't know if the universe is deterministic or random. However, this view that the universe is deterministic and that the entropy is due to our ignorance is an acceptable one.
Do you know what a different view might be? How might entropy not be due to our ignorance?
 
  • #15
imjustcurious said:
That since entropy is due to our "ignorance", there is no randomness in the universe and it is deterministic.
That we are ignorant is factual. That there is underlying determinism is pure conjecture, which does not follow from the fact of our ignorance, nor does it follow from the things that we do know.
 
  • #16
imjustcurious said:
Do you know what a different view might be? How might entropy not be due to our ignorance?

I agree with what DaleSpam said above. We don't know whether the universe is deterministic or not (how could we know the "ultimate laws", whatever that may mean?). However, within the framework of classical physics, the laws are certainly deterministic, and there one understands entropy as springing from our ignorance. The quantum case is trickier, and in fact still being researched, but I believe that essentially the same idea holds (e.g. http://arxiv.org/abs/1007.3957).

So I don't know a different view. I'm just saying that I can't rule out that someone will come up with a different view that makes sense.
 
  • #17
imjustcurious said:
I'm saying, isn't it possible that the universe, for any being, is deterministic? But what can be determined from the information depends on the being observing it. For example, we could know that the universe is deterministic, but we just can't determine what will happen because we aren't able to.

(Going off the Wikipedia definition: determinism is the philosophical position that for every event there exist conditions that could cause no other event.)
The tricky part is certainty about the conditions. For us humans, its limited by our observation. There will always be situations - most in fact - where our observations are limited, and therefore our observations are inadequate for predicting the outcome of events in a deterministic universe. What I'm racking my brain over is if we could ever know that the universe was deterministic even if it was, or if any observer could ever experience the universe as deterministic, always knowing outcomes from observations. The latter I doubt, as I said. I can't see how we'll ever experience a deterministic universe even if we did live in one, or honestly, even know that we were living in one. The biggest problem is I can't see how either idea is falsifiable. If someone claimed the universe was deterministic and you walked in with a by-definition black box of non-determinism, they could always claim ignorance as to the deterministic processes that guide it, that must be there. Or alternatively if someone claims a non-deterministic universe, you could never show there isn't some situation which is beyond your capability to predict outcomes which might be non-deterministic.
 
Last edited:
  • #18
Thanks, this helped a lot. I was very confused.
 

1. What is entropy and why is it important in science?

Entropy is a measure of the disorder or randomness in a system. In science, it is important because it helps us understand the direction and efficiency of processes, such as chemical reactions and energy transfer.

2. How is entropy measured and what are its units?

Entropy is measured in units of joules per kelvin (J/K) in the International System of Units (SI). In statistical mechanics it is calculated using Boltzmann's formula S = k_B ln W, where k_B is the Boltzmann constant and W is the number of microstates corresponding to the system's macrostate.
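As a rough illustration (a minimal sketch added here, not part of the original answer; the function name boltzmann_entropy is made up for the example), the formula can be evaluated directly:

Code (Python):
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(W):
    # Illustration only; name made up for this example.
    # S = k_B * ln(W), with W the number of microstates of the macrostate.
    return K_B * math.log(W)

print(boltzmann_entropy(1))      # a single microstate gives S = 0 J/K
print(boltzmann_entropy(1e24))   # roughly 7.6e-22 J/K; S grows only logarithmically with W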

3. What is the relationship between entropy and the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. This means that in any spontaneous process, the total amount of disorder or randomness increases (staying constant only for an idealized reversible process), and the system moves towards a state of maximum entropy.

4. Can entropy be reversed or decreased?

In an isolated system, total entropy can never decrease, but entropy can be decreased locally at the expense of a larger increase in entropy elsewhere. For example, a refrigerator decreases the entropy of its contents, but only at the cost of increasing the entropy of the surrounding environment by at least as much.

5. How does entropy relate to information theory?

In information theory, entropy is used to measure the amount of uncertainty or randomness in a message or signal. The higher the entropy, the more unpredictable the message is. This concept is used in fields such as data compression and cryptography.
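For example, here is a minimal sketch (an illustration added here, not a standard library routine; the function name message_entropy_bits is made up) that estimates the Shannon entropy per symbol of a string from its observed symbol frequencies:

Code (Python):
import math
from collections import Counter

def message_entropy_bits(message):
    # Illustration only; name made up for this example.
    # Shannon entropy per symbol (in bits), estimated from symbol frequencies.
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

print(message_entropy_bits("aaaaaaaa"))   # zero bits per symbol: perfectly predictable
print(message_entropy_bits("abcdabcd"))   # 2.0 bits per symbol: four equally likely symbols

A repetitive message carries no surprise per symbol, while a message whose symbols are all equally likely has maximum entropy, which is why it is harder to compress or predict.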
