Small scale entropy and information loss

  • #1

anorlunda

Staff Emeritus
Insights Author
I'm struggling to understand the implications and origins of the 2nd law.

Entropy is such a slippery subject. Wikipedia has many, many definitions of entropy. I've been studying Professor Susskind's physics lectures, so I'm most interested in his favorite definition: information is conserved, and entropy is a measure of hidden information. I also know that entropy is only supposed to be a macro concept, but I can't help thinking about it on the micro level.

Suppose we have two particles that come together and collide. Particle 1 has momentum p1 and particle 2 has momentum p2. After the collision the products are particle 3 with momentum p3 and particle 4 with momentum p4. It should work with any particles, but I'm thinking of electrons and photons.

Let us say that the collision conserves mass, energy and momentum, but I'll only consider momentum.

Conservation of momentum gives us the vector equation (p3+p4)=(p1+p2). p3 and p4 are observable. However, there are infinitely many pairs of p1 and p2 that satisfy (p3+p4)=(p1+p2). The information about which specific p1 and p2 we started with is lost, or at least hidden. Right?
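
For concreteness, here is a minimal numerical sketch of that degeneracy (1D momenta in arbitrary units, values chosen purely for illustration): momentum conservation alone fixes only the sum p1 + p2, so the observed p3 and p4 are consistent with infinitely many candidate pairs (p1, p2).

```python
import numpy as np

# Illustrative sketch (assumed 1D momenta, arbitrary units): momentum
# conservation alone fixes only the sum p1 + p2, so any split of that
# sum is a valid candidate for the pre-collision state.
p3, p4 = 2.0, 3.0          # observed post-collision momenta
p_total = p3 + p4          # conserved total momentum

# Sample a few of the infinitely many (p1, p2) pairs consistent with p_total.
for p1 in np.linspace(-10.0, 10.0, 5):
    p2 = p_total - p1
    assert np.isclose(p1 + p2, p_total)
    print(f"p1 = {p1:+6.2f}, p2 = {p2:+6.2f}, p1 + p2 = {p1 + p2:.2f}")
```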

Does the entropy of this system increase because of the collision? On one hand, the quantity of information seems the same, i.e. the values of p1 and p2 pre-collision versus the values of p3 and p4 post-collision. However, the specific values of p1 and p2 cannot be observed post-collision. In other words, the number of bits needed to describe the momenta seems conserved, but the message encoded by the pre-collision bits is destroyed.
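
One way to sharpen that "hidden versus destroyed" distinction, purely as an analogy and not a model of the collision, is to compare a one-to-one map with a many-to-one map on a small set of states: the first only re-encodes the message and can be undone, while the second collapses distinct inputs onto the same output and genuinely destroys distinctions.

```python
# Toy analogy only: a one-to-one (reversible) update merely re-encodes the
# state, so the original message is hidden but recoverable; a many-to-one
# (irreversible) update destroys it, because distinct inputs end up
# indistinguishable.
states = [0, 1, 2, 3]

reversible = {s: (s + 1) % 4 for s in states}    # a permutation: invertible
irreversible = {s: s // 2 for s in states}       # two inputs -> one output

print("reversible outputs:  ", sorted(set(reversible.values())))    # 4 distinct states survive
print("irreversible outputs:", sorted(set(irreversible.values())))  # only 2 states survive
```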

The next logical step is to consider time reversal symmetry and the reversibility of the 2nd law, which led me to Loschmidt's paradox. I confess that the explanations in that article are beyond my understanding, so I'll limit my question.

Is this way of thinking about particles, information and entropy valid?
 

Answers and Replies

  • #2
In general, that classical (!) collision of point-like particles (!) will conserve the information content. Afterwards, you don't know p1 and p2 - but before, you don't know p3 and p4, so you have the same amount of knowledge.
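
A sketch of why no information is actually lost in the classical case, assuming a 1D elastic collision of two point masses with illustrative numbers: if you know the dynamics and the masses, the post-collision velocities determine the pre-collision velocities exactly, because the collision map is invertible (in 1D it even happens to be its own inverse).

```python
def elastic_1d(m1, m2, v1, v2):
    """Post-collision velocities for a 1D elastic collision of two point masses."""
    v1_out = ((m1 - m2) * v1 + 2.0 * m2 * v2) / (m1 + m2)
    v2_out = ((m2 - m1) * v2 + 2.0 * m1 * v1) / (m1 + m2)
    return v1_out, v2_out

# Illustrative masses and "unknown" initial velocities (arbitrary units).
m1, m2 = 1.0, 2.0
v1, v2 = 3.0, 0.0

v3, v4 = elastic_1d(m1, m2, v1, v2)          # what we actually observe
v1_rec, v2_rec = elastic_1d(m1, m2, v3, v4)  # applying the same map again undoes it

print(f"observed:      v3 = {v3:+.3f}, v4 = {v4:+.3f}")
print(f"reconstructed: v1 = {v1_rec:+.3f}, v2 = {v2_rec:+.3f}")  # matches the initial v1, v2
```

So in this classical, point-particle setting the pre-collision state is hidden from an observer who only sees the outgoing particles, but it is not erased: the dynamics is one-to-one.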
 
