Understanding Brillouin's Entropy Decrease

  • Context: Graduate
  • Thread starter: LTP
  • Tags: Brillouin, Entropy
Discussion Overview

The discussion revolves around Brillouin's treatment of entropy decrease in the context of information theory, specifically how he transitions from the expression k·Δ(log P) to -k·(p/P_0). Participants explore the implications of Brillouin's ideas, their historical context, and the mathematical reasoning behind the entropy decrease, while also addressing the concept of "negentropy" and its relevance.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Historical

Main Points Raised

  • One participant questions the transition from k·Δ(log P) to -k·(p/P_0) and seeks clarification on Brillouin's reasoning.
  • Another participant critiques the relevance of Brillouin's work, suggesting it is outdated and not taken seriously in modern discussions.
  • A different participant acknowledges reading Brillouin's paper as a historical document and expresses uncertainty about the mathematical steps involved.
  • Some participants clarify that the approximation used in Brillouin's work is valid under the condition that p is much smaller than P_0, relating it to the Taylor expansion of log(1 - ε).
  • There is a discussion about the term "negentropy," with some participants expressing strong opinions against its use and relevance in contemporary physics.
  • One participant raises a question regarding the absence of the 3/2 factor in the context of thermal energy in Brillouin's example involving a gas at constant temperature.

Areas of Agreement / Disagreement

Participants express differing views on the validity and relevance of Brillouin's ideas, with some acknowledging his work as obsolete while others seek to understand it. The discussion remains unresolved regarding the implications of "negentropy" and the specific mathematical details of Brillouin's entropy decrease.

Contextual Notes

Participants note that Brillouin's concepts may be outdated and that the discussion includes references to historical documents and modern critiques. The mathematical steps and assumptions underlying the entropy decrease are not fully resolved, particularly concerning the interpretation of thermal energy.

Who May Find This Useful

This discussion may be of interest to those studying the historical development of information theory, the relationship between information and entropy, and the critiques of classical theories in modern physics.

LTP
I do not quite understand how Brillouin goes from [tex]k\cdot \Delta (\log P)[/tex] to [tex]-k\cdot \frac{p}{P_0}[/tex] in this context:

Once the information is obtained, it can be used to decrease the entropy of the system. The entropy of the system is
[tex]S_0=k\ln P_0[/tex]
according to Boltzmann's formula, where [tex]P_0[/tex] represents the total number of microscopic configurations (Planck's "complexions") of the system. After the information has been obtained, the system is more completely specified: P is decreased by an amount p, so that P_1 = P_0 - p. The entropy decrease is then
[tex]\Delta S_i = S_1-S_0 = k\cdot \Delta (\log P) = -k\cdot \frac{p}{P_0}[/tex]
It is obvious that p << P_0 in all practical cases.
from "Maxwell's Demon cannot operate: Information and Entropy", L. Brillouin, 1950.

Could anybody offer a meaningful explanation?

[I added the "The entropy decrease is then"-bit because the tex wouldn't display properly.]
 
Gosh, why are you reading THAT?

[EDIT: In this post I was responding to an earlier version of Post #1 in this thread, in which due to what turned out to be a spelling error, the OP appeared to mention the sternly deprecated term "negentropy", which provoked me to order all hands to action stations, as it were! See my Post #5 below for further discussion of this misunderstanding.]

I hope you are not reading that paper because someone recommended it (by the way, shouldn't you cite it properly?), but only because you stumbled over it, not realizing that this is a bit like stumbling over a treatise on Ptolemy's model of the solar system and studying it in ignorance of the fact that the model was discarded long ago!

Similarly, Brillouin eventually developed his ideas on information theory into a book (L. Brillouin, Science and Information Theory, Academic Press, 1962) which was obsolete when it came out and has long ago been tossed by mathphy researchers into the dustbin of failed scientific monographs. In particular, the concept of negentropy (you misspelled the word!), of which he made such a fuss in that book, was never a sensible quantity to define, was never taken seriously by the mathematical literati, never became standard in math/physics, and nowadays is only used by persons (mostly biologists) who don't realize how silly it makes them sound (kinda like boasting about your gaily painted new donkey cart, not realizing that all your neighbors drive Ferrari roadsters).

A good place to start learning about more modern approaches might be Cover & Thomas, Elements of Information Theory, Wiley, 1991, followed by the old Scientific American article by Charles Bennett. ("Explanations" of Maxwell's demon remain controversial to this day, but Brillouin's ideas were firmly discarded long, long ago; Bennett's ideas are at least still seriously discussed.)

With that out of the way, if you promise to obtain a modern book, we can discuss the underlying question (discarding the absurd notion of "negentropy", which isn't helping here or anywhere else that I know of).
 
Sorry, it should have been "entropy", not "netropy" (or "negentropy"). I've corrected it now. [EDIT: I also corrected the formulas, so please reread.]

Yes, I am reading this paper, but only as a "historical" document. I am aware that Brillouin's ideas are obsolete. I do have Bennett's article (and Landauer's article on erasure).
This article, along with the two others and numerous more, is reprinted in "Maxwell's Demon 2" by Leff and Rex, which is basically a compilation of different more or less relevant articles about Maxwell's demon, Smoluchowski's trapdoor and the Szilard engine.

Anyway, back to the original question. I'm sure Brillouin could do his math, I'm just not quite sure how :)
 
LTP said:
I do not quite understand how Brillouin goes from [tex]k\cdot \Delta (\log P)[/tex] to [tex]-k\cdot \frac{p}{P_0}[/tex] in this context:


from "Maxwell's Demon cannot operate: Information and Entropy", L. Brillouin, 1950.

Could anybody offer a meaningful explanation?

[I added the "The entropy decrease is then"-bit because the tex wouldn't display properly.]

he is using [tex]\Delta (\log P) = \log(P_0-p) - \log(P_0) = \log\left(1 - \frac{p}{P_0}\right) \approx -\frac{p}{P_0}[/tex]

where the last step is the first term of the Taylor expansion of [tex]\log(1-\epsilon)[/tex], so it's valid as long as p is much smaller than P_0.
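This approximation is easy to check numerically; a minimal sketch, assuming illustrative values for P_0 and p:

```python
import math

# Brillouin's step: Delta(log P) = log(P0 - p) - log(P0)
#                                = log(1 - p/P0) ~ -p/P0   for p << P0
P0 = 1e6   # total number of complexions (illustrative value)
p = 10.0   # reduction after the measurement (illustrative value)

exact = math.log(P0 - p) - math.log(P0)   # log(1 - p/P0)
approx = -p / P0                          # first-order Taylor term

# log(1-x) = -x - x^2/2 - ..., so the relative error is about x/2 = p/(2*P0)
rel_error = abs((exact - approx) / exact)
print(exact, approx, rel_error)   # rel_error is about 5e-6 here
```

The smaller p/P_0 is, the better the first-order term works, which is why Brillouin remarks that p << P_0 "in all practical cases".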
 
I assume Nrqed (who said what I was going to say) cleared up the problem, but I can't help adding some remarks on the "negentropy" flap:

LTP said:
Sorry, it should have been "entropy", not "netropy" (or "negentropy")...

Yes, I am reading this paper, but only as a "historical" document. I am aware that Brillouin's ideas are obsolete.

Well, as this shows, mentioning "negentropy" or "Shannon-Weaver entropy" [sic] in my presence is like waving a red flag--- I'll charge!

When you have a spare half hour, you might get a kick out of [thread=200063]this thread[/thread] (gosh, 63 threads earlier and I would have started the 200,000th PF thread since the dawn of time!) and [thread=199303]this thread[/thread], which are examples of threads in which various posters bewail a phenomenon well supported by observation, namely that few newbie PF posters seem to bother to
  • read carefully,
  • write carefully, or even to obey such basic rules as checking their spelling
As the cited threads show, there has been some spirited discussion about how to try to train them to do things the scholarly way.
 
nrqed said:
he is using [tex]\Delta (\log P) = \log(P_0-p) - \log(P_0) = \log\left(1 - \frac{p}{P_0}\right) \approx -\frac{p}{P_0}[/tex]

where the last step is the first term of the Taylor expansion of [tex]\log(1-\epsilon)[/tex], so it's valid as long as p is much smaller than P_0.
Ah yes, thank you.

Chris Hillman said:
I assume Nrqed (who said what I was going to say) cleared up the problem, but I can't help adding some remarks on the "negentropy" flap:
Well, as this shows, mentioning "negentropy" or "Shannon-Weaver entropy" [sic] in my presence is like waving a red flag--- I'll charge!

When you have a spare half hour, you might get a kick out of [thread=200063]this thread[/thread] (gosh, 63 threads earlier and I would have started the 200,000th PF thread since the dawn of time!) and [thread=199303]this thread[/thread], which are examples of threads in which various posters bewail a phenomenon well supported by observation, namely that few newbie PF posters seem to bother to
  • read carefully,
  • write carefully, or even to obey such basic rules as checking their spelling
As the cited threads show, there has been some spirited discussion about how to try to train them to do things the scholarly way.

Yes, sorry, I will take that into consideration next time.
About negentropy: Brillouin mentions it in the next paragraph. :) While we are at it, could you explain why it is k*T_0, and not (3/2)*k*T_0, in this context:
Our system is composed of the following elements:
1) A charged battery and an electric bulb, representing the electric torch
2) A gas at constant temperature T_0 contained in Maxwell's enclosure [...]
[..]
The battery heats the filament to a high temperature T_1:
T_1 >> T_0
This condition is required, in order to obtain visible light:
h*nu >> k*T_0
that can be distinguished from the background of the blackbody radiation in the enclosure at temperature T_0
So E_light = h*nu, but what is k*T_0? It can't be the thermal energy of the gas particles, since the 3/2 factor is missing, or what?
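For scale, the condition h*nu >> k*T_0 can be checked numerically; a sketch, where the 500 nm wavelength and 300 K enclosure temperature are illustrative assumptions (not values from Brillouin's paper):

```python
# Compare the photon energy h*nu of visible light with the thermal
# energy scale k*T0 of the enclosure.
h = 6.626e-34   # Planck constant, J s
k = 1.381e-23   # Boltzmann constant, J/K
c = 2.998e8     # speed of light, m/s

wavelength = 500e-9   # green light, m (assumed)
T0 = 300.0            # enclosure temperature, K (assumed)

E_photon = h * c / wavelength   # h*nu
E_thermal = k * T0              # k*T0

ratio = E_photon / E_thermal
print(ratio)   # roughly 96: h*nu exceeds k*T0 by about two orders of magnitude
```

Note that k*T_0 here is just the characteristic thermal energy scale of the enclosure radiation; order-unity factors such as 3/2 are irrelevant to an order-of-magnitude condition like h*nu >> k*T_0.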
 
