Maximum entropy and Lagrange multipliers

SUMMARY

The discussion focuses on maximizing entropy using the Lagrange multiplier method to derive the probability density function, ρ(x). The entropy is defined as S = -∫ ρ(x) ln(ρ(x)) dx, with constraints on the mean value, \bar{x} = ∫ xρ(x)dx, and normalization, ∫ ρ(x) dx = 1. The resulting expression for ρ(x) is given by ρ(x) = e^{-(1 + λ₁ + xλ₂)}, which is normalized to ρ(x) = e^{-xλ₂}/∫ e^{-xλ₂} dx. The user seeks further guidance on simplifying or improving this expression.

PREREQUISITES
  • Understanding of entropy in statistical mechanics
  • Familiarity with Lagrange multipliers in optimization
  • Knowledge of probability density functions
  • Basic calculus, particularly integration techniques
NEXT STEPS
  • Explore the derivation of the exponential family of distributions
  • Study normalization techniques for probability density functions
  • Learn about the method of Lagrange multipliers in greater detail
  • Investigate applications of entropy maximization in statistical mechanics
USEFUL FOR

Mathematicians, physicists, statisticians, and anyone interested in optimization techniques in probability theory and statistical mechanics.

Nico045
Hello, I have to find the probability density that maximizes the entropy, subject to the following constraints:
[tex]\bar{x} = \int x\rho(x)\,dx[/tex]
[tex]\int \rho(x)\, dx = 1[/tex]

The entropy is [tex]S = -\int \rho(x) \ln(\rho(x))\, dx[/tex]

[tex]L = -\int \rho(x) \ln(\rho(x))\, dx - \lambda_1 \left( \int \rho(x)\, dx - 1 \right) - \lambda_2 \left(\int x \rho(x)\,dx - \bar{x}\right)[/tex]

Setting the functional derivative to zero (the integrand itself must vanish, so no integral remains):

[tex]\frac {\delta L } { \delta \rho(x) } = - \ln(\rho(x)) - 1 - \lambda_1 - x \lambda_2 = 0[/tex]

[tex]\rho(x) = e^{-(1 + \lambda_1 + x \lambda_2)}[/tex]

Now I use the normalisation condition:

[tex]\int \rho(x) dx = 1 = e^{-(1 + \lambda_1) } \int e^{-x\lambda_2} dx \Rightarrow e^{-(1 + \lambda_1) } = \frac{1}{\int e^{-x\lambda_2} dx}[/tex]

[tex]\rho(x) = \frac{e^{-x \lambda_2}}{\int e^{-x\lambda_2} dx}[/tex]
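The remaining integral can actually be evaluated if the support of ρ is specified. Assuming x ∈ [0, ∞) with λ₂ > 0 (an assumption; the post does not state the domain), the normalisation fixes the prefactor and the mean constraint fixes λ₂, giving the familiar exponential distribution:

```latex
% Assuming support x \in [0,\infty) and \lambda_2 > 0:
\int_0^\infty e^{-x\lambda_2}\, dx = \frac{1}{\lambda_2}
\;\;\Longrightarrow\;\;
\rho(x) = \lambda_2\, e^{-x\lambda_2}.

% The mean constraint then determines \lambda_2:
\bar{x} = \int_0^\infty x\, \lambda_2 e^{-x\lambda_2}\, dx = \frac{1}{\lambda_2}
\;\;\Longrightarrow\;\;
\rho(x) = \frac{1}{\bar{x}}\, e^{-x/\bar{x}}.
```

On a different domain (e.g. all of ℝ, where the mean-constrained problem has no normalizable maximizer, or a finite interval) the closed form changes, so the domain assumption matters.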

From there I don't really know what to do. How can I get a simpler expression for this?
 
Does anyone have an idea? Maybe I can't do better.
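As a numerical sanity check (my own sketch, not from the thread): assuming the support is [0, ∞), the maximizer with mean x̄ should be the exponential density ρ(x) = e^{-x/x̄}/x̄. The snippet below compares its entropy S = -∫ ρ ln ρ dx against another density with the same mean (a gamma density, chosen arbitrarily as a competitor) and confirms the exponential wins:

```python
import numpy as np

xbar = 2.0                          # target mean (arbitrary for the demo)
x = np.linspace(1e-9, 80.0, 400_000)
dx = x[1] - x[0]

def entropy(rho):
    """Differential entropy -∫ rho ln(rho) dx via a simple Riemann sum."""
    p = np.clip(rho, 1e-300, None)  # avoid log(0) where rho underflows
    return -np.sum(p * np.log(p)) * dx

# Candidate from the derivation: exponential with lambda_2 = 1/xbar.
rho_exp = np.exp(-x / xbar) / xbar

# Competitor with the same mean: a gamma(k=2) density, mean = 2*theta = xbar.
theta = xbar / 2.0
rho_gam = x * np.exp(-x / theta) / theta**2

print(np.sum(x * rho_exp) * dx)     # ≈ xbar: mean constraint holds
print(np.sum(x * rho_gam) * dx)     # ≈ xbar: same mean
print(entropy(rho_exp))             # 1 + ln(xbar) ≈ 1.693
print(entropy(rho_gam))             # ≈ 1.577, strictly smaller
```

The exponential's entropy matches the analytic value 1 + ln(x̄), and any other same-mean density on [0, ∞) should come out lower, consistent with the Lagrange-multiplier result.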
 
