Minimize entropy of P = maximize entropy of ?

  • Context: Graduate
  • Tags: Entropy
SUMMARY

The discussion centers on the relationship between minimizing the entropy of a distribution P and maximizing the entropy of a transformed distribution P'. The user seeks to reformulate an entropy minimization problem into an entropy maximization problem, specifically regarding a conditional probability distribution p(x|y). The consensus is that it is not possible to directly equate the minimization of entropy in P with the maximization of entropy in P' through a transformation. The discussion highlights the complexity and potential triviality of such transformations.

PREREQUISITES
  • Understanding of entropy in probability distributions
  • Knowledge of conditional probability distributions, specifically p(x|y)
  • Familiarity with transformations in statistical distributions
  • Basic concepts of optimization in statistical contexts
NEXT STEPS
  • Research the properties of entropy in probability theory
  • Explore conditional probability and its applications in statistics
  • Study transformations of random variables and their implications
  • Investigate optimization techniques in statistical modeling
USEFUL FOR

Statisticians, data scientists, and researchers interested in entropy optimization and its applications in probability theory.

mnb96

Hello,
I am facing the following problem:

- I have a distribution (or function) that depends on some parameter.
- I want to find the parameter value that minimizes the entropy of the distribution.

In the particular situation I am facing, I really need to reformulate this as an entropy-maximization problem.

In other words: is it possible to find a distribution P' such that maximizing the entropy of P' is equivalent to minimizing the entropy of P?
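For concreteness, the minimization half of the problem can be sketched numerically. The parametric family below (a softmax that sharpens as the parameter grows) is a hypothetical stand-in, not something from the thread:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (in nats) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # 0 * log 0 is taken as 0
    return -np.sum(p * np.log(p))

def p_x(theta, n=5):
    """Hypothetical family: softmax weights that concentrate as theta grows."""
    w = np.exp(theta * np.arange(n))
    return w / w.sum()

# Direct entropy minimization: scan the parameter and keep the lowest entropy.
thetas = np.linspace(0.0, 3.0, 61)
best = min(thetas, key=lambda t: shannon_entropy(p_x(t)))
```

At theta = 0 the family is uniform (maximum entropy, log 5); as theta grows the mass concentrates, so the entropy-minimizing parameter is the largest theta in the scan.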

Thanks in advance!
 


If I understand your question correctly, you have a conditional probability distribution p(x|y), and you want to find the y that minimizes the entropy of p(x|y).

You then hope to find some transformation x -> z such that the y that maximizes the entropy of p(z|y) is the same y that minimizes the entropy of p(x|y).

In that case, the short answer is: no, that is not possible.
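One way to see this, at least for deterministic transforms, is that the discrete entropy is invariant under any bijective relabeling x -> z: permuting the probability masses does not change their values, so it cannot change which y maximizes or minimizes the entropy. A small numerical illustration (the table of p(x|y) values is made up):

```python
import numpy as np

def H(p):
    """Shannon entropy (nats) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Hypothetical p(x|y) for three values of y (each row sums to 1).
p_x_given_y = np.array([
    [0.70, 0.20, 0.10],   # sharply peaked: low entropy
    [0.40, 0.35, 0.25],
    [0.34, 0.33, 0.33],   # nearly uniform: high entropy
])

# A deterministic bijection x -> z is just a permutation of the outcomes;
# it reorders the masses in each row without changing their values.
perm = [2, 0, 1]
p_z_given_y = p_x_given_y[:, perm]

H_x = [H(row) for row in p_x_given_y]
H_z = [H(row) for row in p_z_given_y]
# The entropies match row by row, so the argmax over y is unchanged --
# no such transform can turn the maximizer into the minimizer.
```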
 


Perhaps you could be a bit more specific about what you're trying to do? Why are you interested in the minimum-entropy parameter, and why would you want to recast such a problem as a maximum-entropy problem? I can think of some examples where an entropy minimization could be re-written as an entropy maximization of a related distribution, but they're all kind of trivial, and I can't think of a case where doing so wouldn't just be a needless complication.
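As one example of the kind of trivial rewriting mentioned above (a made-up illustration, not from the thread): if P is uniform on a k-element subset of N outcomes and P' is uniform on the complementary N - k outcomes, then H(P) = log k and H(P') = log(N - k), so minimizing H(P) over k is the same as maximizing H(P'). But P' had to be constructed for exactly that purpose, which only complicates the original problem:

```python
import numpy as np

def H(p):
    """Shannon entropy (nats) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

N = 10

def P(k):
    """Uniform on the first k of N outcomes, so H(P) = log k."""
    return np.array([1.0 / k] * k + [0.0] * (N - k))

def P_prime(k):
    """Uniform on the remaining N - k outcomes, so H(P') = log(N - k)."""
    return np.array([0.0] * k + [1.0 / (N - k)] * (N - k))

ks = range(1, N)  # keep both subsets nonempty
k_min = min(ks, key=lambda k: H(P(k)))        # minimizes H(P)
k_max = max(ks, key=lambda k: H(P_prime(k)))  # maximizes H(P')
# Both criteria select the same k, but only by construction.
```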
 
