Minimize entropy of P = maximize entropy of ?

  • Thread starter: mnb96
  • Tags: Entropy
In summary, the conversation discusses the possibility of finding a parameter that minimizes the entropy of a conditional probability distribution, and whether it can be reformulated as a maximum-entropy problem. The conclusion is that it is not possible, and there is no obvious reason why one would want to do so.
  • #1
mnb96
minimize entropy of P == maximize entropy of ?

Hello,
I am facing the following problem:

- I have a distribution (or function) that depends on some parameter.
- I want to find the parameter which minimizes the entropy of the distribution.

In the particular situation I am facing, I really need to reformulate this problem as an entropy-maximization problem.

In other words, is it possible to find a P' such that maximizing its entropy is equivalent to minimizing the entropy of P?

Thanks in advance!
 
  • #2


If I understand your question right, you have a conditional probability distribution, e.g. p(x|y) conditional on y. You want to find the y that minimizes the entropy of p(x|y).

Then you hope to find some transformation x -> z such that the y that maximizes the entropy of p(z|y) is the same y that minimizes the entropy of p(x|y).

In that case, the short answer is: no, it is not possible.
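For intuition, here is a minimal sketch of the setup. The Bernoulli family used for p(x|y) is a hypothetical stand-in, not anything from the thread; it just makes the entropy-over-parameter scan concrete. It also illustrates one reason a simple relabeling cannot work: Shannon entropy is invariant under any bijection z = f(x), so such a transformation leaves every H(p(x|y)) unchanged and cannot swap the minimizer and the maximizer.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical example: p(x|y) is Bernoulli(y), so x ranges over {0, 1}
# and y is the parameter being optimized.
ys = np.linspace(0.01, 0.99, 99)
H = np.array([entropy([y, 1.0 - y]) for y in ys])

y_min = ys[np.argmin(H)]  # entropy-minimizing y: at an endpoint of the grid
y_max = ys[np.argmax(H)]  # entropy-maximizing y: 0.5

# A bijective relabeling z = f(x) only permutes the probabilities, so the
# entropy profile over y is identical (here z = 1 - x).
H_relabelled = np.array([entropy([1.0 - y, y]) for y in ys])
```

Because H_relabelled equals H at every y, no relabeling of outcomes can turn the minimization into a maximization; any workable reformulation would have to change the distribution itself.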
 
  • #3


Perhaps you could be a bit more specific about what you're trying to do? Why are you interested in the minimum-entropy parameter, and why would you want to recast such a problem as a maximum-entropy problem? I can think of some examples where an entropy minimization could be re-written as an entropy maximization of a related distribution, but they're all kind of trivial, and I can't think of a case where doing so wouldn't just be a needless complication.
 

1. What is entropy?

Entropy is a measure of disorder or randomness in a system. In statistical mechanics it is proportional to the logarithm of the number of microstates available to the system (S = k_B ln W), which counts the number of ways energy can be distributed among the particles. In information theory, the analogous quantity is the Shannon entropy of a probability distribution.
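The two formulations agree in the simplest case. As a sketch (the number of microstates W = 8 is an arbitrary choice): for W equally likely microstates, the Shannon entropy reduces to log2(W) bits, mirroring the Boltzmann form S = k_B ln W.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

W = 8                          # arbitrary number of microstates
uniform = np.full(W, 1.0 / W)  # all microstates equally likely
H = shannon_entropy(uniform)   # log2(8) = 3 bits
```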

2. How is entropy related to the concept of disorder?

The higher the entropy of a system, the more disordered it is. This means that there are more possible arrangements of the particles in the system, making it more difficult to predict the exact state of the system at any given time.

3. Why do we want to minimize entropy in some cases and maximize it in others?

Minimizing entropy is desirable when we want a concentrated, predictable state, such as a sharply peaked probability distribution. Maximizing entropy is appropriate when we want the least biased distribution consistent with what we know, which is the idea behind the maximum-entropy principle in statistics. In thermodynamics, open systems such as living organisms maintain their internal order by exporting entropy to their surroundings.

4. How can we minimize entropy of a system?

There are several ways to reduce the entropy of a system, all of which restrict the number of accessible microstates: lowering the temperature, removing energy or particles, or imposing constraints on the system's possible configurations.
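The temperature point can be illustrated with a sketch (the dimensionless energy levels below are hypothetical): under a Boltzmann distribution, lowering the temperature concentrates probability in the ground state, which reduces the Gibbs entropy toward zero, while raising it spreads probability across all levels, pushing the entropy toward ln(3).

```python
import numpy as np

def gibbs_entropy(energies, T, k_B=1.0):
    """Entropy, in units of k_B, of the Boltzmann distribution at temperature T."""
    w = np.exp(-np.asarray(energies, dtype=float) / (k_B * T))
    p = w / w.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

levels = [0.0, 1.0, 2.0]                 # hypothetical energy levels
S_hot = gibbs_entropy(levels, T=10.0)    # nearly uniform, close to ln(3)
S_cold = gibbs_entropy(levels, T=0.1)    # concentrated in the ground state
```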

5. Can we completely eliminate entropy from a system?

No. According to the Second Law of Thermodynamics, the total entropy of an isolated system never decreases over time. However, it is possible to locally decrease entropy by expending energy and exporting entropy to the surroundings, as in the formation of ordered structures like crystals or living organisms.
