Calculating Entropy for Independent Coin Toss Sequences with Varying Differences

In summary: if the coin is not fair, then the elementary outcomes are not all equally probable, and so the entropy would be different.
  • #1
entropy1
If we have two sequences s1 and s2, both of N coin tosses, is the entropy of getting two sequences that are exactly the same lower than the entropy of getting sequences that differ in x of the tosses? Is the entropy of getting sequences s1 and s2 that differ in N/2 tosses the highest, and is that the reason we most probably get sequences s1 and s2 that differ in about N/2 tosses?
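As a rough check on that intuition, here is a quick simulation sketch (Python; N = 100 and 10,000 trials are just illustrative values) counting in how many positions two independent fair-coin sequences differ; the counts cluster around N/2.
[code]
# Quick Monte Carlo sketch: in how many positions do two independent
# fair-coin sequences of length N differ?  (N and the number of
# trials are arbitrary illustrative choices.)
import random
from collections import Counter

N = 100
TRIALS = 10_000

counts = Counter()
for _ in range(TRIALS):
    s1 = [random.randint(0, 1) for _ in range(N)]
    s2 = [random.randint(0, 1) for _ in range(N)]
    counts[sum(a != b for a, b in zip(s1, s2))] += 1

# The histogram is binomial(N, 1/2): sharply peaked around N/2 = 50.
for diff in sorted(counts):
    print(diff, counts[diff])
[/code]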
 
  • #2
I believe you misunderstand how entropy applies to probability. There is a single probability of getting two sequences that are exactly the same. It is not meaningful to refer to the entropy of such a singular event; rather, entropy applies to probability distributions over many events. So, for example, if X = the number of differences between the two sequences, then there is a single entropy for the probability distribution of X.

Note also that the entropy of a distribution over a discrete finite set of outcomes is maximized when all outcomes are equally likely. But once you begin applying constraints, such as requiring the distribution to have a given mean and/or variance, you get more interesting distributions when you maximize entropy.
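For instance, a minimal numerical sketch (Python; N = 20 is an arbitrary choice) comparing the uniform distribution over the outcomes 0..N with the binomial(N, 1/2) distribution of X, the number of differing tosses:
[code]
# Minimal sketch: over the outcomes 0..N, the uniform distribution has
# the largest Shannon entropy; the binomial(N, 1/2) distribution of
# X = number of differing tosses has less.  (N = 20 is arbitrary.)
import math

N = 20
binom = [math.comb(N, k) / 2**N for k in range(N + 1)]
unif = [1 / (N + 1)] * (N + 1)

def entropy_bits(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(entropy_bits(unif))   # log2(21), about 4.39 bits: the maximum for 21 outcomes
print(entropy_bits(binom))  # about 3.21 bits: smaller
[/code]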
 
  • #3
jambaugh said:
I believe you misunderstand how entropy applies to probability. There is a single probability of getting two sequences that are exactly the same. It is not meaningful to refer to the entropy of such a singular event; rather, entropy applies to probability distributions over many events. So, for example, if X = the number of differences between the two sequences, then there is a single entropy for the probability distribution of X.
I am not sure I understand you, but I was talking about the entropy E(x) of s1 and s2 differing in x places, i.e. of all the {s1, s2} pairs that differ in x places.
jambaugh said:
Note also that the entropy of a distribution over a discrete finite set of outcomes is maximized when all outcomes are equally likely.
So all sequences have probability ##2^{-N}##.
 
  • #4
Ok, here's my understanding stated more precisely. As I said, entropy applies to distributions, but you can also refer to the entropy of an event (a set of elementary outcomes) by equating it to the entropy of the conditional probability distribution [itex] P(case | case \in E)[/itex]. That still depends on the base distribution. In your example, given the assumed fairness of the coins in question, each sequence of tosses would be equally probable, with probability [itex]2^{-N}[/itex].
Since you are describing two independent sequences of tosses, that probability is squared: a given pair of sequences has probability [itex]2^{-2N}[/itex]. It depends on what you want to call the most elementary outcome.

Given that, you then pick your event, e.g. that the two sequences differ in x of the N tosses, and count the number of elementary outcomes in that set. That will be [itex]2^N[/itex] ways the first sequence can go, times [itex]{}_NC_x=\frac{N!}{x!(N-x)!}[/itex] ways to pick the [itex]x[/itex] positions where the second sequence disagrees with the first. The conditional probability is then [itex]q= 1/count = 1/({}_NC_x2^N)[/itex] for those cases and zero otherwise. The entropy of (the conditional distribution for) that event is then:
[tex] S = \sum_{cases} -q\log(q) = -\log(q) = \log(count)[/tex]

You may say I could have gotten here a bit quicker, but consider how things change if the coin is not fair. Then not all elementary outcomes are equally probable, and you will need to go through this definition and carry out the calculation with the corresponding distributions.
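A short numerical sketch of this calculation (Python; N = 20 and x = 5 are just illustrative values):
[code]
# Sketch of the calculation above (N = 20 and x = 5 are arbitrary):
# the event "the two sequences differ in exactly x of the N tosses"
# contains count = C(N, x) * 2^N equally likely pairs, so the entropy
# of the uniform conditional distribution over that event is log(count).
import math

N, x = 20, 5
count = math.comb(N, x) * 2**N   # elementary outcomes (pairs) in the event
q = 1.0 / count                  # conditional probability of each such pair

print(-math.log(q), math.log(count))   # the two agree: S = -log(q) = log(count)

# log(count) = log C(N, x) + N*log(2) is largest at x = N/2, matching the
# intuition in post #1 that differing in about N/2 places is most typical.
print(max(range(N + 1), key=lambda k: math.comb(N, k)))   # 10 = N/2
[/code]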
 

1. What is entropy?

Entropy is a measure of the disorder or randomness of a system. In thermodynamics, it is a measure of the energy in a closed system that is unavailable to do work. In probability theory, it is a measure of the uncertainty or randomness in a set of possible outcomes.

2. How is entropy related to probability?

In probability theory, entropy is a measure of the amount of uncertainty or randomness in a probability distribution. It is calculated as the negative of the sum, over all possible outcomes, of the probability of each outcome multiplied by the logarithm of that probability. Distributions with a high degree of uncertainty have high entropy, while distributions with little uncertainty have low entropy.
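A small sketch of that formula (Python; the example coin biases are arbitrary):
[code]
# Shannon entropy H = -sum_i p_i * log2(p_i), in bits.
# (The example distributions below are arbitrary illustrations.)
import math

def entropy_bits(probs):
    """Shannon entropy in bits; outcomes with zero probability contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # fair coin: 1.0 bit (maximum uncertainty)
print(entropy_bits([0.9, 0.1]))  # biased coin: about 0.47 bits
print(entropy_bits([1.0, 0.0]))  # certain outcome: zero bits
[/code]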

3. Can entropy be negative?

In thermodynamics, entropy is always positive or zero. The same is true of the Shannon entropy of a discrete probability distribution, since each term [itex]-p\log p[/itex] is non-negative for probabilities between 0 and 1. However, the differential entropy of a continuous distribution can be negative, for example when the distribution is concentrated on a very narrow range of values.
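For example, the differential entropy of a uniform distribution on the interval [itex](0,a)[/itex] is
[tex] h = -\int_0^a \frac{1}{a}\ln\frac{1}{a}\,dx = \ln a, [/tex]
which is negative whenever [itex]a < 1[/itex].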

4. How does entropy relate to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of a closed system never decreases over time. In any irreversible process, such as one in which energy is dissipated as heat, the total entropy increases. Entropy can also be seen as a marker of the direction of time, as closed systems tend to move from a state of order to disorder (low entropy to high entropy) over time.

5. What are some real-world applications of entropy and probability?

Entropy and probability are used in a wide range of fields, including thermodynamics, information theory, and statistics. In thermodynamics, entropy is used to calculate the efficiency of engines and predict the direction of chemical reactions. In information theory, entropy is used to measure the amount of information in a message. In statistics, entropy is used to measure the uncertainty in a data set and helps with decision-making and prediction.
