Calculating Entropy for Independent Coin Toss Sequences with Varying Differences

  • Context: High School
  • Thread starter: entropy1
  • Tags: entropy, probability

Discussion Overview

The discussion revolves around the calculation of entropy for two independent sequences of coin tosses, specifically focusing on how the number of differing tosses between the sequences affects entropy. Participants explore the implications of entropy in relation to probability distributions and the conditions under which entropy is maximized.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant questions whether the entropy of two identical sequences is lower than that of sequences differing by a certain number of tosses, suggesting that sequences differing by N/2 tosses might have the highest entropy.
  • Another participant argues that entropy applies to distributions of probability over many events, not to singular events, and emphasizes that the entropy of a distribution is maximized when all outcomes are equally likely.
  • A participant clarifies that they were referring to the entropy of pairs of sequences differing in a specific number of places, indicating that all sequences have a probability of ##2^{-N}##.
  • Further clarification is provided regarding the calculation of entropy for a specific event, discussing the dependency on the base distribution and the implications of using fair versus unfair coins in the calculations.

Areas of Agreement / Disagreement

Participants express differing views on the application of entropy to singular events versus distributions, with no consensus reached on the initial question regarding the relationship between the number of differing tosses and entropy.

Contextual Notes

Participants mention the need to consider constraints such as fairness of the coins and the definitions of elementary outcomes, which may affect the calculations and interpretations of entropy.

entropy1
If we have two sequences s1 and s2, each of N coin tosses, is the entropy of getting two sequences that are exactly the same lower than that of getting sequences that differ in x of the tosses? Is the entropy of getting sequences s1 and s2 that differ in N/2 tosses the highest, and is that the reason we most probably get sequences s1 and s2 that differ in about N/2 tosses?
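
For intuition about the question, here is a minimal simulation sketch (my own, not part of the thread; N and the trial count are assumed values). For fair coins each position differs with probability 1/2, so the number of differing positions is Binomial(N, 1/2) and concentrates near N/2.

```python
# Hypothetical illustration (not from the thread): simulate two independent
# fair-coin sequences of length N and tally how many positions differ.
import random
from collections import Counter

N = 20            # tosses per sequence (assumed value for illustration)
TRIALS = 100_000  # assumed number of repetitions

counts = Counter()
for _ in range(TRIALS):
    s1 = [random.randint(0, 1) for _ in range(N)]
    s2 = [random.randint(0, 1) for _ in range(N)]
    diff = sum(a != b for a, b in zip(s1, s2))
    counts[diff] += 1

# The empirical distribution of the number of differing tosses follows
# Binomial(N, 1/2), sharply peaked around N/2 = 10.
for x in sorted(counts):
    print(x, counts[x] / TRIALS)
```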
 
I believe you misunderstand how entropy applies to probability. There is a singular probability of getting two sequences exactly the same. It is not meaningful to refer to the entropy of such a singular event; rather, entropy applies to distributions of probability over many events. So, for example, if X is the number of differences between the two sequences, then there is a single entropy for the probability distribution of X.

Note also that the entropy of a distribution over a discrete finite set of outcomes is maximized when all outcomes are equally likely. But once you begin applying constraints, such as requiring the distribution to have a given mean and/or variance, you get more interesting distributions when you maximize entropy.
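
To make both points concrete, here is a minimal sketch (my own illustration, not from the thread; the value of N is an assumption). It computes the entropy of the Binomial(N, 1/2) distribution of X, the number of differing positions, and compares it with the uniform distribution over the same N + 1 outcomes, which always has the larger entropy.

```python
# Sketch (assumptions mine): entropy of the distribution of
# X = number of differing positions, versus a uniform distribution
# over the same outcomes {0, ..., N}.
from math import comb, log2

N = 20  # assumed sequence length

# X ~ Binomial(N, 1/2): P(X = x) = C(N, x) / 2^N
p = [comb(N, x) / 2**N for x in range(N + 1)]
H_binomial = -sum(px * log2(px) for px in p)

# Uniform distribution over the N + 1 possible values of X.
H_uniform = log2(N + 1)

print(f"H(Binomial) = {H_binomial:.4f} bits")
print(f"H(Uniform)  = {H_uniform:.4f} bits")  # always the larger of the two
```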
 
jambaugh said:
I believe you misunderstand how entropy applies to probability. There is a singular probability of getting two sequences exactly the same. It is not meaningful to refer to the entropy of such a singular event; rather, entropy applies to distributions of probability over many events. So, for example, if X is the number of differences between the two sequences, then there is a single entropy for the probability distribution of X.
I am not sure I understand you, but I was talking about the entropy ##E(x)## of ##s_1## and ##s_2## differing in ##x## places, i.e. of all the ##\{s_1, s_2\}## pairs that differ in ##x## places.
jambaugh said:
Note also that the entropy of a distribution over a discrete finite set of outcomes is maximized when all outcomes are equally likely.
So all sequences have probability ##2^{-N}##.
 
OK, here's my understanding with more precision. As I said, entropy applies to distributions, but you can also refer to the entropy of an event ##E## (a set of elementary outcomes) by equating it to the entropy of the conditional probability distribution ##P(\text{case} \mid \text{case} \in E)##. That still depends on the base distribution. In your example, given the assumed fairness of the coins, each sequence of tosses would be equally probable, with probability ##2^{-N}##.
Since you are describing two independent sequences of tosses, that gets squared: the probability of a given pair of sequences is ##2^{-2N}##. It depends on what you want to call the most elementary outcome.

Given that, you then pick your event, e.g. that the two sequences differ in exactly ##x## places, and count the number of elementary outcomes in that set. Let's see: that will be ##2^N## ways the first sequence can go, times ##{}_NC_x = \frac{N!}{x!(N-x)!}## ways to pick the ##x## positions where the second sequence disagrees with the first. The conditional probability distribution is then ##q = 1/\text{count} = 1/\left({}_NC_x \, 2^N\right)## for those cases and zero otherwise. The entropy of (the conditional distribution for) that event is then:
$$S = \sum_{\text{cases}} -q \log(q) = -\log(q) = \log(\text{count})$$

You may say I could have gotten here a bit more quickly, but consider how things change if the coin is not fair. Then not all elementary outcomes are equally probable, and you will need to go through this definition and carry out the calculation with the corresponding distributions.
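
The closed form above can be checked by brute force. The sketch below is my own illustration of the post's recipe, with small assumed values of N and x and an assumed bias p for the unfair-coin case; none of these values come from the thread.

```python
# Sketch (my own, following the post's definitions): entropy of the event
# "the two sequences differ in exactly x places", via the conditional
# distribution over elementary outcomes (pairs of sequences).
from math import comb, log2
from itertools import product

N, x = 6, 2  # small assumed values so brute force is feasible

# Fair coins: every pair of sequences has probability 2^(-2N), so the
# conditional distribution over the event is uniform and
# S = log(count) with count = C(N, x) * 2^N.
count = comb(N, x) * 2**N
S_fair = log2(count)
print(f"fair coin: S = log2({count}) = {S_fair:.4f} bits")

# Unfair coin (p = 0.7 for heads, an assumed value): elementary outcomes
# are no longer equally probable, so go through the full calculation.
p = 0.7

def seq_prob(s):
    # Probability of a particular sequence of 0/1 tosses under bias p.
    return p**sum(s) * (1 - p)**(len(s) - sum(s))

# Collect the (unnormalized) probabilities of all pairs in the event.
probs = []
for s1 in product((0, 1), repeat=N):
    for s2 in product((0, 1), repeat=N):
        if sum(a != b for a, b in zip(s1, s2)) == x:
            probs.append(seq_prob(s1) * seq_prob(s2))

total = sum(probs)              # P(event)
q = [w / total for w in probs]  # conditional distribution on the event
S_unfair = -sum(qi * log2(qi) for qi in q)
print(f"p = {p}: S = {S_unfair:.4f} bits (< {S_fair:.4f})")
```

As expected, the non-uniform conditional distribution has strictly smaller entropy than ##\log(\text{count})##, which is attained only in the fair-coin case.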
 
