Proof related to the expected value of cardinality

SUMMARY

The discussion centers on the expected value of sums of Bernoulli random variables X_n ~ B(r_n) indexed by two sets, A and B. The key conditions are |A ∩ I_1| ≥ |B ∩ I_1| and |A| = |B| = n, where I_1 = {1,...,n} is the set of the n indices with the largest success probabilities. Despite the initial intuition that the expected sum over A should be at least that over B, counterexamples show this is not always the case: whenever some of the r_i differ strictly (e.g. r_1 > r_2), the sets can be chosen so that the expected sum over B exceeds that over A.

PREREQUISITES
  • Understanding of Bernoulli distributions and expected values
  • Familiarity with set theory concepts, particularly subsets and intersections
  • Knowledge of probability theory, specifically the properties of random variables
  • Basic mathematical proof techniques to analyze inequalities
NEXT STEPS
  • Explore advanced concepts in probability theory, focusing on expected values of random variables
  • Investigate the implications of set theory in probability, particularly in relation to random variables
  • Study counterexamples in probability to understand conditions under which expected values can differ
  • Examine applications of probability theory in electrical engineering, particularly in decision-making processes
USEFUL FOR

Mathematicians, statisticians, electrical engineers, and anyone interested in the intersection of probability theory and set theory.

baiyang11
Consider $N$ random variables $X_{n}$, each following a Bernoulli distribution $B(r_{n})$ with $1 \geq r_{1} \geq r_{2} \geq \ldots \geq r_{N} \geq 0$. Suppose we make the following assumptions about sets $A$ and $B$:

(1) $A \subset I$ and $B \subset I$ with $I=\{1,2,3,\ldots,N\}$

(2) $|A \cap I_{1}| \geq |B \cap I_{1}|$ with $I_{1}=\{1,2,3,\ldots,n\}$, $n<N$

(3) $|A|=|B|=n$

Do we have $\mathbb{E}\left(\sum_{a\in A} X_{a}\right) \geq \mathbb{E}\left(\sum_{b\in B} X_{b}\right)$?

To avoid confusion, $\mathbb{E}$ means expected value. Thanks!
 
This question looks really similar to your previous one so I hope I can be of more help this time. I'll give my thoughts about this question.

We have given $N$ random variables with a Bernoulli distribution. That means $\mathbb{P}[X_n = 1] = r_n$ and $\mathbb{P}[X_n = 0]=1-r_n$. Furthermore, $\mathbb{E}[X_i]=r_i$ for $i=1,\ldots,N$.
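As a quick sanity check, the claim $\mathbb{E}[X_i]=r_i$ (and, by linearity, $\mathbb{E}[\sum_a X_a]=\sum_a r_a$) can be verified numerically. The $r$ values below are my own illustrative choice; the simulation samples the $X_n$ independently, though linearity of expectation holds regardless of independence.

```python
import random

# Monte Carlo check that E[sum_n X_n] = sum_n r_n for Bernoulli X_n ~ B(r_n).
# The r values are an arbitrary decreasing example sequence.
random.seed(0)
r = [0.9, 0.7, 0.5, 0.3]
trials = 200_000

total = 0
for _ in range(trials):
    # One sample of sum_n X_n: each X_n is 1 with probability r_n.
    total += sum(1 for p in r if random.random() < p)

mean = total / trials
# mean should be close to sum(r) = 2.4
```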

By condition (2), $|A \cap I_1| \geq |B \cap I_1|$, we know that $A$ has at least as many elements in common with $I_1$ as $B$ does. Since the random variables $X_1,\ldots,X_n$ have expected values at least as large as those of the random variables $X_j$ with $j \in I \setminus I_1$, the statement
$$\mathbb{E}\left(\sum_{a \in A} X_a \right) \geq \mathbb{E}\left(\sum_{b \in B} X_b \right)$$
probably holds. Of course this is not a formal proof but it should give some intuition.

The only thing we know about $n$ and $N$ is $n<N$, so it could be the case that $N \geq 2n$. In that case we can have $A \cap I_1 = \emptyset$ and $B \cap I_1 = \emptyset$, and it is then possible to choose $A$ and $B$ such that the statement no longer holds.

To make my point clearer, suppose $I = \{1,2,\ldots,7\}$ and $I_1 = \{1,2,3\}$, so $n=3$ and $N = 7$. Set $A = \{5,6,7\}$ and $B=\{4,5,6\}$. Then we have
$$\mathbb{E}\left(\sum_{a \in A} X_a \right) = \mathbb{E}[X_5]+\mathbb{E}[X_6]+\mathbb{E}[X_7] = r_5+r_6+r_7$$
$$\mathbb{E}\left(\sum_{b \in B} X_b \right) = \mathbb{E}[X_4]+\mathbb{E}[X_5]+\mathbb{E}[X_6] = r_4+r_5+r_6.$$

Since the ordering allows $r_4 > r_7$, the statement does not hold in this case whenever that inequality is strict.
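A short numerical check of this counterexample (the specific $r$ values below are my own choice; any admissible decreasing sequence with $r_4 > r_7$ works):

```python
# Expectations for the counterexample above: I = {1,...,7}, I1 = {1,2,3},
# A = {5,6,7}, B = {4,5,6}; the r_i are an arbitrary decreasing choice.
r = dict(zip(range(1, 8), [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3]))

A, B, I1 = {5, 6, 7}, {4, 5, 6}, {1, 2, 3}

# Condition (2) holds trivially: both intersections with I1 are empty.
assert len(A & I1) >= len(B & I1)

# By linearity, E[sum X_a] is just the sum of the Bernoulli means.
E_A = sum(r[a] for a in A)  # r_5 + r_6 + r_7
E_B = sum(r[b] for b in B)  # r_4 + r_5 + r_6
# E_A = 1.2 < E_B = 1.5, so the claimed inequality fails here.
```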

I hope I made my point clear though I did not give a formal explanation. But I wanted to give my thoughts because in my opinion this is an interesting question.
 
Thank you for the reply, Siron.
You provide a clear example where $A$ and $B$ have nothing in common with $I_{1}$ at all. However, my original intention was to require $|A \cap I_{1}| \geq |B \cap I_{1}|>0$; I admit it was my mistake not to make this clear.
So if $|A \cap I_{1}| \geq |B \cap I_{1}|>0$, is the answer "YES" or "NO"?

It may still be "NO". Someone else gave me the following example.
Let $n \geq 2$, $N=n+1$, $A=I \setminus \{1\}$, $B=I \setminus \{2\}$ and $r_{1}>r_{2}$. (Here $|A \cap I_{1}| = |B \cap I_{1}| = n-1 > 0$, so the strengthened condition holds.) Then
$$\mathbb{E}\left(\sum_{a \in A} X_a \right)=\sum_{i \in I} r_{i}-r_{1},$$
$$\mathbb{E}\left(\sum_{b \in B} X_b \right)=\sum_{i \in I} r_{i}-r_{2}.$$
With $r_{1}>r_{2}$, we get $\mathbb{E}\left(\sum_{a \in A} X_a \right)<\mathbb{E}\left(\sum_{b \in B} X_b \right)$, making the answer "NO".
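This second counterexample is also easy to verify numerically (the concrete $n$ and $r$ values below are my own choice):

```python
# Second counterexample with n = 3, N = 4: A = I \ {1}, B = I \ {2},
# and r_1 > r_2. The r_i are an arbitrary decreasing example.
r = {1: 0.9, 2: 0.8, 3: 0.7, 4: 0.6}

I = {1, 2, 3, 4}
A, B, I1 = I - {1}, I - {2}, {1, 2, 3}

# Here |A & I1| = |B & I1| = 2 > 0, so even the strengthened condition holds.
assert len(A & I1) >= len(B & I1) > 0

E_A = sum(r[a] for a in A)  # sum_i r_i - r_1 = 2.1
E_B = sum(r[b] for b in B)  # sum_i r_i - r_2 = 2.2
# E_A < E_B because r_1 > r_2, so the answer is "NO" again.
```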

In fact, by asking my original question, I want to abstract the following statement, which seems intuitively true.

If I have better knowledge of which incidents have the top probabilities of happening (i.e., I know set A rather than set B), is that knowledge trustworthy (A is still favored) when those incidents are realized (in expected value)?

Now, inspired by the counterexample above, it seems we can always manipulate the different $r_{i}$ to make $B$ favored while keeping $|A|=|B|$.

Since under the original assumptions $\mathbb{E}\left(\sum_{a \in A} X_a \right) \geq \mathbb{E}\left(\sum_{b \in B} X_b \right)$ does not always hold, I am trying to see under what assumptions we can always favor $A$, i.e. $\mathbb{E}\left(\sum_{a \in A} X_a \right) \geq \mathbb{E}\left(\sum_{b \in B} X_b \right)$. But I am a bit lost.
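For what it's worth, here is one candidate sufficient condition (my own sketch, not established in the thread): if, after sorting both index sets in increasing order, the $k$-th smallest index of $A$ never exceeds the $k$-th smallest index of $B$, then $r_{a_{(k)}} \geq r_{b_{(k)}}$ term by term because the $r_i$ are decreasing, and the desired inequality follows by summing.

```python
# A candidate sufficient condition (a sketch): elementwise dominance of the
# sorted index lists guarantees E_A >= E_B for any decreasing r_i.
def dominates(A, B):
    # True if the k-th smallest index of A is <= the k-th smallest index of B.
    return len(A) == len(B) and all(a <= b for a, b in zip(sorted(A), sorted(B)))

def expected_sum(S, r):
    # E[sum_{i in S} X_i] = sum of the Bernoulli means, by linearity.
    return sum(r[i] for i in S)

# Illustrative decreasing r_i (my own choice).
r = {1: 0.9, 2: 0.8, 3: 0.7, 4: 0.6, 5: 0.5}
A, B = {1, 2, 4}, {2, 3, 5}
assert dominates(A, B)
assert expected_sum(A, r) >= expected_sum(B, r)
```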
 

Nice counterexample! I did not see it that way. Anyway, your question is interesting, but not something I can answer right away. I'll think about it, but in the meantime I hope some other members can help you out.

PS: Where does this problem come from? I have taken some probability theory courses, but I've never discussed problems like this before.
 

I appreciate your continued attention to this problem, and I do hope for attention from other members, perhaps from various areas, because the problem seems interdisciplinary between set theory and probability theory.

This abstracted problem comes from a concrete problem in my major, which is electrical engineering (EE). We came up with a concrete method for an EE problem we are currently researching. Then I realized that the essence of the concrete method is trying to answer the question in italics in my previous post. I then looked at the EE problem and what we do in the concrete method, and came up with this mathematical problem, which I found interesting as well.
 
