Summing X_i with Binomial Distribution: What is the Problem?

AI Thread Summary
The discussion revolves around calculating the sum S = ∑_{i=1}^{m} X_i, where the X_i are independent and identically distributed (iid) random variables with known mean μ and variance σ², and m follows a Binomial(n, p) distribution. The initial approach suggests npμ for the mean of S, but there are concerns about the variance and about the independence of the X_i. It is clarified that while the mean can be calculated using the properties of iid variables, the variance requires careful handling because m is itself random. The thread shows how to combine probabilities and expectations over the possible values of m, leading to a weighted average for the final calculations.
autobot.d
What kind of problem is this?
X_i \textrm{ are iid with known mean and variance, } \mu \textrm{ and } \sigma^2 \textrm{ respectively.}
m \sim \textrm{Binomial}(n,p), \textrm{ where } n \textrm{ is known.}

S = \sum_{i=1}^{m} X_i

How do I work with this? This is what I have thought of.

S = \sum_{i=1}^{m} X_i = mX_1 \textrm{ (since iid)}
so for the mean of S
\bar{S} = \overline{mX_1} = np\mu?
or, to find the mean of S, use the expected value:
E(S) = E(mX_i) = E(mX_1) \textrm{ (since iid)}
but then what?

Any help would be appreciated. I am guessing this kind of problem has a name?

Thanks.
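(A sum with a random number of terms like this is usually called a random sum, or a compound distribution; with m Binomial it is a compound binomial. A quick simulation can sanity-check the guessed mean npμ. The parameter values and the normal choice for the X_i below are assumptions for illustration only; the thread keeps them generic.)

```python
import numpy as np

# Simulation sketch of S = sum_{i=1}^{m} X_i with m ~ Binomial(n, p).
# mu, sigma, n, p and the normal distribution for the X_i are assumed
# example values, not part of the original problem statement.
rng = np.random.default_rng(0)
mu, sigma = 2.0, 1.5
n, p = 10, 0.3

trials = 200_000
m = rng.binomial(n, p, size=trials)               # random number of terms per trial
# Conditional on m, a sum of m iid Normal(mu, sigma^2) terms is Normal(m*mu, m*sigma^2)
S = rng.normal(loc=m * mu, scale=sigma * np.sqrt(m))

print(S.mean())  # close to n*p*mu = 6.0
```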
 
I am missing the problem statement - what are you supposed to do? Calculate mean and variance of S?

As the X_i are independent, their sum is not the same as m times X_1 (that would require 100% correlation).
You can use the mean of the X_i in your expression for the mean of S. The result for the mean is right, but you have to fix the calculation steps; otherwise you run into problems with the variance.
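One way to make the mean calculation rigorous, without the mX_1 shortcut, is to condition on m first (the law of total expectation):

E(S) = E\big(E(S \mid m)\big) = E(m\mu) = \mu \, E(m) = np\mu

since, conditional on m = k, S is a sum of k iid terms and therefore has mean k\mu.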
 
But what about the fact that the m in the summation limit is Binomially distributed? I do not understand what that does. Thanks.
 
You can express this as "probability f_0 that 0 X_i are added, probability f_1 that 1 X_i is added, ...", calculate the variance in each of those cases, and combine them afterwards.
For the expectation value this does not matter: you can take the expectation values of m and of the X_i separately, as all X_i have the same distribution.
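That case-by-case recipe can be carried out exactly with the binomial weights f_k = P(m = k). A minimal sketch, with parameter values that are assumptions for illustration only:

```python
from math import comb

# Assumed example values; the thread keeps mu, sigma^2, n, p generic.
mu, sigma2 = 2.0, 2.25
n, p = 10, 0.3

# f_k = P(m = k), the probability that exactly k of the X_i are added
f = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

# Given m = k: E(S | k) = k*mu and Var(S | k) = k*sigma2,
# so E(S^2 | k) = k*sigma2 + (k*mu)**2.  Combine the cases by weighting with f_k.
ES = sum(fk * k * mu for k, fk in enumerate(f))
ES2 = sum(fk * (k * sigma2 + (k * mu) ** 2) for k, fk in enumerate(f))
varS = ES2 - ES**2

print(ES)    # n*p*mu = 6.0
print(varS)  # n*p*sigma2 + n*p*(1-p)*mu**2 = 15.15
```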
 
so for "f_0, the probability that 0 X_i are added", we have f_0 = {n \choose 0}p^0(1-p)^{n}=(1-p)^n

and so on. Then what do I do with all the f_0, f_1, ...?
Is the average like a weighted average?

I am pretty lost, so any help is greatly appreciated. Thanks.
 
It is like a weighted average, indeed, and it follows the same rules for expectation values and variances.
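Written out in general, that weighted average is the law of total variance, which gives a closed form (using \textrm{Var}(m) = np(1-p) for a Binomial):

\textrm{Var}(S) = E\big(\textrm{Var}(S \mid m)\big) + \textrm{Var}\big(E(S \mid m)\big) = E(m)\sigma^2 + \textrm{Var}(m)\mu^2 = np\sigma^2 + np(1-p)\mu^2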
 