Mean of Sum of IID Random Variables

AI Thread Summary
The mean of the sum of n independent identically distributed (IID) random variables is the sum of their means, expressed as E[Y] = nE[X]. More generally, linearity of expectation says the mean of a sum equals the sum of the means for any collection of random variables, whether or not they are independent; the compact form nE[X] additionally uses the identical-distribution assumption. Clarification was provided that this extends beyond single-variable functions. Understanding this property is essential in probability and statistics.
ObliviousSage
If X is some RV, and Y is a sum of n independent X_i's (i.e. n independent identically distributed random variables, each with the same distribution as X), is the mean of Y just the sum of the means of the n X_i's?

That is, if Y=X1+X2+...+Xn, is E[Y]=nE[X]?

I know that for a function of a single variable, if Y = h(X) then E[Y] = E[h(X)], but I'm not sure whether anything like that works with multiple X's, even for something as simple as addition.
 
The mean of a sum is the sum of the means. The terms in the sum do not have to be independent or have the same distribution.
 
mathman said:
The mean of a sum is the sum of the means. The terms in the sum do not have to be independent or have the same distribution.

Awesome, I wasn't sure. Thanks for clearing that up!
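The exchange above can be sanity-checked numerically. Below is a minimal Monte Carlo sketch (assuming Python; the choice of Exponential(1) and Uniform(0,1) distributions is illustrative, not from the thread) that checks both claims: the IID case E[Y] = nE[X], and the fact that linearity still holds for dependent, non-identically distributed terms.

```python
import random

random.seed(42)
trials = 200_000

# IID case: Y = X1 + ... + Xn with each Xi ~ Exponential(1), so E[X] = 1
# and the claim predicts E[Y] = n * E[X] = n.
n = 5
iid_est = sum(
    sum(random.expovariate(1.0) for _ in range(n))
    for _ in range(trials)
) / trials
print(iid_est)  # close to n * E[X] = 5.0

# Dependent case: Z = X + X**2 with X ~ Uniform(0, 1). The two terms are
# perfectly dependent and have different distributions, yet linearity still
# gives E[Z] = E[X] + E[X**2] = 1/2 + 1/3.
dep_total = 0.0
for _ in range(trials):
    x = random.random()
    dep_total += x + x * x
dep_est = dep_total / trials
print(dep_est)  # close to 1/2 + 1/3 ≈ 0.8333
```

The second check is the key point of mathman's reply: no independence was used anywhere, since expectation is linear regardless of the joint distribution of the terms.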