The third central moment of a sum of two independent random variables

Ad VanderVen
TL;DR Summary
Is it true that in probability theory the third central moment of a sum of two independent random variables is equal to the sum of the third central moments of the two separate variables?
Is it true that when X and Y are independent,

$$E\big((X+Y)^3\big) = E(X^3) + E(Y^3)\,?$$
 
This is just linearity of expectation. You are assuming X and Y are independent with expectation 0. Expand (X+Y)^3, use linearity of E[.], then use independence and the zero means to get E[X^2 Y] = E[X^2]E[Y] = 0 and E[X Y^2] = E[X]E[Y^2] = 0.
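Written out in full (a sketch of that step, under the stated assumptions that X and Y are independent with E[X] = E[Y] = 0):

\begin{align*}
E\big[(X+Y)^3\big]
  &= E\big[X^3 + 3X^2Y + 3XY^2 + Y^3\big] \\
  &= E[X^3] + 3\,E[X^2]E[Y] + 3\,E[X]E[Y^2] + E[Y^3] && \text{(linearity, independence)} \\
  &= E[X^3] + E[Y^3] && \text{(since } E[X] = E[Y] = 0\text{)}.
\end{align*}

For the third central moments in the original question, apply the same expansion to X - E[X] and Y - E[Y]: these are independent with mean zero, and their sum is (X+Y) - E[X+Y], so the identity holds without the zero-mean assumption.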
 
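A quick numerical sanity check of the additivity of third central moments (a sketch in NumPy; the exponential and gamma distributions below are an arbitrary, hypothetical choice of skewed variables):

import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000

# Two independent, skewed random variables (arbitrary illustrative choices).
x = rng.exponential(scale=2.0, size=n)        # third central moment = 2 * 2^3 = 16
y = rng.gamma(shape=3.0, scale=1.0, size=n)   # third central moment = 2 * 3 * 1^3 = 6

def third_central_moment(z):
    # Sample estimate of E[(Z - E[Z])^3].
    return np.mean((z - z.mean()) ** 3)

# Both printed values should be close to 22, up to Monte Carlo error.
print(third_central_moment(x) + third_central_moment(y))
print(third_central_moment(x + y))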