Distribution of sum of discrete random variable
by Tosh5457
Tags: discrete, distribution, random, variable

#1
Mar 30, 2012, 09:46 AM

P: 223

Edit: I have to think more about this, I'll post later.




#2
Mar 30, 2012, 06:56 PM

P: 4,570





#3
Mar 31, 2012, 07:50 AM

P: 223

Ok, my problem is about poker tourneys.
Consider the random variable Y, which is the sum of N random variables X:

[tex]Y=X+X+...+X=NX[/tex]

X is the random variable that assigns a prize value to each in-the-money position, and assigns -1 to out-of-the-money positions. So:

[tex]X(position)=\left\{\begin{matrix}-1, position = "OTM" \\w_{1}, position = "1st" \\w_{2}, position = "2nd" \\... \\w_{n}, position = "nth" \end{matrix}\right.[/tex]

OTM means out-of-the-money, and here n is the number of in-the-money positions. w1, w2, ..., wn are constants, of course. The probability mass function of X is:

[tex]f(x)=\left\{\begin{matrix} \beta_{1}, x = -1 \\ \beta_{2}, x = w_{1} \\ \beta_{3}, x = w_{2} \\ ... \\ \beta_{n+1}, x = w_{n} \end{matrix}\right.[/tex]

What I want to know is the probability mass function of Y (which represents the profit/loss over N tourneys). I could find this by using the convolution theorem, but that's where my problem arises. As I understood it, Y needs to depend on a variable y, so:

[tex]Y(y)=X(x)+X(x)+...+X(x)=NX(x)[/tex]

But I don't know how to define y... This is why I can't use the convolution theorem: I didn't really understand this part.



#4
Mar 31, 2012, 04:28 PM

Sci Advisor
P: 5,937

Assuming all the X's are independent, it would be better to use the characteristic function (Fourier transform of distribution). Then the characteristic function for Y is the Nth power of the char. function for X. To get the distribution function for Y, take the inverse transform.
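As a sanity check on this approach, here is a minimal Python sketch (the two-outcome tournament pmf is made up for illustration, not from the thread) verifying numerically that the characteristic function of the sum of N i.i.d. copies of X is the N-th power of the characteristic function of one X:

```python
import numpy as np
from itertools import product

# Hypothetical single-tourney pmf: lose the buy-in (-1) with probability 0.7,
# or win a prize of 2.5 buy-ins with probability 0.3 (made-up numbers).
values = np.array([-1.0, 2.5])
probs = np.array([0.7, 0.3])
N = 3

def char_fn(t, vals, p):
    """Characteristic function phi(t) = E[exp(i t X)] of a discrete r.v."""
    return np.sum(p * np.exp(1j * t * vals))

t = 0.7  # arbitrary test point
phi_pow = char_fn(t, values, probs) ** N

# Brute force: E[exp(i t Y)] summed over all N-tuples of outcomes.
phi_brute = sum(
    np.prod([probs[i] for i in idx]) * np.exp(1j * t * sum(values[i] for i in idx))
    for idx in product(range(2), repeat=N)
)
assert np.isclose(phi_pow, phi_brute)
```

Inverting the transform then recovers the pmf of Y, as suggested above.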




#5
Mar 31, 2012, 05:50 PM

P: 223

Anyway, shouldn't it be easy by convolution? I think that I'm only missing something very basic... 



#6
Mar 31, 2012, 09:28 PM

P: 4,570





#7
Apr 1, 2012, 03:24 PM

Sci Advisor
P: 5,937





#8
Apr 2, 2012, 06:26 PM

P: 223

About the convolution, I still don't understand some things...

1st problem: Definition of convolution for discrete r.v.:

[tex]f_{Z}(z)=\sum_{x=-\infty}^{x=+\infty}f_{Y}(z-x)f_{X}(x)[/tex]

where x is an integer. The problem here is that [tex]f_{X}(x)[/tex] may always be 0 in my example (except for x = -1), since there may be no integer prizes... And if I defined the r.v. X slightly differently (not assigning that -1 value), that function would always be 0 and there would be no probability mass function for Z? Something's not right here...



#9
Apr 2, 2012, 10:15 PM

P: 4,570





#10
Apr 3, 2012, 06:57 AM

P: 223

Ok, I'll write everything again with the pmf included, so everything will be in the same post.

Definition of convolution of pmf's for discrete random variables X and Y:

[tex]f_{Z}(z)=\sum_{x=-\infty}^{x=+\infty}f_{Y}(z-x)f_{X}(x)[/tex]

where x is an integer. PMF of X and Y (they have the same distribution):

[tex]f(X=x)=\left\{\begin{matrix}a_{1}, x=-1 \\ a_{2}, x=w_{1} \\ a_{3}, x=w_{2} \\ ... \\ a_{n}, x=w_{n-1} \end{matrix}\right.[/tex]

where w1, w2, ..., w_{n-1} are real constants.

Problem: in the series, [tex]f_{X}(x)[/tex] may always be 0 except at x = -1, since w1, w2, ..., w_{n-1} in general won't be integers.
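One standard way around the non-integer-support issue raised here (stated as a worked equation, not something claimed in the thread itself): for a discrete random variable with arbitrary real support, the convolution sum only needs to run over the values X can actually take, not over all integers:

[tex]f_{Z}(z)=\sum_{x \in \{-1, w_{1}, ..., w_{n-1}\}}f_{Y}(z-x)f_{X}(x)[/tex]

Every term with x outside the support of X contributes 0, so restricting the sum to the support loses nothing, and the prizes never need to be integers.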



#11
Apr 3, 2012, 10:24 PM

P: 4,570

Are you wondering about the mapping procedure if your values that are mapped to probabilities are not integers? 



#12
Apr 4, 2012, 07:51 AM

P: 223





#13
Apr 4, 2012, 02:36 PM

Sci Advisor
P: 3,173

Written out properly it is [tex] Y(y) = X_1(x_1) + ...+ X_N(x_N) [/tex] and you can't combine these terms. The variable y represents a particular value of the random variable Y. It's just like saying "Let W be the random variable that represents the value of the face on a roll of a fair die". W(w) would be the particular event that the face value was w (e.g. W(2) says the face that came up was 2).

To find the probability of a particular value of Y, such as y = $40, you must sum the probabilities of all combinations of values [itex] x_i [/itex] for the [itex] X_i [/itex] which add up to $40. So you consider all possible values of the [itex] x_i [/itex] that meet that requirement. That is essentially what "convolving" random variables means.
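The "sum the probabilities of all combinations that add up to y" procedure described above can be sketched directly in Python; the two-outcome pmf below is a hypothetical example, not numbers from the thread:

```python
from collections import defaultdict
from itertools import product

# Hypothetical pmf for one tourney: lose the buy-in (-1) with probability 0.7,
# or win a prize of 2.5 buy-ins with probability 0.3 (made-up numbers).
pmf_X = {-1.0: 0.7, 2.5: 0.3}
N = 2  # number of tourneys

# P(Y = y) is the sum, over all outcome tuples (x_1, ..., x_N) with
# x_1 + ... + x_N = y, of the product of the individual probabilities.
pmf_Y = defaultdict(float)
for outcomes in product(pmf_X, repeat=N):
    p = 1.0
    for x in outcomes:
        p *= pmf_X[x]
    pmf_Y[sum(outcomes)] += p

# e.g. P(Y = -2) = 0.7*0.7 = 0.49, P(Y = 1.5) = 2*0.7*0.3 = 0.42,
#      P(Y = 5)  = 0.3*0.3 = 0.09
```

Note that the two orderings (-1, 2.5) and (2.5, -1) both land on y = 1.5, which is exactly where the summing over combinations happens.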



#14
Apr 4, 2012, 08:05 PM

P: 4,570

The best way to do this is to do it for the first two random variables, then simplify and repeat to incorporate the rest of them. Basically you should have n x m outputs for your convolved distribution, if n is the number of outputs for your starting distribution and m is the number of outputs for the one you are convolving with. When you do this repeatedly, m becomes the number of mappings in the last distribution you calculated through convolution.
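The repeated pairwise convolution described above can be sketched as a small Python function; the single-tourney pmf here is a made-up example:

```python
from collections import defaultdict

def convolve(pmf_a, pmf_b):
    """Convolve two discrete pmfs given as {value: probability} dicts.
    The result has up to len(pmf_a) * len(pmf_b) support points (n x m)."""
    out = defaultdict(float)
    for xa, pa in pmf_a.items():
        for xb, pb in pmf_b.items():
            out[xa + xb] += pa * pb  # outcomes with equal sums merge here
    return dict(out)

# Hypothetical single-tourney pmf (made-up numbers): -1 with prob 0.7,
# prize 2.5 with prob 0.3. Fold it in twice more to get Y = X1 + X2 + X3.
pmf_X = {-1.0: 0.7, 2.5: 0.3}
pmf_Y = pmf_X
for _ in range(2):
    pmf_Y = convolve(pmf_Y, pmf_X)
```

Because the pmf is stored as a value-to-probability dict, nothing requires the support points to be integers, which sidesteps the non-integer-prize worry raised earlier in the thread.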



#15
Apr 5, 2012, 07:33 PM

P: 223

Thanks for the replies.

For example, Y = w1 + w1 + w1 + ... + w1 = N*w1 is one possibility for Y. If I sum the probabilities of X = w1 N times I can easily get a number higher than 1. This is what I understood, please tell me if I'm wrong: for simplicity let's say Y = X1 + X2, and each of these X's can only be w1 or w2. Then Y will be like this (eta is a dummy variable; I didn't even bother to write it in each value of Y):

[tex]Y(\eta )=\left\{\begin{matrix} w_{1}+w_{1} \\ w_{1}+w_{2} \\ w_{2}+w_{1} \\ w_{2}+w_{2} \end{matrix}\right.[/tex]

In general, for the sum of 2 random variables, it will have n x m values, where n is the number of possible values X1 can take and m is the number of possible values X2 can take (like chiro said). Now, if I attribute a probability of 0.7 to w2 and 0.3 to w1, for example, the probability of Y = w2 + w2 would be 1.4, in my understanding.



#16
Apr 5, 2012, 07:43 PM

P: 4,570

Remember that it won't be N*w1: instead you will have to do a convolution for the first two random variables, then do a convolution of this calculated PDF with the next variable, and you keep doing this until you have convolved the PDF of the first n-1 random variables with the nth random variable. If you expand this out, you'll see that it is a lot more involved than the N*w1 behaviour that you are implying: it doesn't work like that. Convolution simply represents a way to multiply the frequencies of two functions together, and this idea is not only for probability; it's used in many areas of signal analysis and other areas of applied mathematics.
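To see concretely why the probabilities multiply rather than add under independence, here is a minimal Python check using the 0.3/0.7 probabilities from post #15:

```python
# For independent X1, X2 with P(X = w1) = 0.3 and P(X = w2) = 0.7, the
# probability of each sum is a product, not a sum, of the individual
# probabilities: P(Y = w2 + w2) = 0.7 * 0.7 = 0.49, not 0.7 + 0.7 = 1.4
# (which could not be a probability at all).
p_w1, p_w2 = 0.3, 0.7
probs_Y = {
    "w1+w1": p_w1 * p_w1,                      # 0.09
    "w1+w2 (either order)": 2 * p_w1 * p_w2,   # 0.42
    "w2+w2": p_w2 * p_w2,                      # 0.49
}
# A valid pmf must sum to 1, which the products do:
assert abs(sum(probs_Y.values()) - 1.0) < 1e-12
```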

