MHB How Do I Derive the Distribution of 2θΣx_i for Independent Random Variables?

SupLem
We have a r.v. $X$ with p.d.f. $f(x) = \sqrt{\frac{\theta}{\pi x}}\, e^{-\theta x}$, $x > 0$, where $\theta$ is a positive parameter.
We are required to show that $2\theta X$ has a $\chi^2$ distribution with 1 d.f. and to deduce that, if $X_1,\ldots,X_n$ are independent r.v.'s with this p.d.f., then $2\theta\sum_{i=1}^{n} X_i$ has a chi-squared distribution with $n$ degrees of freedom.
Using the transformation $Y = 2\theta X$ I found the pdf of $Y$ to be $\frac{1}{\sqrt{2\pi}}\, y^{-1/2} e^{-y/2}$. How do I find the distribution of $2\theta\sum_{i=1}^{n} X_i$? Do I need to find the likelihood function (which contains $\sum_{i=1}^{n} x_i$) first? And how do I recognise the d.f. of this distribution (is it $n$ because it involves $x_1,\ldots,x_n$, i.e. $n$ r.v.'s)?

(Since I couldn't get the equations to display properly above, I am also adding a screenshot of my Word document.) Thanks!
 

As you've found, the pdf $f_Y$ of the r.v. $Y = 2\theta X$ is
$$f_Y(y) = \frac{1}{\sqrt{2\pi}} y^{-\frac{1}{2}} e^{-\frac{y}{2}}, \qquad y > 0,$$
which is indeed the pdf of the $\chi^2$-distribution with 1 degree of freedom.
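For completeness, writing out the change of variables: with $y = 2\theta x$, i.e. $x = y/(2\theta)$ and $\left|\frac{dx}{dy}\right| = \frac{1}{2\theta}$,
$$f_Y(y) = f_X\!\left(\frac{y}{2\theta}\right)\frac{1}{2\theta} = \sqrt{\frac{\theta}{\pi\, y/(2\theta)}}\; e^{-\theta\cdot y/(2\theta)}\cdot\frac{1}{2\theta} = \frac{1}{\sqrt{2\pi}}\, y^{-\frac{1}{2}} e^{-\frac{y}{2}}, \qquad y > 0.$$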

Next, we are given that $X_1,\ldots,X_n$ are independent r.v.'s with the same pdf as $X$. First note that $2\theta X_i$ for $i=1,\ldots,n$ has the same distribution as $Y$; in other words, to solve the question you have to find the distribution of the sum of $n$ independent r.v.'s $2\theta X_i$, each with $2\theta X_i \sim \chi^2(1)$.

Do you need to solve this with the transformation theorem? Using the moment-generating function would lead easily to a solution (because of the independence).
 
Thank you very much for your response. Could you please elaborate on how using the moment generating function would help us in this respect (i.e. finding the distribution of $2\theta\sum_{i=1}^{n} X_i$)?
 
SupLem said:
Thank you very much for your response. Could you please elaborate on how using the moment generating function would help us in this respect (i.e. finding the distribution of $2\theta\sum_{i=1}^{n} X_i$)?

It suffices to use the moment generating function, as it determines the distribution completely.

Denote the moment generating function of $2\theta X_i \sim \chi^2(1)$ by $M_{2\theta X_i}(t)$, which is known for $i=1,\ldots,n$. Due to the independence we have
$$M_{2\theta \sum_{i=1}^{n} X_i}(t) = \prod_{i=1}^{n} M_{2\theta X_i}(t).$$
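To spell out how this finishes the argument: the $\chi^2(1)$ moment generating function is $M_{2\theta X_i}(t) = (1-2t)^{-\frac{1}{2}}$ for $t < \tfrac12$, so the product works out to
$$M_{2\theta \sum_{i=1}^{n} X_i}(t) = \prod_{i=1}^{n} (1-2t)^{-\frac{1}{2}} = (1-2t)^{-\frac{n}{2}}, \qquad t < \tfrac12,$$
which is exactly the moment generating function of the $\chi^2(n)$ distribution. By uniqueness of moment generating functions, $2\theta\sum_{i=1}^{n} X_i \sim \chi^2(n)$.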
 
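For anyone who wants a quick numerical sanity check of the result, here is a minimal Python sketch (assuming NumPy and SciPy are available; the values of $\theta$, $n$ and the sample size below are arbitrary choices for the demo). It uses the fact that the given pdf is a Gamma$(\tfrac12,\ \text{rate}=\theta)$ density to draw samples, then compares $2\theta\sum_{i=1}^{n} X_i$ against the $\chi^2(n)$ distribution.

```python
import numpy as np
from scipy import stats

# The given density f(x) = sqrt(theta/(pi*x)) * exp(-theta*x), x > 0,
# is a Gamma(shape = 1/2, rate = theta) density, so we can sample it directly.
theta = 1.5      # arbitrary positive parameter (chosen for the demo)
n = 7            # number of independent X_i
m = 100_000      # number of Monte Carlo replications

rng = np.random.default_rng(0)
# Each row is one replication of (X_1, ..., X_n); NumPy's gamma uses scale = 1/rate.
x = rng.gamma(shape=0.5, scale=1.0 / theta, size=(m, n))

# The statistic in question: 2*theta * sum_i X_i, one value per replication.
s = 2.0 * theta * x.sum(axis=1)

# Compare the empirical distribution of s with chi-squared(n).
ks_stat, p_value = stats.kstest(s, "chi2", args=(n,))
print(f"KS statistic = {ks_stat:.4f}, p-value = {p_value:.3f}")
print(f"sample mean = {s.mean():.3f}  (chi2({n}) mean = {n})")
print(f"sample var  = {s.var():.3f}  (chi2({n}) variance = {2 * n})")
```

With these settings the Kolmogorov–Smirnov test should show no significant discrepancy, and the sample mean and variance should be close to $n$ and $2n$ respectively.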