MHB How Do I Derive the Distribution of 2θΣx_i for Independent Random Variables?

AI Thread Summary
The discussion focuses on deriving the distribution of 2θΣX_i for independent random variables, first showing that 2θX follows a chi-squared distribution with 1 degree of freedom: the probability density function (pdf) of Y = 2θX is confirmed to be that of a χ²(1) distribution. It then follows that for n independent random variables X_i with this pdf, 2θΣX_i has a chi-squared distribution with n degrees of freedom. The moment-generating function, which determines a distribution completely and factors over independent variables, is suggested as the most direct route to the distribution of the sum.
SupLem
We have an r.v. $X$ with pdf $f_X(x) = \sqrt{\dfrac{\theta}{\pi x}}\, e^{-\theta x}$, $x>0$, where $\theta$ is a positive parameter.
We are required to show that $2\theta X$ has a $\chi^2$ distribution with 1 degree of freedom and to deduce that, if $X_1,\ldots,X_n$ are independent r.v.'s with this pdf, then $2\theta\sum_{i=1}^{n} X_i$ has a chi-squared distribution with $n$ degrees of freedom.
Using the transformation $Y = 2\theta X$, I found the pdf of $Y$ to be $f_Y(y) = \frac{1}{\sqrt{2\pi}}\, y^{-1/2} e^{-y/2}$ (my working is below). How do I find the distribution of $2\theta\sum_{i=1}^{n} X_i$? Do I need to find the likelihood function (which contains $\sum_{i=1}^{n} X_i$) first? And how do I recognise the degrees of freedom of this distribution (is it $n$ because it involves the $n$ r.v.'s $X_1,\ldots,X_n$)?
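For reference, my working for the first part (the standard change-of-variables step): with $y = 2\theta x$ we have $x = \frac{y}{2\theta}$ and $\frac{dx}{dy} = \frac{1}{2\theta}$, so
$$f_Y(y) = f_X\!\left(\frac{y}{2\theta}\right)\frac{1}{2\theta} = \sqrt{\frac{2\theta^{2}}{\pi y}}\; e^{-y/2}\cdot\frac{1}{2\theta} = \frac{1}{\sqrt{2\pi}}\, y^{-1/2} e^{-y/2}, \qquad y > 0.$$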

Thanks!
 

As you've found, the pdf $f_Y$ of the r.v. $Y=2\theta X$ is
$$f_Y(y) = \frac{1}{\sqrt{2\pi}} y^{-\frac{1}{2}} e^{-\frac{y}{2}}$$
which is indeed the pdf of the $\chi^2$-distribution with 1 degree of freedom. Next, we are given that $X_1,\ldots,X_n$ are independent r.v.'s with the same pdf as $X$.

First note that $2\theta X_i$ has the same distribution as $Y$ for each $i=1,\ldots,n$; in other words, to solve the question you have to find the distribution of the sum of $n$ independent r.v.'s $2\theta X_i$, each with $2\theta X_i \sim \chi^2(1)$.

Do you need to solve this with the transformation theorem? Using the moment-generating function would lead to a solution more easily (because of the independence).
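In case it helps, the moment-generating function of a $\chi^2(1)$ random variable is the standard fact
$$M(t) = \operatorname{E}\!\left[e^{tY}\right] = (1-2t)^{-1/2}, \qquad t < \tfrac{1}{2}.$$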
 
Thank you very much for your response. Could you please elaborate on how using the moment-generating function would help us here, i.e. in finding the distribution of $2\theta\sum_{i=1}^{n} X_i$?
 
SupLem said:
Thank you very much for your response. Could you please elaborate on how using the moment-generating function would help us here, i.e. in finding the distribution of $2\theta\sum_{i=1}^{n} X_i$?

It suffices to use the moment-generating function, since it determines the distribution completely.

Denote the moment-generating function of $2\theta X_i \sim \chi^2(1)$ by $M_{2\theta X_i}(t)$, which is known for $i=1,\ldots,n$. By independence,
$$M_{2\theta \sum_{i=1}^{n} X_i}(t) = \prod_{i=1}^{n} M_{2\theta X_i}(t) = (1-2t)^{-n/2}, \qquad t < \tfrac{1}{2},$$
which is the moment-generating function of the $\chi^2(n)$ distribution. Hence $2\theta \sum_{i=1}^{n} X_i \sim \chi^2(n)$.
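As a numerical sanity check (an illustration added here, not from the original exchange): the given density $\sqrt{\theta/(\pi x)}\,e^{-\theta x}$ is the Gamma density with shape $1/2$ and rate $\theta$, so one can simulate the $X_i$ with NumPy/SciPy and compare $2\theta\sum_{i=1}^{n} X_i$ against the $\chi^2(n)$ distribution. The values of $\theta$ and $n$ below are arbitrary example choices.

```python
import numpy as np
from scipy import stats

# Sketch of a Monte Carlo check that 2*theta*sum(X_i) ~ chi^2(n).
# X ~ Gamma(shape=1/2, rate=theta), i.e. scale = 1/theta; theta and n are arbitrary here.
rng = np.random.default_rng(0)
theta, n, n_sims = 2.5, 5, 100_000

# n_sims replicates of (X_1, ..., X_n)
x = rng.gamma(shape=0.5, scale=1.0 / theta, size=(n_sims, n))
y = 2.0 * theta * x.sum(axis=1)          # the statistic 2*theta*sum_i X_i

# Compare the simulated sample with chi^2(n): KS test plus mean/variance check
ks = stats.kstest(y, "chi2", args=(n,))
print(f"KS statistic: {ks.statistic:.4f}, p-value: {ks.pvalue:.3f}")
print(f"sample mean {y.mean():.3f}  (chi^2({n}) mean = {n})")
print(f"sample var  {y.var():.3f}  (chi^2({n}) variance = {2 * n})")
```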
 
