How Do I Derive the Distribution of 2θΣx_i for Independent Random Variables?

Summary
The discussion focuses on deriving the distribution of 2θΣx_i for independent random variables. It is first confirmed that Y = 2θX follows a chi-squared distribution with 1 degree of freedom, and then established that, for n independent random variables X_i with the same pdf, 2θΣx_i has a chi-squared distribution with n degrees of freedom. The moment-generating function, which determines a distribution completely and factorises over independent summands, is suggested as the most effective route to the result.
SupLem
We have a r.v. $X$ with pdf $f(x) = \sqrt{\theta/(\pi x)}\, e^{-\theta x}$, $x>0$, where $\theta$ is a positive parameter.
We are required to show that $2\theta X$ has a $\chi^2$ distribution with 1 d.f., and to deduce that, if $X_1,\ldots,X_n$ are independent r.v.s with this pdf, then $2\theta\sum_{i=1}^{n} X_i$ has a chi-squared distribution with $n$ degrees of freedom.
Using the transformation $y=2\theta x$ I found the pdf of $Y$ to be $f_Y(y) = \frac{1}{\sqrt{2\pi}}\, y^{-1/2} e^{-y/2}$. How do I find the distribution of $2\theta\sum_{i=1}^{n} X_i$? Do I need to find the likelihood function (which contains $\sum_{i=1}^{n} x_i$) first? And how do I recognise the d.f. of this distribution? Is it $n$ because it involves the $n$ r.v.s $x_1,\ldots,x_n$?
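As a quick numerical sanity check of the first claim (this is not part of the thread; it uses the fact, which follows from normalising constants, that the given pdf is a Gamma(1/2, rate θ) density, so $X$ can be sampled directly with numpy):

```python
# Monte Carlo sketch: sample X ~ Gamma(shape=1/2, rate=theta), whose density
# is sqrt(theta/(pi*x)) * exp(-theta*x), and check that Y = 2*theta*X has the
# chi-squared(1) moments (mean 1, variance 2). The value of theta is arbitrary.
import numpy as np

rng = np.random.default_rng(0)
theta = 2.5              # any positive value of the parameter
n_samples = 200_000

# numpy parameterises the gamma by shape and scale = 1/rate
x = rng.gamma(shape=0.5, scale=1.0 / theta, size=n_samples)
y = 2.0 * theta * x      # should be chi-squared with 1 d.f.

print(y.mean())          # should be close to 1, the mean of chi2(1)
print(y.var())           # should be close to 2, the variance of chi2(1)
```

Note that the check is independent of the particular θ chosen, which is exactly the point of the pivotal quantity $2\theta X$.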

As you've found, the pdf $f_Y$ of the r.v. $Y=2\theta X$ is
$$f_Y(y) = \frac{1}{\sqrt{2\pi}} y^{-\frac{1}{2}} e^{-\frac{y}{2}},$$
which is indeed the pdf of the $\chi^2$ distribution with 1 degree of freedom. Next, we are given that $X_1,\ldots,X_n$ are independent r.v.s with the same pdf as $X$.

First note that $2\theta X_i$ for $i=1,\ldots,n$ has the same distribution as $Y$; in other words, to solve the question you have to find the distribution of the sum of $n$ independent r.v.s $2\theta X_i$ with $2\theta X_i \sim \chi^2(1)$.

Do you need to solve this with the transformation theorem? Using the moment-generating function would lead easily to a solution, because of the independence.
 
Thank you very much for your response. Could you please elaborate on how using the moment-generating function would help us here, i.e. in finding the distribution of $2\theta\sum_{i=1}^{n} X_i$?
 
SupLem said:
Thank you very much for your response. Could you please elaborate on how using the moment-generating function would help us here, i.e. in finding the distribution of $2\theta\sum_{i=1}^{n} X_i$?

It suffices to use the moment-generating function, as it determines the distribution completely.

Denote the moment-generating function of $2\theta X_i \sim \chi^2(1)$ by $M_{2\theta X_i}(t)$, which is known for $i=1,\ldots,n$. Due to independence we have
$$M_{2\theta \sum_{i=1}^{n} X_i}(t) = \prod_{i=1}^{n} M_{2\theta X_i}(t) = (1-2t)^{-n/2}, \qquad t < \tfrac{1}{2},$$
where the last equality uses $M_{2\theta X_i}(t) = (1-2t)^{-1/2}$. This is the mgf of the $\chi^2(n)$ distribution, which answers the question: the degrees of freedom are $n$ because the sum has $n$ independent $\chi^2(1)$ summands.
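The mgf-product argument can also be sanity-checked by simulation (again not part of the thread; it assumes the Gamma(1/2, rate θ) form of the pdf so that the $X_i$ can be sampled with numpy, and checks the $\chi^2(n)$ moments, mean $n$ and variance $2n$):

```python
# Monte Carlo sketch: with n i.i.d. X_i ~ Gamma(1/2, rate=theta), the scaled
# sum 2*theta*sum(X_i) should match the chi-squared(n) moments, as the mgf
# product (1-2t)^(-n/2) predicts. theta and n are arbitrary choices here.
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 0.7, 5, 200_000

x = rng.gamma(shape=0.5, scale=1.0 / theta, size=(reps, n))
s = 2.0 * theta * x.sum(axis=1)   # one draw of 2*theta*sum(X_i) per row

print(s.mean())   # should be close to n = 5
print(s.var())    # should be close to 2n = 10
```

A Kolmogorov-Smirnov test against `scipy.stats.chi2(n)` would give a sharper check than the first two moments, if SciPy is available.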
 
