Showing stability under scaling and additivity of distributions

  • Thread starter: stukbv
  • Tags: Scaling, Stability

Homework Help Overview

The discussion revolves around properties of gamma distributions, specifically focusing on the stability under scaling and additivity of independent gamma random variables. The original poster seeks to demonstrate that if X and Y are independent gamma-distributed random variables, then their sum and scaled versions also follow specific gamma distributions.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • The original poster attempts to use moment generating functions (mgfs) to prove the additivity of independent gamma distributions. They question whether their approach is sufficient and seek clarification on how to handle the scaling property.
  • Participants discuss the proper notation for mgfs and explore the relationship between the density functions of the random variables and their scaled versions.
  • There is a suggestion to derive the density function of the scaled variable from first principles, prompting further exploration of the cumulative distribution function.

Discussion Status

The discussion is active, with participants providing insights and clarifications on the original poster's attempts. Some guidance has been offered regarding the notation and the derivation of density functions, but there is no explicit consensus on the completeness of the original poster's solution.

Contextual Notes

Participants are navigating through the definitions and properties of gamma distributions, and there are indications of potential confusion regarding notation and derivation processes. The original poster's attempts are framed within the context of homework constraints, which may limit the depth of exploration.

stukbv

Homework Statement


I need to show that if X ~ Γ(a1, B) and Y ~ Γ(a2, B), where Γ denotes the gamma distribution, then if X and Y are independent:
i) X + Y ~ Γ(a1 + a2, B)
ii) cX ~ Γ(a1, cB)


Homework Equations





The Attempt at a Solution



i) i use the mgfs of x and y and ended up with mgf(x+y) = (1/1-Bt)^(a1 + a2 )
I am told this is enough to prove i, is this correct or what could i say to make it better?
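As a quick numerical sanity check of the additivity claim (not a substitute for the mgf argument), one can simulate independent gamma variables and compare the moments of their sum against Γ(a1 + a2, B). The parameter values below are arbitrary choices for illustration, with B treated as the scale parameter, consistent with the mgf (1 - Bt)^(-a):

```python
import numpy as np

# Hypothetical parameters, chosen only for illustration.
a1, a2, B = 2.0, 3.5, 1.7
rng = np.random.default_rng(0)
n = 1_000_000

x = rng.gamma(shape=a1, scale=B, size=n)
y = rng.gamma(shape=a2, scale=B, size=n)
s = x + y

# If X + Y ~ Gamma(a1 + a2, B), then E[X + Y] = (a1 + a2) * B
# and Var(X + Y) = (a1 + a2) * B**2.
print(s.mean(), (a1 + a2) * B)      # sample mean vs theoretical mean
print(s.var(), (a1 + a2) * B**2)    # sample variance vs theoretical variance
```

A simulation like this only checks moments, of course; the mgf identity is what actually pins down the distribution.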

ii) i am a bit stuck on this one, i mean, would you just put c in front of all the x's in the pdf and then re-derive the mgf? or is there something extra?

thanks
 
I suppose your "i" is supposed to be "I", rather than the square root of -1? Anyway, your result in (i) would be OK if you wrote it properly, with brackets to make things clear. That is, instead of writing (1/1-Bt)^(a1+a2)---which equals [1 - Bt]^(a1+a2)---you should write 1/(1-Bt)^(a1+a2), or (1 - Bt)^(-a1-a2), or (1-Bt)^{-(a1+a2)}.

As to (ii): what is the problem? If f(x) is the density function of a random variable X, what is the density function of Y = c*X for a constant c? Alternatively, if F(x) is the (cumulative) distribution function of X, that is, P{X <= x} = F(x), then what is the cumulative distribution of Y = c*X? Then you can differentiate the distribution to get the density.

RGV
 
Is this where I say X = Y/c, so then I put y/c into f_X in place of all the x's and then multiply by 1/c to get f_Y?
 
If F(x) = P{X <= x} and Y = c*X (with c > 0 a constant) then P{Y <= y} = P{c*X <= y} = P{X <= y/c} = F(y/c). The density of Y is g(y) = (d/dy)F(y/c) = f(y/c)/c, where f(x) = probability density of X. Alternatively: g(y)*dy = P{y < Y < y+dy} = P{y < c*X < y+dy} = P{y/c < X < y/c + dy/c} = f(y/c)*dy/c, so g(y) = f(y/c)/c.

So, the answer to your question is YES, but I much prefer to get it from first principles.

RGV
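The change-of-variables identity above, g(y) = f(y/c)/c, can also be checked numerically against the Γ(a, cB) density written out explicitly. The helper `gamma_pdf` and the parameter values below are illustrative assumptions, using the scale parameterization consistent with the rest of the thread:

```python
import math

def gamma_pdf(x, a, scale):
    # Density of Gamma(a, scale): x**(a-1) * exp(-x/scale) / (Gamma(a) * scale**a)
    return x**(a - 1) * math.exp(-x / scale) / (math.gamma(a) * scale**a)

# Hypothetical values, chosen only for illustration.
a, B, c = 2.3, 1.4, 3.0

# g(y) = f(y/c)/c should agree pointwise with the Gamma(a, c*B) density.
for y in (0.5, 1.0, 2.0, 5.0):
    lhs = gamma_pdf(y / c, a, B) / c
    rhs = gamma_pdf(y, a, c * B)
    print(y, lhs, rhs)
```

The agreement is exact up to floating-point error, since (y/c)^(a-1) e^(-y/(cB)) / (Γ(a) B^a c) simplifies algebraically to y^(a-1) e^(-y/(cB)) / (Γ(a) (cB)^a).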
 
Ok so now I have f_Y(y) = (1/c) * (1/(Γ(a1) B^a1)) * (y/c)^(a1-1) * e^(-y/(cB))
Is that right?
 
I don't know. You have all the formulas you need.

RGV
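For what it's worth, a candidate density like the one above can be sanity-checked numerically: it should integrate to 1, and if cX ~ Γ(a1, cB) its mean should be a1*(cB). A minimal sketch with arbitrary illustrative parameters, using a simple trapezoid sum rather than any particular integration library:

```python
import math

# Hypothetical parameters, chosen only to test the candidate density.
a1, B, c = 2.0, 1.5, 2.5

def f_Y(y):
    # Candidate density from the thread:
    # (1/c) * (1/(Gamma(a1) * B**a1)) * (y/c)**(a1-1) * exp(-y/(c*B))
    return (y / c)**(a1 - 1) * math.exp(-y / (c * B)) / (math.gamma(a1) * B**a1 * c)

# Trapezoid integration on [0, 200]; the gamma tail beyond is negligible here.
n, hi = 200_000, 200.0
h = hi / n
total = sum(f_Y(i * h) for i in range(1, n)) * h + 0.5 * (f_Y(0.0) + f_Y(hi)) * h
mean = sum(f_Y(i * h) * (i * h) for i in range(1, n)) * h

print(total)              # should be close to 1
print(mean, a1 * c * B)   # should be close to a1 * (c*B)
```

Passing checks like these are consistent with cX ~ Γ(a1, cB), which is what part (ii) asks for.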
 