Characteristic Function of a Sum of Random Variables

In summary: the characteristic function of A = X - 2W + 3Y is φ_A(t) = p^3 / {[1-(1-p)exp(it)][1-(1-p)exp(-2it)][1-(1-p)exp(3it)]}, and the conditional distribution of X given X + W = n is discrete uniform on {0, 1, ..., n}.
  • #1
cutesteph

Homework Statement


Let X, W, Y be iid with a common geometric pmf f_X(x) = p(1-p)^x for x a nonnegative integer, where p is in the interval (0,1).

What is the characteristic function of A= X-2W+3Y ?

Determine the family of the conditional distribution of X given X+W?

Homework Equations


The characteristic function of the geometric distribution is p/[1-(1-p)exp(it)].


The Attempt at a Solution


The characteristic function of a sum of random variables is the product of the individual characteristic functions.

So I need to find the characteristic functions of X, -2W, and 3Y and multiply them together?
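For a sanity check of this approach, here is a quick simulation sketch (the sample size and p = 0.4 are arbitrary choices, and it assumes the failures-before-first-success parameterization above). It compares the empirical characteristic function of A with the product φ_X(t)·φ_X(-2t)·φ_X(3t), using the fact that the CF of cX is φ_X(ct):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.4

def phi_geom(t, p):
    """CF p / (1 - (1-p) e^{it}) of the geometric distribution
    counting failures before the first success."""
    return p / (1 - (1 - p) * np.exp(1j * t))

n = 200_000
# numpy's geometric counts trials (support starts at 1), so subtract 1
X = rng.geometric(p, n) - 1
W = rng.geometric(p, n) - 1
Y = rng.geometric(p, n) - 1
A = X - 2 * W + 3 * Y

t = 0.7
empirical = np.mean(np.exp(1j * t * A))           # Monte Carlo E[e^{itA}]
theoretical = phi_geom(t, p) * phi_geom(-2 * t, p) * phi_geom(3 * t, p)
print(abs(empirical - theoretical))               # small sampling error
```

The two values agree up to Monte Carlo error, which supports multiplying the three individual characteristic functions.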
 
  • #2
Sounds right to me. (It requires them to be independent, which they are.)
 
  • #3
This may sound like a stupid question, but how do I get the distribution of a discrete random variable when it is multiplied by a constant? Do I simply substitute the value in for the variable? Like, if I had a binomial random variable X ~ Bin(n, p), with pmf C(n, i) p^i (1-p)^(n-i) for i = 0, 1, 2, ..., n, how would I find the distribution of 5X?
 
  • #4
cutesteph said:
This may sound like a stupid question, but how do I get the distribution of a discrete random variable when it is multiplied by a constant? Do I simply substitute the value in for the variable? Like, if I had a binomial random variable X ~ Bin(n, p), with pmf C(n, i) p^i (1-p)^(n-i) for i = 0, 1, 2, ..., n, how would I find the distribution of 5X?
Maybe, but I wouldn't assume that. Start with the def of characteristic function and try to work it out from first principles. Shouldn't be hard.
 
  • #5
cutesteph said:
This may sound like a stupid question, but how do I get the distribution of a discrete random variable when it is multiplied by a constant? Do I simply substitute the value in for the variable? Like, if I had a binomial random variable X ~ Bin(n, p), with pmf C(n, i) p^i (1-p)^(n-i) for i = 0, 1, 2, ..., n, how would I find the distribution of 5X?

Of course, discrete random variables do not have densities, but they do have probability mass functions and (cumulative) distribution functions. If p(k), k in K, is a discrete pmf (so that P{X = k} = p(k)), then for Y = 5X the pmf is P{Y = j} = P{5X = j} = P{X = j/5} = p(j/5), but only for values of j such that j/5 is in K.
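As a concrete sketch of that relabeling (the choice of Bin(4, 0.3) is arbitrary, just for illustration): multiplying by 5 moves the support from {0, ..., n} to {0, 5, ..., 5n} without changing any probability.

```python
from math import comb

n, p = 4, 0.3
# pmf of X ~ Bin(4, 0.3): P{X = k} = C(n, k) p^k (1-p)^(n-k)
pmf_X = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

# Y = 5X just relabels the support; the probabilities are unchanged.
pmf_Y = {5 * k: prob for k, prob in pmf_X.items()}

assert abs(sum(pmf_Y.values()) - 1) < 1e-9
assert pmf_Y[10] == pmf_X[2]   # P{Y = 10} = P{X = 2}
```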

RGV
 
  • #6
So for example, if I wanted 2X where X ~ Bin(5, 1/2), i.e. C(5, x)(1/2)^x (1/2)^(5-x) for x = 0, 1, 2, 3, 4, 5, then 2X would be Bin(2, 1/2), i.e. C(2, x)(1/2)^x (1/2)^(2-x) for x = 0, 1, 2?

How would I find a joint probability mass function of two binomials? Let's say one was Y = 2X and the other was Z = 3X, and I want the joint pmf of (Y, Z)?
 
Last edited:
  • #7
cutesteph said:
So for example, if I wanted 2X where X ~ Bin(5, 1/2), i.e. C(5, x)(1/2)^x (1/2)^(5-x) for x = 0, 1, 2, 3, 4, 5, then 2X would be Bin(2, 1/2), i.e. C(2, x)(1/2)^x (1/2)^(2-x) for x = 0, 1, 2?

How would I find a joint probability mass function of two binomials? Let's say one was Y = X + 2X and the other was Z = -2X + 3X, and I want the joint pmf of (Y, Z)?

No. If X ~ Bin(5,1/2) and Y = 2X, we have P{Y = j} = P{X = j/2} = C(5,j/2)/2^5 for j = 0,2,4,6,8,10, and P{Y = j} = 0 for all other j.

RGV
 
  • #8
Also, would the characteristic function of A in my original question be phi(t) = [p/(1-(1-p)exp(it))] [p/(1-(1-p)exp(-2it))] [p/(1-(1-p)exp(3it))]?
 
  • #9
Ray Vickson said:
No. If X ~ Bin(5,1/2) and Y = 2X, we have P{Y = j} = P{X = j/2} = C(5,j/2)/2^5 for j = 0,2,4,6,8,10, and P{Y = j} = 0 for all other j.

RGV

So what happens when I have binomials with different parameters, like X + Y in the quoted example?
 
  • #10
cutesteph said:
So what happens when I have binomials with different parameters, like X + Y in the quoted example?

You know the basic formulas for probabilities of a sum of independent random variables, and you know the probability mass functions of the individual random variables. Just put it all together yourself.

RGV
 
  • #11
Ray Vickson said:
You know the basic formulas for probabilities of a sum of independent random variables, and you know the probability mass functions of the individual random variables. Just put it all together yourself.

RGV

Got it. Thanks.



Determine the family of the conditional distribution of X given X+W?

X + W is negative binomial(2, p), since the product of the moment generating functions of X and W is the moment generating function of a negative binomial(2, p). We can take the product because X and W are independent, and moment generating functions are unique.

We know the negative binomial counts the number of failures preceding the second success (in this case) in a sequence of Bernoulli trials.

The geometric distribution, in this parameterization, counts the number of failures before the first success.


What does it mean by family of the conditional distribution?
 
  • #12
cutesteph said:
What does it mean by family of the conditional distribution?
I would assume it's asking whether it's binomial or whatever.
It would probably be useful to think about the joint distribution of X and X+W.
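Following that hint, here is a minimal numerical sketch (p = 0.3 and n = 5 are arbitrary choices) that tabulates the joint pmf P{X = k, X + W = n} for X, W iid geometric and normalizes it to get the conditional pmf:

```python
p = 0.3

def geom_pmf(k, p):
    """P{X = k} = p (1-p)^k for k = 0, 1, 2, ... (failures before success)."""
    return p * (1 - p)**k

n = 5
# By independence, P{X = k, X + W = n} = P{X = k} P{W = n - k} for k = 0..n
joint = [geom_pmf(k, p) * geom_pmf(n - k, p) for k in range(n + 1)]
total = sum(joint)                       # P{X + W = n}
conditional = [j / total for j in joint] # P{X = k | X + W = n}
print(conditional)                       # every entry is 1/(n+1)
```

Each joint term equals p^2 (1-p)^n regardless of k, so the conditional pmf is constant: discrete uniform on {0, 1, ..., n}.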
 

1. What is the characteristic function of a sum of random variables?

The characteristic function of a sum of random variables is a mathematical function that determines the distribution of the sum of two or more random variables. It is defined as the expected value of the complex exponential exp(itS), where S is the sum of the random variables.

2. How is the characteristic function of a sum of random variables calculated?

The characteristic function of a sum of random variables can be calculated by taking the product of the individual characteristic functions of the random variables. In other words, if X and Y are two random variables with characteristic functions φX(t) and φY(t), then the characteristic function of their sum, Z = X + Y, is φZ(t) = φX(t)φY(t).

3. What is the importance of the characteristic function in probability theory?

The characteristic function is an important tool in probability theory because it allows us to study the properties of random variables and their sums without having to directly work with their probability distributions. It also provides a way to easily calculate moments and other statistical properties of a random variable.

4. How does the characteristic function of a sum of random variables relate to the central limit theorem?

The central limit theorem states that the sum of a large number of independent and identically distributed random variables tends towards a normal distribution. The characteristic function of a sum of random variables plays a crucial role in proving this theorem, as it allows us to show that the characteristic function of the sum converges to the characteristic function of a normal distribution.

5. Can the characteristic function of a sum of random variables be used to find the distribution of the sum?

Yes, the characteristic function of a sum of random variables can be used to find the distribution of the sum through the inverse Fourier transform. This allows us to determine the probability distribution of the sum without having to use complicated mathematical techniques, making it a useful tool in statistical analysis and modeling.
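As a sketch of that inversion for an integer-valued variable (using the thread's geometric distribution, with an arbitrary p = 0.4): the formula P{X = k} = (1/2π) ∫_{-π}^{π} φ(t) e^{-ikt} dt becomes a plain average of the integrand over a uniform grid when evaluated by a midpoint rule.

```python
import numpy as np

p = 0.4
phi = lambda t: p / (1 - (1 - p) * np.exp(1j * t))  # geometric CF

# Midpoint rule on [-pi, pi]: (1/2π)∫ φ(t) e^{-ikt} dt ≈ grid average.
N = 20000
t = -np.pi + (np.arange(N) + 0.5) * (2 * np.pi / N)
probs = [np.mean(phi(t) * np.exp(-1j * k * t)).real for k in range(4)]
print(probs)  # approximately [p, p(1-p), p(1-p)^2, p(1-p)^3]
```

The integrand is smooth and 2π-periodic, so the midpoint rule converges very quickly and the recovered values match the geometric pmf p(1-p)^k.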
