Finding joint pdf of independent variables.

In summary, the problem asks for the joint pdf of Y1 = X1 + X2 and Y2 = X1/(X1 + X2), where X1 and X2 are independent gamma random variables. Using the transformation method, the joint pdf of (Y1, Y2) follows from the product of the individual gamma pdfs and the Jacobian of the transformation. Because that joint pdf factors into a function of y1 alone times a function of y2 alone, Y1 and Y2 are independent, and their marginal distributions follow by integrating out the other variable; the integral in y2 is a standard Beta integral, giving Y1 ~ Gamma(a1 + a2, 1) and Y2 ~ Beta(a1, a2).
  • #1
cookiesyum

Homework Statement



Suppose X1 and X2 are two independent gamma random variables, with X1 ~ Gamma(a1, 1) and X2 ~ Gamma(a2, 1).

a) Find the joint pdf of Y1 = X1 + X2, and Y2 = X1/(X1 + X2).

b) Show that Y1 and Y2 are independent.

c) Find the marginal distributions of Y1 and Y2.

The Attempt at a Solution



a) $$f_{Y_1,Y_2}(y_1, y_2) = \frac{1}{\Gamma(a_1)\Gamma(a_2)}\,(y_1 y_2)^{a_1 - 1}\,(y_1 - y_1 y_2)^{a_2 - 1}\, e^{-y_1}\, y_1$$

b) Show joint pdf = product of individual pdfs of y1 and y2. To do this you have to find the marginal distributions so...

c) To find the marginal distributions, integrate away the extra variable. So, for the pdf of y1 the integral you have to evaluate is...

$$f_{Y_1}(y_1) = \frac{e^{-y_1}\, y_1}{\Gamma(a_1)\Gamma(a_2)} \int_0^1 (y_1 y_2)^{a_1 - 1}\,(y_1 - y_1 y_2)^{a_2 - 1}\, dy_2$$

But this integral is pretty hard to do by hand (unless I'm missing a trick). So is there a better way to do this problem?
 
  • #2




Thank you for your question. I would like to offer some suggestions for approaching this problem. First, it helps to keep the definitions and properties of gamma random variables in view. Here X1 and X2 are independent gamma random variables with shape parameters a1 and a2 and a common scale of 1, so they generally follow different gamma distributions, and knowing the value of one tells you nothing about the value of the other.

To find the joint pdf of Y1 and Y2, we can use the transformation method. Invert the transformation to get X1 = Y1 Y2 and X2 = Y1(1 - Y2), compute the Jacobian of the map from (Y1, Y2) back to (X1, X2), and substitute into the joint pdf of (X1, X2), which is simply the product of the two gamma pdfs because X1 and X2 are independent. This produces exactly the expression you wrote in part (a).
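Written out, the change-of-variables steps (a sketch consistent with your part (a) expression) are:

$$x_1 = y_1 y_2, \qquad x_2 = y_1(1 - y_2), \qquad \left|\frac{\partial(x_1, x_2)}{\partial(y_1, y_2)}\right| = \left|\det\begin{pmatrix} y_2 & y_1 \\ 1 - y_2 & -y_1 \end{pmatrix}\right| = y_1,$$

$$f_{Y_1,Y_2}(y_1, y_2) = f_{X_1}(y_1 y_2)\, f_{X_2}\bigl(y_1(1-y_2)\bigr)\, y_1 = \frac{1}{\Gamma(a_1)\Gamma(a_2)}\,(y_1 y_2)^{a_1-1}\,\bigl(y_1(1-y_2)\bigr)^{a_2-1}\, e^{-y_1}\, y_1,$$

valid for $y_1 > 0$ and $0 < y_2 < 1$.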

To show that Y1 and Y2 are independent, use the definition of independence for random variables: the joint pdf must equal the product of the two marginal pdfs. Here it is enough to notice that the joint pdf from part (a) splits into a function of y1 alone times a function of y2 alone on the rectangle y1 > 0, 0 < y2 < 1, as shown in the factorization below, so Y1 and Y2 are independent.
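Grouping the y1 and y2 factors (a sketch of the factorization, using only the part (a) expression and the identity $\Gamma(a_1)\Gamma(a_2) = B(a_1,a_2)\,\Gamma(a_1+a_2)$):

$$f_{Y_1,Y_2}(y_1, y_2) = \underbrace{\frac{y_1^{a_1+a_2-1}\, e^{-y_1}}{\Gamma(a_1+a_2)}}_{\text{Gamma}(a_1+a_2,\,1)\text{ pdf in } y_1} \;\cdot\; \underbrace{\frac{\Gamma(a_1+a_2)}{\Gamma(a_1)\Gamma(a_2)}\, y_2^{a_1-1}(1-y_2)^{a_2-1}}_{\text{Beta}(a_1,\,a_2)\text{ pdf in } y_2}.$$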

To find the marginal distributions of Y1 and Y2, integrate the joint pdf over the other variable: over y2 (from 0 to 1) for the marginal of Y1, and over y1 (from 0 to infinity) for the marginal of Y2. The integral you were worried about is not as hard as it looks: once you pull the factors involving y1 outside, what remains in y2 is a standard Beta integral, so it can be evaluated in closed form and no numerical integration is needed.
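Concretely, starting from the integral you set up and using the Beta integral $\int_0^1 t^{a_1-1}(1-t)^{a_2-1}\,dt = \Gamma(a_1)\Gamma(a_2)/\Gamma(a_1+a_2)$:

$$f_{Y_1}(y_1) = \frac{e^{-y_1}\, y_1^{a_1+a_2-1}}{\Gamma(a_1)\Gamma(a_2)} \int_0^1 y_2^{a_1-1}(1-y_2)^{a_2-1}\, dy_2 = \frac{y_1^{a_1+a_2-1}\, e^{-y_1}}{\Gamma(a_1+a_2)}, \qquad y_1 > 0,$$

so Y1 ~ Gamma(a1 + a2, 1). Integrating the same joint pdf over y1 instead (a Gamma integral in y1) gives

$$f_{Y_2}(y_2) = \frac{\Gamma(a_1+a_2)}{\Gamma(a_1)\Gamma(a_2)}\, y_2^{a_1-1}(1-y_2)^{a_2-1}, \qquad 0 < y_2 < 1,$$

i.e. Y2 ~ Beta(a1, a2).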

I hope this helps you approach the problem in a more systematic and efficient manner. Good luck with your work!
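If you want a quick numerical sanity check (rather than numerical integration), a short Monte Carlo sketch with numpy can compare sample moments of Y1 and Y2 against the Gamma(a1 + a2, 1) and Beta(a1, a2) candidates; the shape parameters below are arbitrary values chosen only for illustration.

```python
import numpy as np

# Illustrative shape parameters (assumed values, not from the problem statement)
a1, a2 = 2.5, 4.0
n = 1_000_000

rng = np.random.default_rng(0)
x1 = rng.gamma(a1, 1.0, n)   # X1 ~ Gamma(a1, scale=1)
x2 = rng.gamma(a2, 1.0, n)   # X2 ~ Gamma(a2, scale=1)

y1 = x1 + x2                 # conjectured Gamma(a1 + a2, 1)
y2 = x1 / (x1 + x2)          # conjectured Beta(a1, a2)

# Gamma(a1 + a2, 1): mean = a1 + a2, variance = a1 + a2
print("Y1 mean/var:", y1.mean(), y1.var(), "expected:", a1 + a2, a1 + a2)

# Beta(a1, a2): mean = a1/(a1+a2), variance = a1*a2 / ((a1+a2)^2 (a1+a2+1))
b_mean = a1 / (a1 + a2)
b_var = a1 * a2 / ((a1 + a2) ** 2 * (a1 + a2 + 1))
print("Y2 mean/var:", y2.mean(), y2.var(), "expected:", b_mean, b_var)

# Independence check: sample correlation should be near zero
print("corr(Y1, Y2):", np.corrcoef(y1, y2)[0, 1])
```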


 

FAQ: Finding joint pdf of independent variables.

What is a joint probability density function (pdf)?

A joint probability density function (pdf) is a mathematical function that describes the relative likelihood of two or more random variables taking on specific values simultaneously. It is used to model the joint behavior of multiple variables, and probabilities of events involving these variables are obtained by integrating it over the corresponding region.

How is the joint pdf of independent variables calculated?

For independent variables, the joint pdf is calculated by multiplying the individual (marginal) probability density functions together. This is essentially the definition of independence for continuous random variables: knowing the value of one variable does not change the distribution of the others, so the joint density factors into the product of the marginal densities.
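For example (a small illustration with two independent standard exponential variables, chosen only for concreteness):

$$f_{X,Y}(x, y) = f_X(x)\, f_Y(y) = e^{-x}\, e^{-y} = e^{-(x+y)}, \qquad x > 0,\; y > 0.$$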

What is the difference between joint pdf and marginal pdf?

Joint pdf describes the probability of multiple variables occurring together, while marginal pdf describes the probability of a single variable occurring without considering the other variables. In other words, the joint pdf considers all variables together, while the marginal pdf focuses on one variable at a time.

Can the joint pdf of independent variables be used to calculate conditional probabilities?

Yes. In general, the conditional pdf is the joint pdf divided by the marginal pdf of the conditioning variable. For independent variables this ratio collapses to the marginal pdf of the variable of interest, so conditioning on one variable does not change the distribution of the other.
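In symbols (a general fact about independent continuous variables, not specific to this thread):

$$f_{X \mid Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)} = \frac{f_X(x)\, f_Y(y)}{f_Y(y)} = f_X(x).$$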

In what situations is it useful to find the joint pdf of independent variables?

Finding the joint pdf of independent variables is useful in situations where we want to understand the relationship between multiple variables and their combined effects on an outcome. It is commonly used in statistics, economics, and other fields to model and analyze complex systems with multiple variables.
