Conditional PDF with multiple random variables


Homework Help Overview

The discussion revolves around finding the conditional probability density function (PDF) f(d|s) for the random variable D, defined as D = (L + E) / S, where L, E, and S are mutually independent normally distributed random variables. Participants are exploring the implications of having multiple random variables and the relationships between them.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants discuss treating "s" as a constant in the context of the conditional PDF and question the implications of adding L and E as normally distributed variables.
  • The joint probability distribution f(d,s) is explored, along with the need to derive it for further analysis.
  • Some participants express confusion about the number of variables involved and the definitions of random variables and conditional probabilities.

Discussion Status

The discussion is active, with participants providing guidance on definitions and approaches to finding the joint PDF. There is acknowledgment of misunderstandings and clarifications regarding the properties of normal distributions and the relationships between the random variables. Multiple interpretations of the problem are being explored, and participants are encouraged to revisit foundational concepts.

Contextual Notes

Participants note the complexity introduced by having four variables and the specific conditions of mutual independence and normal distribution for L, E, and S. There is a recognition of the need for clarity in definitions and the potential for confusion when applying conditional probability concepts.

SIE_tp

Homework Statement

D = (L + E) / S
Where L, E, and S are mutually independent random variables that are each normally distributed.

I need to find (symbolically), the conditional PDF f(d|s).


Homework Equations

The Attempt at a Solution

Not sure what to do with so many variables... I'm guessing that I can treat "s" as a constant since it's "given" for the conditional PDF. I also know that adding L + E will result in a normally distributed random variable. So D is also a random variable, right?

I tried to use Bayes' Rule and also the definition of conditional probability - didn't help.

I would be willing to bet that I need to integrate something...

THANK YOU for any guidance you can provide!
 
Physics news on Phys.org
Hello,

You seem to be generally confused, e.g.

I'm guessing that I can treat "s" as a constant since it's "given" for the conditional PDF
You shouldn't guess, of course

I also know that adding L + E will result in a normally distributed random variable.
Why? Is anything at all given about L and E? What if they were both 0, then the sum is 0 and that's hardly a normal distribution.

So D is also a random variable, right?
I don't know how this follows from the previous, but it is true, though it should be more obvious why.

Of course, I don't mean to come across as belittling: I point out why I consider you generally confused because I want to help. When I'm generally confused, I find it always helps to go back to the definitions. For example, go back to the definition of a random variable: verify and understand why D indeed is a random variable.
Then look up the definition of something like f(d|s). Most probably your book will have defined it as
[tex]f(d|s) = \frac{f(d,s)}{f(s)}[/tex]
so by definition we need f(d,s) and f(s) [or you could try to use something like Bayes' rule, as you did, but that also uses things of the form f(x|y) so you can guess that that won't simplify matters]

Now the question has come to: how do I get f(d,s) and f(s). Your integration radar is indeed correct: it is the clue to understanding that we only need to get f(d,s). Is it obvious how f(s) follows from f(d,s)?
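If it helps, here is a small numerical sketch of how a marginal falls out of a joint density. The joint density used here is made up (a standard bivariate normal, which is not the joint density of your problem), and Python is just a convenient way to see the integration step:

```python
import numpy as np

# Hypothetical joint density: standard bivariate normal with correlation rho.
# This is NOT the joint density of the original problem, just an illustration.
rho = 0.5

def f_joint(d, s):
    norm = 1.0 / (2 * np.pi * np.sqrt(1 - rho**2))
    return norm * np.exp(-(d**2 - 2 * rho * d * s + s**2) / (2 * (1 - rho**2)))

# Marginal f(s) = integral of f(d, s) over d, done as a Riemann sum on a grid.
d = np.linspace(-10, 10, 4001)
s = 1.3
f_s_numeric = np.sum(f_joint(d, s)) * (d[1] - d[0])

# For this particular joint density, the true marginal of S is standard normal.
f_s_exact = np.exp(-s**2 / 2) / np.sqrt(2 * np.pi)
print(f_s_numeric, f_s_exact)  # the two values agree closely
```

The point is only the last step: once you have f(d,s), integrating out d gives f(s), and dividing gives the conditional.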

Now the problem has been reduced to "what is the joint probability distribution of D and S?".

You most probably saw a method to get a probability distribution of functions of random variables. If this doesn't ring a bell, reread your course and give it some thought :)

I'll be here if you have more questions.
 
I appreciate your attempt to help, but I did indeed find your comments a bit condescending. You seemed to be aware of your tone, but I wanted to confirm.

I know the definition of conditional probability, of course, AND attempted to apply it - mentioned in my original post.

If this problem had two random variables, I would be good to go. I am confused because there are FOUR variables, D being dependent upon L, E, and S (recall - L, E, S given as having a normal distribution).

To answer your question - what I meant was that if I add f(l) and f(e) as normally distributed RVs, I will get another normally distributed RV. Sorry for being unclear.
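A quick simulation backs that up (the means and standard deviations here are made up, since the problem leaves them symbolic):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical parameters for L and E; the problem statement gives none.
L = rng.normal(1.0, 2.0, n)   # L ~ N(1, 2^2)
E = rng.normal(-0.5, 1.5, n)  # E ~ N(-0.5, 1.5^2)
T = L + E                     # sum of independent normals:
                              # T ~ N(0.5, 2^2 + 1.5^2) = N(0.5, 6.25)

print(T.mean(), T.var())  # close to 0.5 and 6.25
```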

If I use the standard definition of conditional probability for PDFs, then yes, I need to find the joint PDF of f(d,s). I don't know how to do that from D = (L + E)/S. I already have f(s) - normally distributed RV, so I do not need to derive it by integrating f(d,s) over d.

I welcome any other advice you have.
 
I already have f(s) - normally distributed RV, so I do not need to derive it by integrating f(d,s) over d.
Good point, I erred there.

Also, I need to apologize for my line: "Why? Is anything at all given about L and E? What if they were both 0, then the sum is 0 and that's hardly a normal distribution."
I had overlooked you stating "Where L, E, and S are mutually independent random variables that are each normally distributed."!

Anyway, back to the problem:

If this problem had two random variables, I would be good to go. I am confused because there are FOUR variables, D being dependent upon L, E, and S.
Have you seen something of the following in your course: given two random variables X & Y with density f(X,Y), then if g and h are functions such that g(X,Y) and h(X,Y) are again random variables, we can express the joint probability distribution of g(X,Y) and h(X,Y) in terms of f (the formula uses the Jacobian)?
If this sounds very unfamiliar, then probably a less general approach will suffice, but I wanted to check this first.
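For reference, here is a sketch of that change-of-variables formula in its general form (the standard statement, not the full worked solution): if (U,V) = (g(X,Y), h(X,Y)) is invertible with inverse (x(u,v), y(u,v)), then
[tex]f_{U,V}(u,v) = f_{X,Y}\bigl(x(u,v),\, y(u,v)\bigr)\left|\det\frac{\partial(x,y)}{\partial(u,v)}\right|[/tex]
One convenient route here is to first set T = L + E (itself normal, as noted above) and transform (T, S) into (D, S) = (T/S, S). The inverse is (t, s) = (ds, s), whose Jacobian determinant is s, so by independence of T and S
[tex]f_{D,S}(d,s) = f_T(ds)\, f_S(s)\, |s|[/tex]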
 
Your "guess" is correct, and is easy to justify, depending on how you define conditional densities. Take the case of a trivariate density f(x,y,z). Suppose we *define* the conditional density f(x,y|z) so that
[tex]f(x,y|z)\,dx\,dy = \lim_{h \to 0} \Pr\{x < X < x + dx,\; y < Y < y + dy \mid z < Z < z + h\}[/tex]
Then, as h goes to zero, we have f(x,y|z) = C*f(x,y,z), where C depends only on z (it is a constant as far as x and y are concerned). Basically, C is a normalization constant that ensures the x,y integral of the conditional PDF is 1. Of course, C = 1/f_Z(z), where f_Z is the marginal density of Z.
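As a numerical sanity check of the "treat s as a constant" point (with made-up parameters, since the problem is symbolic): conditioning on S = s amounts to fixing s, so D given S = s behaves like (L + E)/s, a scaled normal.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Hypothetical parameters; the problem statement leaves them symbolic.
mu_L, sd_L = 1.0, 2.0
mu_E, sd_E = -0.5, 1.5
s = 2.0  # the conditioning value of S

# With S fixed at s, D = (L + E) / s is just a scaled sum of normals.
D = (rng.normal(mu_L, sd_L, n) + rng.normal(mu_E, sd_E, n)) / s

mean_expected = (mu_L + mu_E) / s                   # 0.25
sd_expected = np.sqrt(sd_L**2 + sd_E**2) / abs(s)   # 1.25
print(D.mean(), D.std())  # close to 0.25 and 1.25
```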

RGV
 