MHB Simplifying a Messy Question - Any Help Appreciated!

  • Thread starter: nacho-man
  • Tags: Simplifying
nacho-man
Please refer to the attached image.

Is there a way to simplify this question? It looks really messy but I have a feeling there is some nifty way around it. Surely they don't want us to integrate that entire function.

I also have no idea about part d): how can some values of $\lambda$ result in X and Y being independent, and others not?

Any help appreciated.
Thanks in advance
 

Attachments

  • mathq2.jpg (16.9 KB)
nacho said:
Please refer to the attached image.

Is there a way to simplify this question? It looks really messy but I have a feeling there is some nifty way around it. Surely they don't want us to integrate that entire function.

I also have no idea about part d): how can some values of $\lambda$ result in X and Y being independent, and others not?

Any help appreciated.
Thanks in advance

Before trying a 'brute force' approach, it may be useful to observe that the given joint distribution...

$\displaystyle f(x,y) = \frac{1}{2\ \pi\ \lambda^{3}}\ e^{- \frac{1}{2\ \lambda^{4}}\ \{(x-\lambda)^{2} + (y-\lambda)^{2} - 2\ \sqrt{1-\lambda^{2}}\ (x-\lambda)\ (y-\lambda)\}}\ (1)$

... is a particular case of the bivariate normal distribution...

$\displaystyle f(x,y) = \frac{1}{2\ \pi\ \sigma_{x}\ \sigma_{y}\ \sqrt{1- \rho^{2}}}\ e^{- \frac{z}{2\ (1-\rho^{2})}}\ (2)$

... where...

$\displaystyle z = \frac{(x-\mu_{x})^{2}}{\sigma_{x}^{2}} + \frac{(y-\mu_{y})^{2}}{\sigma_{y}^{2}} - 2\ \frac{\rho\ (x - \mu_{x})\ (y-\mu_{y})}{\sigma_{x}\ \sigma_{y}}\ (3) $

... and $\displaystyle \mu_{x}= \mu_{y} = \sigma_{x} = \sigma_{y} = \sqrt{1-\rho^{2}} = \lambda$...
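A quick numerical sanity check of this identification (just a sketch, assuming numpy and scipy are available; the helper name `f_joint` and the test value $\lambda = 0.7$ are arbitrary choices, not part of the problem):

```python
import numpy as np
from scipy.stats import multivariate_normal

def f_joint(x, y, lam):
    """Joint density exactly as written in (1) above."""
    q = ((x - lam) ** 2 + (y - lam) ** 2
         - 2.0 * np.sqrt(1.0 - lam ** 2) * (x - lam) * (y - lam))
    return np.exp(-q / (2.0 * lam ** 4)) / (2.0 * np.pi * lam ** 3)

lam = 0.7                                  # any value in (0, 1]
rho = np.sqrt(1.0 - lam ** 2)
mean = [lam, lam]
cov = [[lam ** 2, rho * lam ** 2],         # sigma_x = sigma_y = lam
       [rho * lam ** 2, lam ** 2]]

rng = np.random.default_rng(0)
pts = rng.normal(lam, 2.0, size=(100, 2))  # arbitrary test points
diff = f_joint(pts[:, 0], pts[:, 1], lam) - multivariate_normal(mean, cov).pdf(pts)
print(np.max(np.abs(diff)))                # ~1e-16: the two densities coincide
```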

Kind regards

$\chi$ $\sigma$
 
thanks chi

I am unsure what to do with this information; how am I supposed to utilise it?

I have a feeling that you want me to use known properties of a bivariate distribution, and that the function will have the same properties, just with some transformations?

Could you also suggest what area I should study in order to solve questions like these?
 
chisigma said:
Before trying a 'brute force' approach, it may be useful to observe that the given joint distribution...

$\displaystyle f(x,y) = \frac{1}{2\ \pi\ \lambda^{3}}\ e^{- \frac{1}{2\ \lambda^{4}}\ \{(x-\lambda)^{2} + (y-\lambda)^{2} - 2\ \sqrt{1-\lambda^{2}}\ (x-\lambda)\ (y-\lambda)\}}\ (1)$

... is a particular case of the bivariate normal distribution...

$\displaystyle f(x,y) = \frac{1}{2\ \pi\ \sigma_{x}\ \sigma_{y}\ \sqrt{1- \rho^{2}}}\ e^{- \frac{z}{2\ (1-\rho^{2})}}\ (2)$

... where...

$\displaystyle z = \frac{(x-\mu_{x})^{2}}{\sigma_{x}^{2}} + \frac{(y-\mu_{y})^{2}}{\sigma_{y}^{2}} - 2\ \frac{\rho\ (x - \mu_{x})\ (y-\mu_{y})}{\sigma_{x}\ \sigma_{y}}\ (3) $

... and $\displaystyle \mu_{x}= \mu_{y} = \sigma_{x} = \sigma_{y} = \sqrt{1-\rho^{2}} = \lambda$...

The main reason I mentioned the bivariate normal distribution is that we can use its very convenient properties...

Bivariate Normal Distribution -- from Wolfram MathWorld

Regarding point a), we have that the marginal distribution of X is...

$\displaystyle f_{x} (x) = \int_{- \infty}^{+ \infty} f(x,y)\ dy = \frac{1}{\sigma_{x}\ \sqrt{2\ \pi}}\ e^{- \frac{(x - \mu_{x})^{2}}{2\ \sigma_{x}^{2}}} = \frac{1}{\lambda\ \sqrt{2\ \pi}}\ e^{- \frac{(x - \lambda )^{2}}{2\ \lambda^{2}}}\ (1)$

Now point b) is a direct consequence of (1)... why?...
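As a cross-check of (1) (again just a sketch, reusing the `f_joint` helper from the sketch above; `x0` is an arbitrary test point), integrating the joint density over $y$ numerically reproduces the $N(\lambda, \lambda^{2})$ density:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Reuses f_joint from the earlier sketch.
lam, x0 = 0.7, 1.3                                  # arbitrary test values
marginal, _ = quad(lambda y: f_joint(x0, y, lam), -np.inf, np.inf)
print(marginal, norm(loc=lam, scale=lam).pdf(x0))   # the two numbers agree
```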

Kind regards

$\chi$ $\sigma$
 
chisigma said:
The main reason I mentioned the bivariate normal distribution is that we can use its very convenient properties...

Bivariate Normal Distribution -- from Wolfram MathWorld

Regarding point a), we have that the marginal distribution of X is...

$\displaystyle f_{x} (x) = \int_{- \infty}^{+ \infty} f(x,y)\ dy = \frac{1}{\sigma_{x}\ \sqrt{2\ \pi}}\ e^{- \frac{(x - \mu_{x})^{2}}{2\ \sigma_{x}^{2}}} = \frac{1}{\lambda\ \sqrt{2\ \pi}}\ e^{- \frac{(x - \lambda )^{2}}{2\ \lambda^{2}}}\ (1)$

Now point b) is a direct consequence of (1)... why?...

The knowledge of $\displaystyle f_{x} (x)$ and the following useful article...

http://mpdc.mae.cornell.edu/Courses/MAE714/biv-normal.pdf

... permit us to find the conditional distribution...

$\displaystyle f_{Y|X=x} (y) = \frac{f(x,y)}{f_{x}(x)} = \frac{1}{\sqrt{2\ \pi}\ \sigma_{y}\ \sqrt{1 - \rho^{2}}}\ e^{- \frac{1}{2\ \sigma_{y}^{2}\ (1-\rho^{2})}\ \{y - \mu_{y} - \rho\ \frac{\sigma_{y}}{\sigma_{x}}\ (x-\mu_{x})\}^{2}} = \frac{1}{\sqrt{2\ \pi}\ \lambda^{2}}\ e^{- \frac{1}{2\ \lambda^{4}}\ \{y - \lambda - \sqrt{1-\lambda^{2}}\ (x-\lambda)\}^{2}}\ (1)$

Now (1) is a normal distribution with mean $\displaystyle \mu_{y} + \rho\ \frac{\sigma_{y}}{\sigma_{x}}\ (x - \mu_{x})$ and variance $\displaystyle (1- \rho^{2})\ \sigma_{y}^{2}$, so that... $\displaystyle E \{Y|X=x\} = \lambda + \sqrt{1 - \lambda^{2}}\ (x - \lambda)\ (2)$
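Formula (2) can be verified the same way (a sketch reusing `f_joint` from the first sketch; the test values are arbitrary), computing $E\{Y|X=x\}$ as a ratio of numerical integrals:

```python
import numpy as np
from scipy.integrate import quad

# Reuses f_joint from the first sketch.
lam, x0 = 0.7, 1.3                                 # arbitrary test values
num, _ = quad(lambda y: y * f_joint(x0, y, lam), -np.inf, np.inf)
den, _ = quad(lambda y: f_joint(x0, y, lam), -np.inf, np.inf)
print(num / den)                                   # numerical E{Y|X=x0}
print(lam + np.sqrt(1.0 - lam ** 2) * (x0 - lam))  # formula (2): same value
```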

... and point b) is answered... the remaining points c) and d) shouldn't be too difficult to attack at this point... especially point d)!...

Kind regards

$\chi$ $\sigma$
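For point d), it may help to note that $\rho = \sqrt{1-\lambda^{2}}$ vanishes only at $\lambda = 1$, and with $\rho = 0$ the exponent of the bivariate normal splits into a sum, so the joint density factors into the product of the two marginals. A minimal sketch of that check (reusing `f_joint` from the first sketch; the names `f_marg`, `x0`, `y0` are arbitrary):

```python
import numpy as np
from scipy.stats import norm

def f_marg(t, lam):                        # the N(lambda, lambda^2) marginal
    return norm(loc=lam, scale=lam).pdf(t)

x0, y0 = 1.3, 0.4                          # arbitrary test point
for lam in (1.0, 0.5):
    print(lam, f_joint(x0, y0, lam), f_marg(x0, lam) * f_marg(y0, lam))
# At lambda = 1 the two values coincide (rho = 0), so X and Y are independent;
# at lambda = 0.5 they differ, so X and Y are not independent there.
```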
 
A point of clarification: how would you describe the parameters of our joint PDF in terms of the bivariate normal distribution?

Also, for part d), are X and Y independent for $\lambda = 1$?

Parameters of distributions always confuse me; I lost a huge chunk of marks on my mid-term because I don't understand what is being communicated.

Would you be able to explain that?
Thanks!

Edit: also, for part a), when finding the marginal distributions, is it sufficient for us to say $ f_{X}(x) = \int...dy = ...$, or do we actually have to show the integration? I.e., is this just a property we can use, or one which we must derive?

For part b) I got a different answer from you, and I don't know what I did wrong.

So $f_{X}(x) = \frac{1}{\lambda \sqrt{2\pi}}\ e^{-\frac{(x-\lambda)^2}{2\lambda^2}}$

and $f_{X,Y}(x,y) = \frac{1}{2 \pi \lambda^3}\ e^{...}$,

and
$\frac{f_{X,Y}(x,y)}{f_X(x)}$ should at least have $\lambda^2$ in the denominator, as opposed to a $\lambda^4$?
Additionally, my exponential had a different power from yours; I don't know how you simplified yours.
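On the $\lambda^{2}$ versus $\lambda^{4}$ point above: the ratio of the prefactors is $\frac{1}{2\pi\lambda^{3}} \div \frac{1}{\lambda\sqrt{2\pi}} = \frac{1}{\sqrt{2\pi}\,\lambda^{2}}$, so the conditional density does have $\lambda^{2}$ in front; the $\lambda^{4}$ appears only inside the exponent, where it comes from $2\sigma_{y}^{2}(1-\rho^{2}) = 2\lambda^{4}$. A tiny symbolic check of the prefactor algebra (a sketch assuming sympy is available):

```python
import sympy as sp

lam = sp.symbols('lambda', positive=True)
joint_pref = 1 / (2 * sp.pi * lam ** 3)            # prefactor of f_{X,Y}
marginal_pref = 1 / (lam * sp.sqrt(2 * sp.pi))     # prefactor of f_X
print(sp.simplify(joint_pref / marginal_pref))     # equals 1/(sqrt(2*pi)*lambda**2)
```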
 