Correlation coefficient in conditional distribution

Discussion Overview

The discussion revolves around deriving the correlation coefficient in the context of conditional distributions, specifically focusing on the relationship between random variables Z and Θ. Participants explore the implications of independence, covariance, and the formulas for expectation and variance.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant attempts to derive the correlation coefficient and presents a formula involving covariance and variance.
  • Another participant corrects the initial interpretation, clarifying that the correlation coefficient should be represented as ρ = σ / √(1 + σ²) rather than σ² / √(1 + σ²).
  • A later reply acknowledges the correction and confirms the intended representation of ρ.
  • One participant inquires about the derivation of formulas for conditional expectation and variance, suggesting a need for clarity on the relationship between Θ and X.
  • Another participant emphasizes the importance of stating the relationship between Θ and X and suggests visual aids and a structured approach to solving the problem.
  • There is a mention of minimizing Linear Least Mean Squared error as a method to derive the conditional expected value.

Areas of Agreement / Disagreement

Participants generally agree on the correction regarding the representation of the correlation coefficient. However, there is no consensus on the derivation of the formulas for conditional expectation and variance, and the relationship between Θ and X remains unclear.

Contextual Notes

Some participants express uncertainty regarding the assumptions needed for the relationships between the variables, particularly concerning independence and the implications for covariance and variance calculations.

georg gill

Can someone derive: ##\frac{Cov(Z+\Theta,\Theta)}{\sqrt{Var(Z+\Theta)Var(\Theta)}}=\frac{\sigma ^2}{\sqrt{1+\sigma ^2}}##

My attempt:

Numerator:

##Cov(X,Y)=E[(X-E(X))(Y-E(Y))]=E[(Z+\Theta-E[Z+\Theta])(\Theta-\mu)]##

The denominator is pretty simple:

##\sqrt{(1+\sigma ^2)\sigma ^2}##
 
I presume you actually want ##\rho##, which has a value of ##\frac{\sigma}{\sqrt{1 + \sigma^2}}##, not ##\frac{\sigma^2}{\sqrt{1 + \sigma^2}}##.

key ideas:
1.) Break this up into small manageable subproblems

2.) Remember implications of independence between ##Z## and ##\Theta##, i.e. they have zero covariance which also means their combined variance is just ##var(Z) =1## plus ##var(\Theta) = \sigma^2##

-- numerator --
in general for covariance of two random variables, A,B, we have
##cov\big(A, B\big) = E[(A)(B)] - E[(A)]E[(B)]##

split this up into two lines

##E[(Z + \Theta)(\Theta)] = E[Z \Theta] + E[\Theta^2]##
##E[Z + \Theta]E[\Theta] = E[Z]E[\Theta]+ E[\Theta]E[\Theta]##

notice that our actual expression for the numerator is the first line minus the second

## = \big(E[Z \Theta] + E[\Theta^2]\big)- \big(E[Z]E[\Theta]+ E[\Theta]E[\Theta]\big) =\big(E[Z \Theta]- E[Z]E[\Theta]\big) + \big(E[\Theta^2]- E[\Theta]E[\Theta]\big)##
##= \big(cov(Z,\Theta)\big) + \big(var(\Theta)\big) = \big(0\big) + \big(\sigma^2\big) = \sigma^2##

-- denominator --

##\sqrt{\Big(\big(1 + \sigma^2\big)\big(\sigma^2\big)\Big)} = \sqrt{\big(1 + \sigma^2\big)}\sqrt{\big(\sigma^2\big)} = \sigma\sqrt{\big(1 + \sigma^2\big)}##

-- combine numerator and denominator --

##\rho = \frac{\sigma^2}{\sigma\sqrt{\big(1 + \sigma^2\big)}} = \frac{\sigma}{\sqrt{\big(1 + \sigma^2\big)}}##
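If it helps, the algebra can be sanity-checked numerically. This is just a sketch: it assumes ##Z \sim N(0,1)## and ##\Theta \sim N(\mu, \sigma^2)## independent (the correlation formula itself only needs independence and those variances, not normality), and the values of ##\mu## and ##\sigma## are arbitrary:

```python
# Monte Carlo check of rho = sigma / sqrt(1 + sigma^2)
# Assumes Z ~ N(0, 1) and Theta ~ N(mu, sigma^2), independent.
import numpy as np

rng = np.random.default_rng(0)
n, mu, sigma = 1_000_000, 2.0, 1.5

z = rng.normal(0.0, 1.0, n)        # Var(Z) = 1
theta = rng.normal(mu, sigma, n)   # Var(Theta) = sigma^2

# sample correlation of (Z + Theta, Theta)
rho_hat = np.corrcoef(z + theta, theta)[0, 1]
rho_formula = sigma / np.sqrt(1 + sigma**2)

print(rho_hat, rho_formula)  # the two should agree closely for large n
```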
 
StoneTemplePython said:
I presume you actually want ##\rho## which has a value of ##\frac{\sigma}{\sqrt{1 + \sigma^2}}## not ##\frac{\sigma^2}{\sqrt{1 + \sigma^2}}##

Yes sorry I meant ##\rho## which has a value of ##\frac{\sigma}{\sqrt{1 + \sigma^2}}##. Thanks for the answer!
 
How do they arrive at the formulas for expectation and variance in the end? I am thinking about:

##E[\Theta|X=x]=E[\Theta]+\rho\sqrt{\frac{Var(\Theta)}{Var(X)}}(x-E[X])##

and

##Var(\Theta|X=x)=Var(\Theta)(1-\rho^2)##

Where do these formulas come from? Can someone derive them? I follow the calculations they do with them.
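One way to build intuition for those two formulas is a numerical check. Note this assumes, as in the earlier part of the thread, that ##X = Z + \Theta## with ##Z \sim N(0,1)## and ##\Theta \sim N(\mu, \sigma^2)## independent; the thread has not stated the ##\Theta##-##X## relationship explicitly, so this is only an illustration under that assumption:

```python
# Numerical check of E[Theta | X = x] and Var(Theta | X = x)
# under the ASSUMPTION X = Z + Theta, Z ~ N(0,1), Theta ~ N(mu, sigma^2) indep.
import numpy as np

rng = np.random.default_rng(1)
n, mu, sigma = 2_000_000, 2.0, 1.5
theta = rng.normal(mu, sigma, n)
x = rng.normal(0.0, 1.0, n) + theta   # X = Z + Theta

rho = sigma / np.sqrt(1 + sigma**2)   # Corr(X, Theta) from the earlier post
x0 = 3.0
mask = np.abs(x - x0) < 0.05          # thin slice: samples with X near x0

cond_mean_hat = theta[mask].mean()    # empirical E[Theta | X ~ x0]
cond_var_hat = theta[mask].var()      # empirical Var(Theta | X ~ x0)

# formulas from the post: E[X] = mu, Var(X) = 1 + sigma^2
cond_mean = mu + rho * np.sqrt(sigma**2 / (1 + sigma**2)) * (x0 - mu)
cond_var = sigma**2 * (1 - rho**2)

print(cond_mean_hat, cond_mean)
print(cond_var_hat, cond_var)
```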
 
You should start by stating the relationship between ##\Theta## and ##X##. I didn't think this was directly needed when dealing with ##Z## and ##\Theta##, but it seems vital now, and I don't see it clearly stated anywhere; ##\Theta## and ##X## seem to be introduced in your exercise 8b as if you are familiar with the relationship from the text, or perhaps from example 8a. Consider this a "for the avoidance of doubt, the relationship is ____" type of statement.

After stating the relationship, it may be prudent to try to draw this out via a picture. Then make an attempt at solving this, either using their stated approach or via direct application of Bayes.

The reality is, once you've stated and formulated everything, the expected values should be easy -- and if you get stuck, following the units should be helpful here as well.

I could weigh in after all of this if needed. But you should be able to (a) clearly state the relationship between ##\Theta## and ##X## and (b) make some progress on your own first.

- - - -
edit:

While I still think a lot more details should be provided, I knew this equation looked familiar. Your problem is apparently minimizing the Linear Least Mean Squares (LLMS) error, i.e. you are trying to minimize

##E\big[(\Theta - aX - b)^2\big]##

Do the calculus on this minimization problem (i.e. optimize with respect to ##a## and ##b##) and you'll recover the abstract form of your conditional expected value.
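For completeness, a sketch of that minimization (assuming only that ##X## and ##\Theta## have finite variances). Setting the partial derivatives of ##f(a,b) = E\big[(\Theta - aX - b)^2\big]## to zero:

##\frac{\partial f}{\partial b} = -2\,E[\Theta - aX - b] = 0 \;\Longrightarrow\; b = E[\Theta] - a\,E[X]##

##\frac{\partial f}{\partial a} = -2\,E\big[X(\Theta - aX - b)\big] = 0 \;\Longrightarrow\; Cov(X,\Theta) = a\,Var(X)##

(the second implication after substituting ##b##). Hence ##a = \frac{Cov(X,\Theta)}{Var(X)} = \rho\sqrt{\frac{Var(\Theta)}{Var(X)}}##, and the fitted line is

##aX + b = E[\Theta] + \rho\sqrt{\frac{Var(\Theta)}{Var(X)}}\,(X - E[X]),##

which matches the stated form of the conditional expectation.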
 
