Probability Homework Question 2

  • Context: MHB
  • Thread starter: pinky14
  • Tags: Homework, Probability
Discussion Overview

The discussion revolves around a probability homework question concerning the minimization of the variance of a linear combination of two uncorrelated random variables, X and Y, which are measurements of an unknown quantity μ. Participants explore the implications of the variables' means and variances, and the conditions for uncorrelatedness.

Discussion Character

  • Homework-related
  • Mathematical reasoning
  • Conceptual clarification

Main Points Raised

  • Some participants note that the covariance of uncorrelated random variables X and Y is zero, leading to the simplification of the variance formula for W.
  • There is a discussion about the expression for the variance of W, specifically that it can be expressed as $\sigma_W^2 = \alpha^2 \sigma_X^2 + (1 - \alpha)^2 \sigma_Y^2$ when the covariance is zero.
  • One participant questions whether to take the derivative to find the value of α that minimizes the variance, indicating uncertainty about the approach to solving the problem.
  • Another participant emphasizes the need to find α within the interval [0,1] that minimizes the quadratic function derived from the variance expression.

Areas of Agreement / Disagreement

Participants generally agree on the definition of uncorrelated random variables and the resulting implications for covariance. However, there remains uncertainty about the specific steps to minimize the variance and the interpretation of the problem.

Contextual Notes

Some participants express confusion regarding the application of the variance formula and the process of minimization, indicating potential gaps in understanding the underlying concepts.

pinky14
Suppose that $X$ and $Y$ are uncorrelated random variables which are each measurements of some unknown quantity $\mu$. Both random variables have $\mu_{X} = \mu_{Y} = \mu$, but $\sigma^2_{X} > \sigma^2_{Y}$. Determine the value of $\alpha \in [0, 1]$ which will minimize the variance of the random variable $W = \alpha X + (1 - \alpha)Y$. Note that $E(W) = \mu$ for any $\alpha$, so the minimal-variance $\alpha$ gives the "best" linear combination of $X$ and $Y$ to use in estimating $\mu$.
 
Hi pinky14,

In the future, please show your thoughts or what you've tried.

Show that $\sigma_W^2 = \alpha^2 \sigma_X^2 + (1 - \alpha)^2\sigma_Y^2 + 2\alpha(1 - \alpha)\operatorname{Cov}(X,Y)$. Since $X$ and $Y$ are uncorrelated, what can you say about $\operatorname{Cov}(X,Y)$?
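As an aside (not part of the original exchange), the expansion of $\sigma_W^2$ can be verified numerically on a small discrete joint distribution; the values below are arbitrary and chosen only for illustration:

```python
# Check Var(W) = a^2 Var(X) + (1-a)^2 Var(Y) + 2a(1-a) Cov(X, Y)
# exactly, using a small discrete joint pmf (illustrative values only).

# Joint pmf: list of (x, y, p) triples whose probabilities sum to 1.
joint = [(0.0, 1.0, 0.25), (1.0, 3.0, 0.25), (2.0, 2.0, 0.30), (3.0, 0.0, 0.20)]

def mean(g):
    """Expectation of g(X, Y) under the joint pmf."""
    return sum(g(x, y) * p for x, y, p in joint)

ex, ey = mean(lambda x, y: x), mean(lambda x, y: y)
var_x = mean(lambda x, y: (x - ex) ** 2)
var_y = mean(lambda x, y: (y - ey) ** 2)
cov = mean(lambda x, y: (x - ex) * (y - ey))

a = 0.3  # any alpha works; the identity holds for all of them
# Direct definition: Var(W) = E[(W - E W)^2] with W = aX + (1-a)Y.
var_w_direct = mean(lambda x, y: (a * x + (1 - a) * y - (a * ex + (1 - a) * ey)) ** 2)
# Expanded formula from the hint above.
var_w_formula = a**2 * var_x + (1 - a) ** 2 * var_y + 2 * a * (1 - a) * cov

assert abs(var_w_direct - var_w_formula) < 1e-9
```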
 
Euge said:

Show that $\sigma_W^2 = \alpha^2 \sigma_X^2 + (1 - \alpha)^2\sigma_Y^2 + 2\alpha(1 - \alpha)\operatorname{Cov}(X,Y)$. Since $X$ and $Y$ are uncorrelated, what can you say about $\operatorname{Cov}(X,Y)$?

So for two random variables to be "uncorrelated," their covariance has to be 0? That is, $E(XY) - E(X)E(Y) = 0$? I wasn't really sure how to start with this problem. The problem says their means are equal but that the variance of X is larger than that of Y, and I am not sure what to do with the equation for W that they give. When they say to minimize, do we take the derivative? I am pretty lost in my probability class.
 
pinky14 said:
So for 2 random variables to be "uncorrelated," their covariance has to be 0?

That's correct. So then $\sigma_W^2 = \alpha^2 \sigma_X^2 + (1 - \alpha)^2 \sigma_Y^2$. You will need to find $\alpha\in [0,1]$ that minimizes the quadratic function $f(\alpha) := \alpha^2 \sigma_X^2 + (1 - \alpha)^2 \sigma_Y^2$.
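Not part of the original thread, but to sanity-check the calculus: setting $f'(\alpha) = 2\alpha\sigma_X^2 - 2(1-\alpha)\sigma_Y^2 = 0$ gives $\alpha^* = \sigma_Y^2/(\sigma_X^2 + \sigma_Y^2)$. A quick numerical check, with example variances $\sigma_X^2 = 4$, $\sigma_Y^2 = 1$ chosen only for illustration:

```python
# Minimize f(a) = a^2 * var_x + (1 - a)^2 * var_y over a in [0, 1].
# f'(a) = 2a*var_x - 2(1 - a)*var_y = 0  =>  a* = var_y / (var_x + var_y).

def f(a, var_x, var_y):
    """Variance of W = a*X + (1 - a)*Y when Cov(X, Y) = 0."""
    return a**2 * var_x + (1 - a) ** 2 * var_y

var_x, var_y = 4.0, 1.0  # example values with var_x > var_y, as in the problem
a_star = var_y / (var_x + var_y)  # closed-form minimizer from f'(a) = 0

# Grid check: f(a_star) is no larger than f at any sampled point of [0, 1].
grid = [i / 1000 for i in range(1001)]
assert all(f(a_star, var_x, var_y) <= f(a, var_x, var_y) + 1e-12 for a in grid)

print(a_star)                    # → 0.2
print(f(a_star, var_x, var_y))   # ≈ 0.8 = var_x * var_y / (var_x + var_y)
```

Note that $\alpha^*$ weights the noisier measurement $X$ less, and the minimal variance $\sigma_X^2\sigma_Y^2/(\sigma_X^2+\sigma_Y^2)$ is smaller than either $\sigma_X^2$ or $\sigma_Y^2$ alone.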
 
