How to view conditional variance intuitively?

  • Context: Undergrad
  • Thread starter: yamata1
  • Tags: Conditional Variance

Discussion Overview

The discussion revolves around understanding the concept of conditional variance, particularly in the context of a Normalized Gaussian random variable. Participants explore both intuitive and mathematical approaches to derive the conditional variance when the data is divided into positive and negative segments.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant presents a conditional variance result of ##1 - \frac{2}{\pi}## and requests a derivation of this result.
  • Another participant notes that the inquiry for a mathematical derivation contrasts with the title's focus on intuitive understanding of conditional variances.
  • It is suggested that conditional variances need not be larger or smaller than the variance of the original distribution, which is a key aspect of forming intuition about these variances.
  • A participant discusses the behavior of Gaussian variables in lower dimensions, indicating that dividing the data further leads to a decrease in variance, contrary to the expectation that tails should exhibit greater volatility.
  • One participant expresses confusion regarding the derivation and seeks a simpler computation or property to arrive at the conditional variance result.
  • A detailed mathematical derivation is provided for the variance of a random variable with a specified probability density function, leading to the conclusion that the variance is indeed ##1 - \frac{2}{\pi}##.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the best approach to understand or derive the conditional variance. There are multiple viewpoints regarding the balance between intuitive understanding and mathematical derivation.

Contextual Notes

Some participants express uncertainty about the derivation process and the assumptions underlying the calculations, indicating a need for clarity on the properties of conditional variances in this context.

yamata1
We have a sample of X, a normalized Gaussian random variable. We divide the data into positive and negative parts.
Each will have a conditional variance of ##1 - \frac{2}{\pi}##.
Can someone show how to get this result?

I found this problem here (page 3) : https://www.dropbox.com/s/18pjy7gmz0hl6q7/Correlation.pdf?dl=0

Thank you.
 
The title of the thread speaks of viewing something intuitively, but your question
Can someone show how to get this result?
seems to ask for a mathematical derivation.

The relevant passage in the document concerns forming a correct intuition about conditional variances: the correct intuition is that they need not be larger (or smaller) than the variance of the original, unconstrained distribution.

The problem becomes much easier when we consider the behavior in lower dimensions for Gaussian variables. The intuition is as follows. Take a sample of X, a normalized Gaussian random variable. Verify that the variance is 1. Divide the data into positive and negative. Each will have a conditional variance of ##1 - \frac{2}{\pi} \approx 0.363##. Divide the segments further, and there will be an additional drop in variance. And, although one is programmed to think that the tail should be more volatile, it isn't so; the segments in the tail have an increasingly lower variance as one gets further away, see Fig. 4.
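The claim in the quoted passage is easy to check empirically. The following sketch (not from the original thread; it assumes NumPy is available) draws a large standard-normal sample, splits it at zero, and compares each half's sample variance to ##1 - \frac{2}{\pi} \approx 0.363##:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)

# The full sample has variance close to 1.
print(np.var(x))

# Split the data at zero; each half estimates the
# conditional variance Var(X | X > 0) or Var(X | X < 0).
pos = x[x > 0]
neg = x[x < 0]
print(np.var(pos))  # ≈ 0.363
print(np.var(neg))  # ≈ 0.363, by symmetry
print(1 - 2 / np.pi)
```

By symmetry of the standard normal, the positive and negative halves give the same estimate; both sit well below the unconditional variance of 1, matching the passage's point that conditioning on a segment can shrink the variance.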
 
Stephen Tashi said:
The title of the thread speaks of viewing something intuitively but your question

seems to ask for a mathematical derivation.

The relevant passage in the document concerns forming a correct intuition about conditional variances: the correct intuition is that they need not be larger (or smaller) than the variance of the original, unconstrained distribution.
I am failing to understand how to get this answer. Is there a simple computation or property that easily gives this answer? I tried the formula for ##v_x(Q)## without success.
 
Consider the case when the random variable ##X## has a probability density given by ##g(x) = 2 \frac{1}{\sqrt{2 \pi} } e^{-{x^2/2}}## for ##x \ge 0##.

The variance of ##X## is ##\sigma^2_X = \int_0^\infty x^2 g(x)\, dx - \left( \int_0^{\infty} x g(x)\, dx \right)^2##

##\int_0^\infty x^2 g(x) dx = \int_0^\infty x^2 2 \frac{1}{\sqrt{2 \pi}} e^{-{x^2/2}} dx##
## = 2 \int_0^\infty x^2 \frac{1}{\sqrt{2 \pi}} e^{-{x^2/2}} dx ##
##= \int_{-\infty}^{\infty} x^2 \frac{1}{\sqrt{2 \pi}} e^{-{x^2/2}} dx = 1##
since the last integral is the same as computing the variance of a normal distribution that has mean zero and variance 1.

##\int_0^\infty x g(x) dx = \int_0^\infty x (2 \frac{1}{\sqrt{2 \pi}} e^{-{x^2/2}}) dx##
## = ( -2 \frac{1}{\sqrt{2 \pi}} e^{-{x^2/2}}) |_0^\infty ##
##= 0 - ( -2 \frac{1}{\sqrt{2 \pi}})##
## = \frac{\sqrt{2}}{\sqrt{\pi}}##

So ##\sigma^2_X = 1 - (\frac{\sqrt{2}}{\sqrt{\pi}})^2 = 1 - 2/\pi##
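The two integrals in the derivation above can also be evaluated numerically as a sanity check. This sketch (not from the original thread; it assumes SciPy is available) computes the first and second moments of the half-normal density ##g(x) = \frac{2}{\sqrt{2\pi}} e^{-x^2/2}## by quadrature:

```python
from math import exp, inf, pi, sqrt

from scipy.integrate import quad

def g(x):
    # Half-normal density: twice the standard normal density, for x >= 0.
    return 2.0 / sqrt(2.0 * pi) * exp(-x * x / 2.0)

# E[X^2 | X > 0]: should equal 1, matching the full normal's variance.
second_moment, _ = quad(lambda x: x * x * g(x), 0, inf)

# E[X | X > 0]: should equal sqrt(2/pi).
first_moment, _ = quad(lambda x: x * g(x), 0, inf)

var = second_moment - first_moment ** 2
print(var, 1 - 2 / pi)  # both ≈ 0.3634
```

The quadrature reproduces each step of the derivation: the second moment comes out as 1, the first moment as ##\sqrt{2/\pi}##, and their combination as ##1 - 2/\pi##.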
 
