Can a constant have a distribution?

  • Context: Graduate 
  • Thread starter: ivyhawk
  • Tags: Constant, Distribution

Discussion Overview

The discussion revolves around the concept of variance in statistical models, particularly focusing on the differences between constant variance and unknown variance. Participants explore the implications of treating variance as a random variable and its relationship to distributions such as the Inverted Gamma distribution.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • Some participants propose that the variance of the error term v_t can be constant and known, or constant and unknown, necessitating estimation.
  • Others argue that if the variance is treated as a random variable, it can follow various distributions, including chi-square or inverse gamma, which implies it is not a constant.
  • A participant questions the terminology used in their textbook, suggesting that the author may be misusing terms related to variance.
  • There is a discussion about the concept of a "degenerate" random variable, which takes a single value with probability 1, and how this relates to the idea of constant variance.

Areas of Agreement / Disagreement

Participants express differing views on whether constant variance can also be considered a random variable and how it relates to its distribution. The discussion remains unresolved regarding the implications of these concepts.

Contextual Notes

Participants note that the variance could be dependent on time or random, and there are unresolved assumptions about the definitions and implications of constant versus unknown variance.

ivyhawk
Suppose the error term v_t follows Normal(0, V_t), where the variance changes over time.
Now suppose v_t has constant variance, so v_t ~ N(0, V); but if V is unknown, then V ~ IG(n/2, d/2).

What is the difference between the variance of v_t being constant and the variance of v_t being unknown?

Or is there a better way to state my claim?

Thanks
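The three cases in the post above (time-varying variance, constant known variance, and a constant-but-unknown variance given an inverse-gamma distribution) can be sketched in Python. This is a minimal simulation sketch; the parameter values (T, the V_t schedule, V, n, d) are illustrative assumptions, not from any particular textbook:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1000

# Case 1: time-varying variance V_t (an assumed deterministic schedule here)
V_t = 1.0 + 0.5 * np.sin(np.arange(T) / 50.0)
v_timevarying = rng.normal(0.0, np.sqrt(V_t))

# Case 2: constant, known variance V
V = 2.0
v_constant = rng.normal(0.0, np.sqrt(V), size=T)

# Case 3: constant but unknown variance, V ~ IG(n/2, d/2).
# If X ~ Gamma(a, rate=b), then 1/X ~ InverseGamma(a, scale=b),
# so one inverse-gamma draw is the reciprocal of a gamma draw.
n, d = 10.0, 8.0  # illustrative values
V_draw = 1.0 / rng.gamma(shape=n / 2.0, scale=2.0 / d)
v_unknown = rng.normal(0.0, np.sqrt(V_draw), size=T)

# In case 3 the variance is one fixed number within this realisation;
# the IG distribution only describes uncertainty about which number it is.
print(V_draw)
```

Note that in case 3 the series is still homoskedastic: every v_t shares the single drawn value V_draw, unlike case 1 where the variance genuinely changes with t.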
 
ivyhawk said:
Suppose the error term v_t follows Normal(0, V_t), where the variance changes over time.
Now suppose v_t has constant variance, so v_t ~ N(0, V); but if V is unknown, then V ~ IG(n/2, d/2).

What is the difference between the variance of v_t being constant and the variance of v_t being unknown?

Or is there a better way to state my claim?

Thanks

Not sure what you're asking.

1) What is the difference between variance of [tex]v_t[/tex] being constant and being unknown?
Partial answer: There doesn't have to be any difference: the variance could be constant and known, or constant and unknown (and so needing to be estimated). The variance could itself be random, or dependent on time, or both.

2) What do you mean by "IG(n/2,d/2)"?
 
statdad said:
Not sure what you're asking.

1) What is the difference between variance of [tex]v_t[/tex] being constant and being unknown?
Partial answer: There doesn't have to be any difference: the variance could be constant and known, or constant and unknown (and so needing to be estimated). The variance could itself be random, or dependent on time, or both.

2) What do you mean by "IG(n/2,d/2)"?

IG = Inverted Gamma distribution

So constant does not mean that it is not a random variable? The variance here is a random variable that follows a certain distribution, but it is also a constant?
 
You can consider a constant to be a "degenerate" random variable: one that takes a single value with probability 1, though that viewpoint usually isn't needed. My comment about "constant and unknown" meant that, just as you sometimes need to estimate any other unknown parameter (mean, median, etc.), there are times when you need to estimate a variance even though it is constant.

IF you make the assumption that the variance is itself a random variable, you can make any assumption you want about its distribution - chi square, inverse gamma, or any other non-negative distribution.

IF the variance has one of these distributions, it is most definitely NOT a constant.
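The contrast drawn above between a "degenerate" (constant) variance and a genuinely random one can be illustrated numerically. A minimal sketch, with numpy only and arbitrary illustrative parameters a and b for the inverse gamma:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10_000

# A "degenerate" random variable: it takes one value with probability 1,
# so every draw is identical and its sample variance is exactly zero.
degenerate_V = np.full(N, 2.0)

# A variance modelled as random: V ~ InverseGamma(a, scale=b).
# Since 1/V ~ Gamma(a, rate=b), we can draw it with numpy alone.
a, b = 5.0, 8.0  # illustrative shape and scale
V_draws = 1.0 / rng.gamma(shape=a, scale=1.0 / b, size=N)

print(degenerate_V.var())   # exactly 0.0 -> behaves as a constant
print(V_draws.var() > 0)    # draws genuinely vary -> not a constant
print(V_draws.mean())       # near b/(a-1) = 2.0 for these values
```

The point of the sketch: both objects are formally random variables, but only the second has a non-degenerate distribution, which is why a variance with a chi-square or inverse-gamma distribution cannot be a constant.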
 
statdad said:
You can consider a constant to be a "degenerate" random variable: one that takes a single value with probability 1, though that viewpoint usually isn't needed. My comment about "constant and unknown" meant that, just as you sometimes need to estimate any other unknown parameter (mean, median, etc.), there are times when you need to estimate a variance even though it is constant.

IF you make the assumption that the variance is itself a random variable, you can make any assumption you want about its distribution - chi square, inverse gamma, or any other non-negative distribution.

IF the variance has one of these distributions, it is most definitely NOT a constant.

Thanks for the follow-ups.

Maybe the author of the textbook I am reading is slightly abusing the terminology.

I understand most of it now.
 
