# Can a constant have a distribution?

## Main Question or Discussion Point

Suppose the error term v_t follows Normal(0, V_t), where the variance changes over time.
Now suppose instead that the variance of v_t is constant, so v_t ~ N(0, V); but if V is unknown, one can treat it as random, e.g. V ~ IG(n/2, d/2).
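As an illustrative sketch of the two setups (the hyperparameters n and d below are made up, and numpy's gamma sampler is inverted to get inverse-gamma draws, since numpy has no inverse-gamma sampler of its own):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 10_000

# Case 1: time-varying variance V_t (here a deterministic schedule, purely as
# an illustration).
V_t = 1.0 + 0.5 * np.sin(np.linspace(0, 2 * np.pi, T)) ** 2
v_varying = rng.normal(0.0, np.sqrt(V_t))

# Case 2: constant but unknown variance with an Inverse-Gamma(n/2, d/2) prior.
# If X ~ Gamma(shape=a, rate=b) then 1/X ~ InvGamma(a, b); numpy's gamma takes
# a *scale* parameter equal to 1/rate, hence scale = 2/d below.
n, d = 6.0, 8.0                                    # hypothetical hyperparameters
V = 1.0 / rng.gamma(shape=n / 2, scale=2.0 / d)    # one draw of the constant V
v_constant = rng.normal(0.0, np.sqrt(V), size=T)

# Within this dataset the second series has a single, fixed variance; only our
# knowledge of that variance was uncertain before the draw.
print(V, v_constant.var())
```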

What is the difference between the variance of v_t being constant and the variance of v_t being unknown?

Or is there a better way to state my claim?

Thanks

Homework Helper

1) What is the difference between variance of $$v_t$$ being constant and being unknown?
Partial answer: There doesn't have to be any difference: the variance could be constant and known, or constant and unknown (and so needing to be estimated). The variance could itself be random, or dependent on time, or both.

2) What do you mean by "IG(n/2,d/2)"?
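To make the "constant and unknown (and so needing to be estimated)" case concrete, here is a minimal sketch contrasting a point estimate of the constant variance with a conjugate inverse-gamma posterior; the prior hyperparameters n and d are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Data with a constant variance (known here only so we can check the answer).
true_V = 4.0
T = 5_000
v = rng.normal(0.0, np.sqrt(true_V), size=T)

# Point estimate: with a known zero mean, the MLE of V is the mean square.
V_hat = np.mean(v ** 2)

# Bayesian estimate: an Inverse-Gamma(n/2, d/2) prior on V is conjugate for
# N(0, V) data, giving the posterior IG((n + T)/2, (d + sum(v_t^2))/2).
n, d = 2.0, 2.0                          # hypothetical prior hyperparameters
a_post = (n + T) / 2
b_post = (d + np.sum(v ** 2)) / 2
posterior_mean = b_post / (a_post - 1)   # mean of IG(a, b), valid for a > 1

print(V_hat, posterior_mean)             # both close to true_V = 4.0
```

In both routes the variance is a single fixed number being estimated; the prior distribution describes our uncertainty about it, not variation in it.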

IG = Inverted (Inverse) Gamma distribution.

So "constant" does not mean it is not a random variable? The variance here is a random variable that follows a certain distribution, but it is also a constant?

Homework Helper
You can consider a constant to be a "degenerate" random variable - a random variable that takes a single value with probability 1, but that usually isn't needed. My comment about "constant and unknown" was meant to say that just as in a problem where one goal is to estimate any other unknown parameter (mean, median, etc), there are times when you need to estimate a variance, even though it is constant.

IF you make the assumption that the variance is itself a random variable, you can make any assumption you want about its distribution - chi-squared, inverse gamma, or any other distribution on the non-negative reals.

IF the variance has one of these distributions, it is most definitely NOT a constant.
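One way to see this numerically: if a fresh V is drawn from an inverse gamma for every observation, the marginal distribution of v is a scale mixture of normals (a Student-t), whose tails are heavier than any fixed-variance normal. A minimal sketch, with arbitrarily chosen hyperparameters:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200_000

# Constant variance: v ~ N(0, V) with V fixed. A constant can be viewed as a
# degenerate random variable taking one value with probability 1.
V_const = 2.0
v_const = rng.normal(0.0, np.sqrt(V_const), size=N)

# Random variance: a fresh V ~ InvGamma(n/2, d/2) for every draw. Marginally,
# v is then a scaled Student-t with n degrees of freedom.
n, d = 12.0, 12.0                    # hypothetical hyperparameters
V_rand = 1.0 / rng.gamma(shape=n / 2, scale=2.0 / d, size=N)
v_mixed = rng.normal(0.0, np.sqrt(V_rand))

def kurtosis(x):
    """Plain (non-excess) kurtosis E[(x - mean)^4] / Var(x)^2."""
    c = x - x.mean()
    return np.mean(c ** 4) / np.mean(c ** 2) ** 2

print(kurtosis(v_const), kurtosis(v_mixed))  # near 3 vs noticeably above 3
```

The excess kurtosis of the mixture is what distinguishes "V is random" from "V is a fixed but unknown constant": in the latter case the data are still exactly normal.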

Thanks for the follow-ups.

Maybe the author of the textbook I am reading is slightly abusing the terminology.

I get most of it now.