A constant could have a distribution?

  • Thread starter ivyhawk
  • #1

Main Question or Discussion Point

Suppose the error term v_t follows Normal(0, V_t), where the variance changes over time.
Now suppose the variance of v_t is constant, so v_t ~ N(0, V); but if V is unknown, then V ~ IG(n/2, d/2).

What is the difference between the variance of v_t being constant and the variance of v_t being unknown?

Or is there a better way to state my claim?

Thanks
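
To make the two setups concrete, here is a minimal simulation sketch. It assumes Python with numpy/scipy, assumes IG(n/2, d/2) means an inverse gamma with shape n/2 and scale d/2, and the numerical values of n, d and V_t are made up purely for illustration:

[code]
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
T = 200

# Case 1: time-varying variance, v_t ~ N(0, V_t) with V_t changing over time
V_t = 0.5 + 0.01 * np.arange(T)            # an illustrative variance path
v_timevarying = rng.normal(0.0, np.sqrt(V_t))

# Case 2: constant but unknown variance: draw V ~ IG(n/2, d/2) once,
# then use that single V for every v_t
n, d = 4.0, 2.0                            # illustrative hyperparameters
V = stats.invgamma.rvs(a=n / 2, scale=d / 2, random_state=rng)
v_constant = rng.normal(0.0, np.sqrt(V), size=T)

print("one draw of V:", V)
print("sample variance of v_t (constant case):", v_constant.var(ddof=1))
[/code]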
 

Answers and Replies

  • #2
statdad
Homework Helper
Not sure what you're asking.

1) What is the difference between the variance of [tex] v_t [/tex] being constant and being unknown?
Partial answer: there doesn't have to be any difference: the variance could be constant and known, or constant and unknown (and so needing to be estimated - see the sketch below). The variance could also itself be random, or dependent on time, or both.

2) What do you mean by "IG(n/2,d/2)"?
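
For the "constant and unknown (needing to be estimated)" case in 1), a minimal sketch of what that estimation looks like (Python/numpy assumed; the true variance value is made up for illustration):

[code]
import numpy as np

rng = np.random.default_rng(1)

V_true = 2.5       # the variance is constant, but the analyst does not know it
T = 1000
v = rng.normal(0.0, np.sqrt(V_true), size=T)

# With the mean known to be 0, a natural estimator of V is the mean of squares
V_hat = np.mean(v ** 2)
print("true V:", V_true, "estimated V:", V_hat)
[/code]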
 
  • #3
ivyhawk
IG is the Inverted (inverse) Gamma distribution.

So "constant" does not mean that it is not a random variable? The variance here is a random variable that follows a certain distribution, but it is also a constant?
 
  • #4
statdad
Homework Helper
You can consider a constant to be a "degenerate" random variable - a random variable that takes a single value with probability 1 - but that usually isn't needed. My comment about "constant and unknown" was meant to say that, just as in a problem where one goal is to estimate any other unknown parameter (mean, median, etc.), there are times when you need to estimate a variance, even though it is constant.

IF you make the assumption that the variance is itself a random variable, you can make any assumption you want about its distribution - chi square, inverse gamma, or any other non-negative distribution.

IF the variance has one of these distributions, it is most definitely NOT a constant.
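
To connect this back to the IG(n/2, d/2) notation from post #1: if V is given an inverse gamma prior and the v_t are then observed, that prior gets updated rather than replaced by a single number. A minimal sketch, assuming Python with scipy, assuming IG(n/2, d/2) means shape n/2 and scale d/2, and using the standard conjugate update for a zero-mean normal likelihood:

[code]
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Prior: V ~ IG(n/2, d/2), i.e. shape n/2 and scale d/2 (parameterization assumed)
n, d = 4.0, 2.0

# Simulated data with a fixed "true" variance (value made up for illustration)
V_true = 1.5
T = 50
v = rng.normal(0.0, np.sqrt(V_true), size=T)

# Conjugate update for a N(0, V) likelihood with known mean 0:
# posterior V | v_1..v_T ~ IG((n + T)/2, (d + sum(v_t^2))/2)
n_post = n + T
d_post = d + np.sum(v ** 2)
posterior = stats.invgamma(a=n_post / 2, scale=d_post / 2)

print("posterior mean of V:", posterior.mean())   # concentrates near V_true as T grows
[/code]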
 
  • #5
ivyhawk
Thanks for the follow-up.

Maybe in the textbook I am reading, the author is slightly abusing the terminology.

I understand most of it now.
 
