A constant could have a distribution?

  • Context: Graduate
  • Thread starter: ivyhawk
  • Tags: Constant, Distribution
SUMMARY

The discussion centers on the distinction between constant variance and unknown variance for an error term v_t that follows a Normal distribution. When v_t has constant variance, it is written v_t ~ N(0, V); if V is unknown, it can be treated as a random variable with an Inverted Gamma prior, V ~ IG(n/2, d/2). The variance can be constant and known, constant and unknown, or random and dependent on time. The concept of a "degenerate" random variable is introduced, showing that a constant can be viewed as a special case of a random variable.

PREREQUISITES
  • Understanding of Normal distribution and its properties
  • Familiarity with Inverted Gamma distribution (IG)
  • Knowledge of statistical concepts such as variance and random variables
  • Basic grasp of estimation techniques for unknown parameters
NEXT STEPS
  • Study the properties of the Inverted Gamma distribution (IG)
  • Learn about the concept of degenerate random variables in statistics
  • Explore estimation techniques for variance in statistical models
  • Investigate the implications of time-dependent variance in error terms
USEFUL FOR

Statisticians, data analysts, and researchers involved in statistical modeling and variance estimation will benefit from this discussion.

ivyhawk
Suppose the error term v_t follows Normal(0, V_t), where the variance changes over time.
Now suppose v_t has constant variance, so v_t ~ N(0, V); but if V is unknown, then V ~ IG(n/2, d/2).

What is the difference between the variance of v_t being constant and the variance of v_t being unknown?

Or is there a better way to state my claim?

Thanks
 
statdad replied:

Not sure what you're asking.

1) What is the difference between variance of v_t being constant and being unknown?
Partial answer: There doesn't have to be any difference: the variance could be constant and known, or constant and unknown (and so needing to be estimated). The variance could itself be random, or dependent on time, or both.

2) What do you mean by "IG(n/2,d/2)"?
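
The three cases statdad lists, constant and known, constant and unknown (so needing to be estimated), and time-dependent, can be simulated side by side. A minimal sketch in Python with NumPy (my own illustration, not from the thread; the specific variance values are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 10_000

# Case 1: variance constant and KNOWN: v_t ~ N(0, V) with V = 4.
V_known = 4.0
v1 = rng.normal(0.0, np.sqrt(V_known), size=T)

# Case 2: variance constant but UNKNOWN: same model, but V must be
# estimated from the observed v_t (here via the sample variance).
V_hat = v1.var(ddof=1)

# Case 3: variance time-dependent: v_t ~ N(0, V_t) with V_t changing in t.
V_t = 1.0 + 0.5 * np.sin(np.linspace(0.0, 10.0, T)) ** 2
v3 = rng.normal(0.0, np.sqrt(V_t))

print(f"estimate of the unknown constant variance: {V_hat:.2f}")
```

In cases 1 and 2 the data-generating process is identical; only our knowledge of V differs, which is exactly the "constant vs. unknown" distinction.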
 
ivyhawk replied:

IG = Inverted Gamma distribution.

So constant does not mean that it is not a random variable? The variance here is a random variable that follows a certain distribution, but it is also a constant?
 
statdad replied:

You can consider a constant to be a "degenerate" random variable: a random variable that takes a single value with probability 1, though that view usually isn't needed. My comment about "constant and unknown" meant that, just as in a problem where one goal is to estimate any other unknown parameter (mean, median, etc.), there are times when you need to estimate a variance even though it is constant.

IF you assume the variance is itself a random variable, you can make any assumption you want about its distribution: chi-square, inverse gamma, or any other non-negative distribution.

IF the variance has one of these distributions, it is most definitely NOT a constant.
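
To see concretely why a random variance is not a constant one, the hierarchical setup from the opening post can be sketched (assuming SciPy is available; the hyperparameter values n and d below are hypothetical, chosen only for illustration): draw V once from IG(n/2, d/2), then generate v_t ~ N(0, V) conditional on that draw.

```python
import numpy as np
from scipy.stats import invgamma

rng = np.random.default_rng(1)
n, d = 10.0, 20.0  # hypothetical hyperparameters, not from the thread

# One draw of the variance: V ~ IG(n/2, d/2).  SciPy's invgamma has
# density proportional to x^(-a-1) * exp(-scale/x), so IG(n/2, d/2)
# corresponds to shape a = n/2 and scale = d/2.
V = invgamma.rvs(a=n / 2.0, scale=d / 2.0, random_state=rng)

# CONDITIONAL on that single draw, V is fixed and the errors are
# homoskedastic: v_t | V ~ N(0, V), the same variance for every t.
v = rng.normal(0.0, np.sqrt(V), size=5_000)

print(f"drawn V = {V:.3f}, sample variance = {v.var(ddof=1):.3f}")
```

Each run of the draw gives a different V, which is the sense in which the variance is random; within one run, every v_t shares that single value, which is the sense in which it is "constant".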
 
ivyhawk replied:

Thanks for the follow-ups.

Maybe the author of the textbook I am reading is slightly abusing the terminology.

I got most of it.
 
