# A constant could have a distribution?

• ivyhawk
In summary: a constant can be viewed as a degenerate random variable, and an unknown but constant variance can be modeled as a random variable by placing a distribution on it (such as an inverse gamma). If, however, the variance genuinely follows a non-degenerate distribution, it is not a constant.
ivyhawk
Suppose the error term v_t follows Normal(0, V_t), where the variance changes over time.
Now suppose instead that the variance of v_t is constant, so v_t ~ N(0, V). If V is unknown, then V ~ IG(n/2, d/2).

What is the difference between variance of v_t being constant and variance of v_t being unknown?

Or a better way to state my claim?

Thanks

ivyhawk said:
Suppose the error term v_t follows Normal(0, V_t), where the variance changes over time.
Now suppose instead that the variance of v_t is constant, so v_t ~ N(0, V). If V is unknown, then V ~ IG(n/2, d/2).

What is the difference between variance of v_t being constant and variance of v_t being unknown?

Or a better way to state my claim?

Thanks

1) What is the difference between variance of $$v_t$$ being constant and being unknown?
Partial answer: There doesn't have to be any difference: the variance could be constant and known, or constant and unknown (and so needing to be estimated). The variance could itself be random, or dependent on time, or both.

2) What do you mean by "IG(n/2,d/2)"?


IG = inverted (inverse) gamma distribution.
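As a sketch of the resulting hierarchy (hypothetical hyperparameters n and d; SciPy's `invgamma` takes a shape `a` and a `scale`, so IG(n/2, d/2) maps to `a = n/2`, `scale = d/2`):

```python
import numpy as np
from scipy.stats import invgamma, norm

rng = np.random.default_rng(0)

# Hypothetical hyperparameters for the prior V ~ IG(n/2, d/2).
n, d = 10.0, 4.0

# Draw the unknown (but constant) variance once...
V = invgamma.rvs(a=n / 2, scale=d / 2, random_state=rng)

# ...then draw the errors v_t ~ N(0, V) given that variance.
v = norm.rvs(loc=0.0, scale=np.sqrt(V), size=1000, random_state=rng)

print(V, v.var())
```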

So constant does not mean that it is not a random variable? The variance here is a random variable that follows a certain distribution, but it is also a constant?

You can consider a constant to be a "degenerate" random variable - a random variable that takes a single value with probability 1 - but that viewpoint usually isn't needed. My comment about "constant and unknown" was meant to say that, just as in a problem where one goal is to estimate any other unknown parameter (mean, median, etc.), there are times when you need to estimate a variance, even though it is constant.

IF you make the assumption that the variance is itself a random variable, you can make any assumption you want about its distribution - chi square, inverse gamma, or any other non-negative distribution.

IF the variance has one of these distributions, it is most definitely NOT a constant.
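To illustrate that last point numerically (a sketch with made-up parameters, not from the thread): if the variance is a point mass, v_t is plain normal; if a fresh inverse-gamma variance is drawn for each observation, the marginal of v_t is a heavier-tailed scaled Student-t, which shows up as excess kurtosis.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 100_000

# Degenerate ("constant") variance: point mass at V0 -> v_t ~ N(0, V0).
V0 = 1.0
v_fixed = rng.normal(0.0, np.sqrt(V0), size=N)

# Random variance: V_t ~ IG(a, scale), drawn fresh for each t.
# (Inverse-gamma draws via scale / Gamma(a); this implies a Student-t
# marginal with df = 2a = 9 here, so the fourth moment exists.)
a, scale = 4.5, 3.5  # hypothetical shape/scale
V_t = scale / rng.gamma(a, size=N)
v_random = rng.normal(0.0, np.sqrt(V_t))

def excess_kurtosis(x):
    z = (x - x.mean()) / x.std()
    return (z**4).mean() - 3.0

print(excess_kurtosis(v_fixed), excess_kurtosis(v_random))
```

The fixed-variance sample's excess kurtosis sits near zero, while the random-variance sample's is clearly positive.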


Thanks for the follow-ups.

Maybe in the textbook I am reading, the author is slightly abusing the terminology.

I understand most of it now.

## 1. What is "A constant could have a distribution"?

"A constant could have a distribution" refers to treating a fixed but unknown quantity (such as a constant variance) as a random variable with a probability distribution. The distribution describes our uncertainty about the value, not actual variation of the value itself.

## 2. How is a constant different from a normally distributed variable?

A constant corresponds to a degenerate (point-mass) distribution: it takes a single value with probability 1 and has zero variance. A normally distributed variable, by contrast, spreads probability over a range of values in a bell-shaped curve, with probability decreasing as values move further from the mean.

## 3. What is an example of a constant viewed as a random variable?

A "random variable" that equals 5 with probability 1 is a degenerate random variable: formally a random variable, but in practice a constant. In the thread's setting, a known constant variance V plays this role, whereas an unknown V might instead be given an inverse gamma distribution.

## 4. How is this idea useful in scientific research?

Treating an unknown constant as a random variable is the basis of Bayesian inference: a prior distribution (for example, an inverse gamma prior on a variance) encodes uncertainty about a fixed parameter, and that distribution is updated as data arrive.
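A minimal sketch of that updating step, using the standard conjugate result for a zero-mean normal likelihood with an inverse-gamma prior on the variance (all hyperparameters here are made up):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical prior: V ~ IG(a0, b0) (shape, scale).
a0, b0 = 2.0, 2.0

# Observed data, assumed v_t ~ N(0, V) with the true V fixed but unknown.
true_V = 4.0
v = rng.normal(0.0, np.sqrt(true_V), size=500)

# Conjugate update for a zero-mean normal likelihood:
# posterior is IG(a0 + n/2, b0 + sum(v^2)/2).
n = v.size
a_post = a0 + n / 2
b_post = b0 + np.sum(v**2) / 2

# Posterior mean of V is b/(a-1); with enough data it should be near true_V.
post_mean = b_post / (a_post - 1)
print(post_mean)
```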

## 5. Can every parameter be treated this way?

Any fixed quantity, discrete or continuous, can be viewed as a degenerate random variable, and any unknown parameter can be assigned a prior distribution. But once a parameter is given a non-degenerate distribution - for example, a time-varying variance V_t - it is no longer a constant.
