Can I omit using an indicator function when estimating an MLE?

In summary, the MLE for the Uniform distribution can't be found by taking derivatives, because the PDF is flat and the range of the PDF depends on the actual parameter. To find the maximum likelihood estimator for the Uniform distribution, you need to use either the indicator function or the maximum order statistic.
  • #1
cdux
When looking for a maximum likelihood estimator for the Uniform distribution, I noticed that a common method is to use an indicator function. My understanding is that the reason for it is to keep track of the region of ℝ over which x has non-zero probability.

If I derive the maximum likelihood estimator without involving an indicator function in the math, and only mention the boundary constraints of the result at the end, is that correct?

I'd be happy if I could (correctly) avoid it, because I find the indicator function method very unintuitive.
 
  • #2
Hey cdux.

The uniform distribution's MLE can't be found with derivatives, because the PDF is flat (so the likelihood has no interior stationary point) and because the range of the PDF depends on the actual parameter (unlike the Normal distribution and others, where it doesn't).

So for this particular case, you need to use either the indicator function or the order statistic to derive the estimator for the parameter of the uniform distribution.

If you are using the order statistic, then make sure you check the bias (if any exists). Also note that when I say order statistic I mean the maximum order statistic X(n), where n is the sample size and X(n) is the nth (i.e. largest) order statistic.
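
To make that concrete, here is a minimal sketch, assuming the Uniform(0, θ) case under discussion. With the indicator written out, the likelihood is

$$L(\theta) = \prod_{i=1}^{n} \frac{1}{\theta}\,\mathbf{1}\{0 \le x_i \le \theta\} = \theta^{-n}\,\mathbf{1}\{\theta \ge x_{(n)}\},$$

which is zero for θ < x(n) and strictly decreasing for θ ≥ x(n), so it is maximized at the maximum order statistic, i.e. θ̂ = X(n) = max{X1, ..., Xn}. As for the bias: E[X(n)] = nθ/(n + 1), so X(n) underestimates θ on average, and ((n + 1)/n) X(n) is the bias-corrected version.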
 
  • #3
I've found both types of solutions, and I have a hard time understanding the difference between the "easy" one (the order statistic one) and the one with the indicator function. In both cases they seem to conclude with the same thing: since b >= x for every observation by our initial assumptions ('a' was zero in that case, but it applies to both), it stands to reason that the likelihood, which is a decreasing function of b, must be maximized by taking b = max{X1, X2, ..., Xn}.

Because that's what I was already doing: evaluating l'(b), seeing that it is negative so L(b) is a decreasing function, and then concluding that b should be made as small as possible to maximize L(b), which is our goal.
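
For reference, that reasoning written out (taking a = 0 and b as the upper endpoint):

$$l(b) = -n\log b \quad \text{for } b \ge \max\{x_1, \dots, x_n\}, \qquad l'(b) = -\frac{n}{b} < 0,$$

so the likelihood is decreasing in b and is maximized at the smallest value of b the constraint allows, namely b̂ = max{X1, ..., Xn}.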

Unless... the solutions with indicator functions I've been reading were redundant and "talked too much", so to speak. Because if you are going to spell out the ordering of the random variables anyway, why use an indicator function to begin with?
 
  • #4
The indicator formulation is something that can be maximized in the context of the MLE (which is all about maximizing some quantity).

The order statistic, on the other hand, is an intuitive method that uses the nature of the distribution and the fact that the largest observation is a good estimate of the actual parameter.

So in short, the MLE looks at maximizing something in general, while the order statistic is an intuitive estimator based on the specific nature of the uniform distribution.

Whichever estimator works well is the one you use, and there are always multiple estimators for a single parameter (MLE, moment estimator, non-parametric, etc.).
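
As a quick illustration of that last point, here is a minimal Python sketch (my own example, not from the thread) comparing the MLE max{Xi}, its bias-corrected version, and the method-of-moments estimator 2·(sample mean) for Uniform(0, θ); the true θ, the sample size, and the number of replications are just values assumed for the simulation:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 5.0        # true parameter (assumed for the simulation)
n = 20             # sample size
reps = 100_000     # number of simulated samples

# Draw `reps` samples of size n from Uniform(0, theta)
x = rng.uniform(0.0, theta, size=(reps, n))

mle = x.max(axis=1)                  # MLE: maximum order statistic X_(n)
mle_corrected = (n + 1) / n * mle    # bias-corrected MLE
moment = 2.0 * x.mean(axis=1)        # method-of-moments estimator: 2 * sample mean

for name, est in [("MLE X_(n)", mle),
                  ("(n+1)/n * X_(n)", mle_corrected),
                  ("2 * sample mean", moment)]:
    print(f"{name:18s} mean = {est.mean():.4f}   MSE = {np.mean((est - theta) ** 2):.5f}")
```

Since E[X(n)] = nθ/(n + 1), the raw MLE comes out biased low; the moment estimator is unbiased but has a larger variance, which is exactly the kind of trade-off you weigh when choosing among the estimators listed above.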
 
  • #5


I understand your concern about using an indicator function when estimating the maximum likelihood for the Uniform distribution. However, it is important to note that the indicator function serves a specific purpose: it records the region of ℝ over which x has non-zero probability, i.e. the constraint that every observation must fall inside the support determined by the parameter.

While it may seem unintuitive to you, the indicator function (or an equivalent explicit statement of the support constraint) is needed to estimate the maximum likelihood for the Uniform distribution correctly. Simply mentioning the boundaries at the end, without carrying them through the math, may lead to incorrect results.

In science, it is important to follow established methods and techniques in order to ensure accurate and reliable results. While it may not align with your personal style, it is important to recognize the importance of the indicator function in this process and to use it correctly in order to obtain accurate estimates of the maximum likelihood for the Uniform distribution. I would recommend exploring the concept further and seeking guidance from a mentor or colleague if you are still unsure about its use.
 

1. What is an indicator function?

An indicator function is a mathematical function that takes the value 1 if a specified condition is met and 0 otherwise. It is commonly used in statistics to represent binary or categorical variables, or, as here, whether a value lies in a given set.
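
For example, the indicator of the interval [a, b] is

$$\mathbf{1}_{[a,b]}(x) = \begin{cases} 1, & a \le x \le b, \\ 0, & \text{otherwise,} \end{cases}$$

which is exactly the object that appears in the likelihood of a Uniform(a, b) sample.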

2. Why is an indicator function used when estimating MLE?

MLE (maximum likelihood estimation) involves finding the parameter values that maximize the likelihood of the observed data under a particular model. The indicator function sets the likelihood to zero for parameter values that are incompatible with the data (for example, parameter values whose support does not contain every observation), so the support constraint is handled correctly by the maximization.
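
Written out for a density f(x; θ) whose support S(θ) depends on the parameter, the likelihood becomes

$$L(\theta) = \prod_{i=1}^{n} f(x_i; \theta)\,\mathbf{1}\{x_i \in S(\theta)\},$$

so L(θ) = 0 for any parameter value whose support fails to contain every observation, and the constraint is enforced automatically by the maximization.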

3. Can I omit using an indicator function when estimating MLE?

In most cases, it is not recommended to omit the indicator function when estimating an MLE, because the indicator is what keeps the likelihood consistent with the support of the model; dropping it, without stating the constraint in some other way, can produce unreliable parameter estimates.

4. Are there any situations where an indicator function can be omitted when estimating MLE?

There may be cases where an indicator function is not strictly necessary when estimating an MLE, for instance when the support of the distribution does not depend on the parameter, or when the constraint it encodes is stated explicitly in some other way. However, it is important to consider the role of the support constraint in each specific scenario.

5. What are the potential consequences of omitting an indicator function when estimating MLE?

Omitting the support constraint when estimating an MLE can lead to biased or outright incorrect parameter estimates; in the uniform case, for example, the unconstrained likelihood is monotone in the parameter, so there is nothing left to pin the estimate down. This can result in incorrect conclusions and potentially affect the overall validity of the model and its predictions.
