Minimum-variance bound for the extended maximum likelihood estimation

In summary, the thread discusses fitting a mass spectrum to determine the yields of signal and background events. The reported uncertainty on the number of signal events is smaller than expected, and the discussion asks whether this is possible when the events carry no weights.
  • #1
xxww
TL;DR Summary
Is there any limit on the variance of the estimated yield in extended maximum likelihood estimation, analogous to the minimum-variance bound for the parameters in ordinary maximum likelihood?
I am fitting a mass spectrum with an extended maximum likelihood fit using pdf(M) = Ns×S(M) + Nb×B(M; a, b) to determine the yields, where Ns and Nb are the numbers of signal and background events, S(M) is the signal shape, and B(M; a, b) is the background shape with parameters a and b.
However, the uncertainty on Ns in the fit result is smaller than sqrt(Ns). Is it possible for the uncertainty to be smaller than that obtained from simple counting? Is there an expectation like the minimum-variance bound for Ns?

Thanks in advance.
 
  • #2
xxww said:
However, the uncertainty for the Ns in the fit result is smaller than sqrt(Ns).
Are there any weights on the events? If not, that's a strange result.
 

1. What is the minimum-variance bound for extended maximum likelihood estimation?

The minimum-variance bound for extended maximum likelihood estimation is a theoretical lower limit on the variance of parameter estimates obtained through this method. It represents the smallest variance that any unbiased estimator of the parameters can achieve.
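As a concrete illustration of where such a limit comes from, consider the simplest case behind the question above: a pure counting experiment. The estimated yield is the observed count, and its spread is sqrt(N). A minimal stdlib-Python sketch (the sampler and numbers here are illustrative, not from the thread):

```python
import math
import random
import statistics

random.seed(42)

def poisson_sample(lam, rng):
    # Knuth's algorithm: multiply uniforms until the product drops below e^-lam
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

true_n = 100.0
counts = [poisson_sample(true_n, random) for _ in range(10000)]
sigma = statistics.stdev(counts)
print(sigma, math.sqrt(true_n))  # the spread of the counted yield is ~ sqrt(N)
```

This is the baseline the original poster compares against: an unweighted fit that floats the total yield should not report an uncertainty on it below this counting spread.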

2. How is the minimum-variance bound calculated?

The minimum-variance bound is calculated using the Cramér-Rao inequality, which relates the variance of an estimator to the Fisher information, a measure of the amount of information contained in a sample about the parameters being estimated.
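For the simple special case of a single Poisson-distributed yield N (a sketch, not the full multi-parameter fit from the thread), the Fisher information is 1/N, so the Cramér-Rao bound is Var(N̂) ≥ N, i.e. σ(N̂) ≥ sqrt(N). This can be checked numerically with the standard library:

```python
import math

def poisson_log_pmf(k, lam):
    # log P(k; lam), computed in log space to avoid overflow at large k
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def fisher_information(lam, kmax=400):
    # I(lam) = E[(d/dlam log P)^2]; the score for a Poisson is k/lam - 1
    return sum(
        (k / lam - 1.0) ** 2 * math.exp(poisson_log_pmf(k, lam))
        for k in range(kmax)
    )

lam = 25.0
bound = 1.0 / fisher_information(lam)  # Cramér-Rao lower bound on Var(lam_hat)
print(bound)  # equals lam, so the bound on sigma(N_hat) is sqrt(N)
```

With floating background parameters the information on Ns can only decrease, so the bound on σ(Ns) generally becomes larger than this single-parameter value, not smaller.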

3. What is the significance of the minimum-variance bound in statistical inference?

The minimum-variance bound is an important concept in statistical inference because it provides a benchmark for evaluating the performance of different estimation methods. If an estimator achieves a variance close to the minimum-variance bound, it can be considered efficient.

4. Can the minimum-variance bound be exceeded in extended maximum likelihood estimation?

No, the minimum-variance bound cannot be exceeded in extended maximum likelihood estimation. No unbiased estimator can achieve a smaller variance than this bound, regardless of the sample size. This is why a reported uncertainty on an unweighted fitted yield below sqrt(N) warrants scrutiny.

5. How does the minimum-variance bound relate to other performance measures in estimation?

The minimum-variance bound is closely related to other performance measures such as bias and efficiency. An unbiased estimator that attains the minimum-variance bound is called efficient: it has the smallest possible variance among all unbiased estimators.
