Can You Estimate Parameters for a Non-Closed Form Probability Distribution?

  • Context: Graduate
  • Thread starter: jimmy1
  • Tags: Estimation, Parameter
Discussion Overview

The discussion centers on estimating parameters for a non-closed form probability distribution represented as a sum involving a function f. Participants explore methods for parameter estimation, particularly focusing on maximum likelihood estimation (MLE) and the challenges associated with differentiating the function with respect to one of the parameters.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant inquires about methods to estimate parameters {n, x, y} from a probability distribution without a closed form, mentioning difficulties with MLE due to the derivative with respect to n.
  • Another participant questions whether the function f is indexed by i, suggesting that if it is not, the expression could be differentiated with respect to n, and if it is indexed, Leibniz's Rule might be applicable.
  • A participant expresses uncertainty about applying Leibniz's Rule and proposes an alternative approach of estimating MLE for x and y across a range of n values (1 to 50), seeking a method to determine the best estimate among these.
  • One participant suggests selecting the estimate with the largest log likelihood as a straightforward method for determining the best parameter set.

Areas of Agreement / Disagreement

Participants have not reached a consensus on the best method for parameter estimation, and multiple approaches are being discussed, indicating ongoing debate and exploration of the topic.

Contextual Notes

The discussion involves assumptions about the differentiability of the function f and the implications of indexing, which remain unresolved. The applicability of Leibniz's Rule and the effectiveness of the proposed estimation methods are also uncertain.

jimmy1
I have a probability distribution of the form [tex]\sum_{i=0}^n f(n,x,y)[/tex]. There is no closed form expression for it. I need to know if there is any method that I can use to estimate the parameters {n, x, y} given some data from the above distribution.
I've tried a maximum likelihood approach, but I'm having trouble getting the derivative with respect to n. Is it possible to get this derivative and use a maximum likelihood approach to estimate n?
 
Is your f indexed by i? If not, then you have (n+1)f(n,x,y), which is differentiable with respect to n as long as f is.
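A quick sketch of that non-indexed case: the product rule gives, for f differentiable in n,

[tex]\frac{d}{dn}\Big[(n+1)\,f(n,x,y)\Big] = f(n,x,y) + (n+1)\,\frac{\partial f}{\partial n}(n,x,y)[/tex]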

If f is indexed by i, then you might think of the sum as an integral and may be able to apply Leibniz's Rule (see under "Alternate form": http://en.wikipedia.org/wiki/Leibniz's_rule).
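To make that concrete (as a heuristic only, since it treats the discrete sum as a continuous integral): with a constant lower limit and upper limit b(n) = n, the alternate form of Leibniz's rule reduces to

[tex]\frac{d}{dn}\int_{0}^{n} f(i,n,x,y)\,di = f(n,n,x,y) + \int_{0}^{n}\frac{\partial f}{\partial n}(i,n,x,y)\,di[/tex]

where the boundary term from the lower limit vanishes because the lower limit does not depend on n.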
 
Thanks for the reply. I had a look at that Leibniz's Rule link, but I'm not fully sure how to go about using it.

Anyway, I was thinking of a slightly simpler idea. I basically need an estimate of the 3 parameters {n, x, y}, preferably using MLE. Since it's difficult to get the derivative w.r.t. n, I was thinking of trying various values of n (say n = 1, ..., 50), and for each value of n estimating the MLE of x, y.

So basically, I now end up with 50 different estimates for {n, x, y}. So my question is: is there any mathematical way to tell which one of these 50 estimates is the best one? I.e., is there some sort of likelihood test I could use?
 
I'd just look at the (log) likelihood numbers and select the largest.
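That grid-over-n idea (a profile likelihood) can be sketched as follows. Since the thread never specifies f, this example uses a Binomial(n, p) model as a hypothetical stand-in, with p playing the role of x; for the binomial, the inner MLE at fixed n has a closed form, p̂ = mean(data)/n, so only the outer loop over n is needed.

```python
import math

def binom_logpmf(k, n, p):
    # log P(K = k) for one Binomial(n, p) observation;
    # -inf signals an impossible observation or invalid p
    if k > n or p <= 0.0 or p >= 1.0:
        return float("-inf")
    return (math.log(math.comb(n, k))
            + k * math.log(p) + (n - k) * math.log(1.0 - p))

def fit_over_n(data, n_values):
    """For each candidate n, fit p by MLE (closed form for the
    binomial: p_hat = mean(data) / n), then keep the (n, p_hat)
    pair with the largest total log-likelihood."""
    best = None
    for n in n_values:
        p_hat = sum(data) / (n * len(data))
        ll = sum(binom_logpmf(k, n, p_hat) for k in data)
        if best is None or ll > best[2]:
            best = (n, p_hat, ll)
    return best

data = [7, 5, 8, 6, 7, 9, 6, 8]  # toy sample
n_hat, p_hat, ll = fit_over_n(data, range(max(data), 51))
```

Comparing raw log-likelihoods across the grid is fair here because every candidate model has the same number of free parameters; if the candidates differed in dimension, a penalized criterion such as AIC or BIC would be the safer comparison.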
 
