Maximum Likelihood Estimation Formula Simplification and Solution for p?

SUMMARY

The discussion centers on the simplification and solution of the Maximum Likelihood Estimation (MLE) formula for the parameter \( p \). The participants analyze two complex MLE formulas involving trigonometric functions and powers of \( p \). The primary goal is to derive \( p \) from observed values \( k_i \) to maximize the likelihood of the observations. The conversation highlights the necessity of a probability distribution and specific sample data to compute the MLE effectively.

PREREQUISITES
  • Understanding of Maximum Likelihood Estimation (MLE)
  • Familiarity with probability distributions and their parameters
  • Knowledge of trigonometric functions and their properties
  • Ability to perform calculus operations such as differentiation
NEXT STEPS
  • Study the derivation of Maximum Likelihood Estimation for different probability distributions
  • Learn about the role of sample data in estimating parameters
  • Explore simplification techniques for complex mathematical formulas
  • Investigate the application of MLE in real-world statistical problems
USEFUL FOR

Statisticians, data scientists, and researchers involved in statistical modeling and parameter estimation using Maximum Likelihood methods.

SheepLon
Hey guys!

My mother language is not English by the way. Sorry for spelling and gramme. :)

I'm curious to see if you can help me with my problem. I have already tried for almost a week and have not found a solution. I also know that maximum likelihood estimation belongs to statistics and probability theory, but since my question is about transforming a formula, I posted it in the analysis forum.

The maximum likelihood estimation formula is the following:

$L(\theta = p) = \prod\limits_{i=1}^N \sum\limits_{m = 1}^{r-1}\left[ \frac{1}{r} \cdot 2^{k_i} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{k_i-1} \cdot \sin{\left(\frac{am \pi}{r}\right)} \cdot p^{\frac{k_i-a}{2}} \cdot (1-p)^{\frac{k_i+a}{2}}+ \frac{1}{r} \cdot 2^{k_i} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{k_i-1} \cdot \sin{\left(\frac{bm \pi}{r}\right)} \cdot p^{\frac{k_i+b}{2}} \cdot (1-p)^{\frac{k_i-b}{2}} \right]$

$= \prod\limits_i \sum\limits_{m = 1}^{r-1}\left[ \frac{1}{r} \cdot 2^{k_i} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{k_i-1} \cdot \left(\sin{\left(\frac{am \pi}{r}\right)} \cdot p^{\frac{k_i-a}{2}} \cdot (1-p)^{\frac{k_i+a}{2}} + \sin{\left(\frac{bm \pi}{r}\right)} \cdot p^{\frac{k_i+b}{2}} \cdot (1-p)^{\frac{k_i-b}{2}} \right)\right]$

where $p \in [0,1]$, $0^0:=1$, and $a,b \in \mathbb{N}\backslash\{0\}$ with $a+b = r$. I'm looking forward to your ideas.
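Before attacking the algebra, it can help to be able to evaluate $L(p)$ numerically and see where the maximum sits. Below is a minimal sketch; the sample values $k_i$ and the parameters $r$, $a$, $b$ are hypothetical placeholders, not values from the thread, and the code works with $\log L$ so the product over $i$ becomes a sum and does not underflow.

```python
import numpy as np

def P_k(k, p, r, a, b):
    """Inner sum over m = 1..r-1 for one observation k (the bracket in the
    likelihood above). Assumes a + b = r and p in (0, 1); Python evaluates
    0.0**0 as 1.0, matching the convention 0^0 := 1."""
    m = np.arange(1, r)
    base = (2.0**k / r) * np.sin(m * np.pi / r) * np.cos(m * np.pi / r) ** (k - 1)
    term_a = np.sin(a * m * np.pi / r) * p ** ((k - a) / 2) * (1 - p) ** ((k + a) / 2)
    term_b = np.sin(b * m * np.pi / r) * p ** ((k + b) / 2) * (1 - p) ** ((k - b) / 2)
    return float(np.sum(base * (term_a + term_b)))

def log_likelihood(p, ks, r, a, b):
    """log L(p) = sum_i log P_{k_i}."""
    return sum(np.log(P_k(k, p, r, a, b)) for k in ks)

# Hypothetical sample and parameters, chosen only for illustration.
ks, r, a, b = [3, 5, 5, 7], 4, 1, 3
grid = np.linspace(0.01, 0.99, 99)
p_hat = grid[np.argmax([log_likelihood(p, ks, r, a, b) for p in grid])]
print(p_hat)
```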

"Hints":
- I first factor out $p^{\frac{k_i}{2}} \cdot (1-p)^{\frac{k_i}{2}}$.
That gave me at least one factor free of $m$, which I was able to pull out of the sum. However, I was not able to pull out the remaining powers of $p$.
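Making that hint explicit: since the exponents split as $\frac{k_i \mp a}{2} = \frac{k_i}{2} \mp \frac{a}{2}$, each power of $p$ factors into a common part and an $m$-free remainder,

$p^{\frac{k_i-a}{2}} \cdot (1-p)^{\frac{k_i+a}{2}} = \left[p(1-p)\right]^{\frac{k_i}{2}} \left(\frac{1-p}{p}\right)^{\frac{a}{2}}, \qquad p^{\frac{k_i+b}{2}} \cdot (1-p)^{\frac{k_i-b}{2}} = \left[p(1-p)\right]^{\frac{k_i}{2}} \left(\frac{p}{1-p}\right)^{\frac{b}{2}},$

so with $q := \sqrt{\frac{1-p}{p}}$, and writing $S_a(k) := \sum\limits_{m=1}^{r-1} \sin{\left(\frac{m \pi}{r}\right)} \cos{\left(\frac{m \pi}{r}\right)}^{k-1} \sin{\left(\frac{am \pi}{r}\right)}$ (and $S_b(k)$ analogously with $b$), the $i$-th factor of the likelihood becomes

$\left[p(1-p)\right]^{\frac{k_i}{2}} \cdot \frac{2^{k_i}}{r} \cdot \left( S_a(k_i)\, q^{a} + S_b(k_i)\, q^{-b} \right),$

where $S_a(k_i)$ and $S_b(k_i)$ are free of $p$ and can be computed once from the data. All of $p$'s dependence is then carried by the front factor and by $q$.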

**********************************************************************

A second, very similar maximum likelihood estimation I need is the following:

$L(\theta = p) = \prod\limits_{i=1}^N \left[\sum\limits_{m = 1}^{r-1} \frac{1}{r} \cdot 2^{k_i} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{k_i-1} \cdot \sin{\left(\frac{am \pi}{r}\right)} \cdot p^{\frac{k_i-a}{2}} \cdot (1-p)^{\frac{k_i+a}{2}} + \sum\limits_{m = 1}^{r-1} \frac{1}{r} \cdot 2^{h_i} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{h_i-1} \cdot \sin{\left(\frac{bm \pi}{r}\right)} \cdot p^{\frac{h_i+b}{2}} \cdot (1-p)^{\frac{h_i-b}{2}} \right]$

where $p \in [0,1]$, $0^0:=1$, and $a,b \in \mathbb{N}\backslash\{0\}$ with $a+b = r$.
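Under the same hypothetical conventions as the sketch above, the per-observation probability of this second model could be evaluated as below (the $a$-sum indexed by $k_i$ and the $b$-sum by $h_i$; treating each observation as a pair $(k_i, h_i)$ is an assumption, since the thread does not spell this out):

```python
def P_kh(k, h, p, r, a, b):
    """Per-observation probability in the second model: the a-term is
    summed with exponent k, the b-term with exponent h. Same assumptions
    as P_k above: a + b = r and p in (0, 1)."""
    m = np.arange(1, r)
    sum_a = np.sum((2.0**k / r) * np.sin(m * np.pi / r)
                   * np.cos(m * np.pi / r) ** (k - 1)
                   * np.sin(a * m * np.pi / r)
                   * p ** ((k - a) / 2) * (1 - p) ** ((k + a) / 2))
    sum_b = np.sum((2.0**h / r) * np.sin(m * np.pi / r)
                   * np.cos(m * np.pi / r) ** (h - 1)
                   * np.sin(b * m * np.pi / r)
                   * p ** ((h + b) / 2) * (1 - p) ** ((h - b) / 2))
    return float(sum_a + sum_b)
```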
 
What, exactly, is the question? To calculate a "maximum likelihood estimate", we have to be given a probability distribution depending on some parameter as well as a specific sample from that distribution. The "maximum likelihood estimate" of the parameter is the value that makes that specific sample most likely. I see neither of those here. Instead, you give a "formula" depending on a number of quantities with no explanation of what they are or what they mean.
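As a concrete illustration of that requirement (the standard textbook case, not from this thread): for $n$ independent Bernoulli trials with $s$ observed successes, the likelihood is $L(p) = p^s (1-p)^{n-s}$, and setting $\frac{d}{dp} \log L(p) = \frac{s}{p} - \frac{n-s}{1-p} = 0$ yields $\hat{p} = \frac{s}{n}$. Distribution plus sample, then maximize: that is the pattern being asked for.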

"My mother language is not English by the way. Sorry for spelling and gramme."
Actually your only misspelling is of "grammar"!
 
Thank you for the quick response. I thought the parameter for the ML estimator could be read off directly. I would like to determine $p$ from observed values $k_i$ so that the observation is as plausible as possible. I have already set up the ML estimator. The probability distribution for the first example is the following formula:

$P_k = \mathbb{P}\left(S_k = \{0,r\}\right) = \sum\limits_{m = 1}^{r-1}\left[ \frac{1}{r} \cdot 2^{k} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{k-1} \cdot \sin{\left(\frac{am \pi}{r}\right)} \cdot p^{\frac{k-a}{2}} \cdot (1-p)^{\frac{k+a}{2}}+ \frac{1}{r} \cdot 2^{k} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{k-1} \cdot \sin{\left(\frac{bm \pi}{r}\right)} \cdot p^{\frac{k+b}{2}} \cdot (1-p)^{\frac{k-b}{2}} \right]$

$= \sum\limits_{m = 1}^{r-1}\left[ \frac{1}{r} \cdot 2^{k} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{k-1} \cdot \left(\sin{\left(\frac{am \pi}{r}\right)} \cdot p^{\frac{k-a}{2}} \cdot (1-p)^{\frac{k+a}{2}}+ \sin{\left(\frac{bm \pi}{r}\right)} \cdot p^{\frac{k+b}{2}} \cdot (1-p)^{\frac{k-b}{2}} \right)\right]$

where $p \in [0,1]$, $0^0:=1$, $a,b \in \mathbb{N}\backslash\{0\}$ with $a+b = r$, and $k \in \mathbb{N}\backslash\{0\}$. The sum from $k=1$ to $\infty$ converges to $1$.
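The convergence claim can be probed numerically. A quick sketch, reusing the `P_k` function and the hypothetical parameters from above; the tail is truncated since $\left|\cos{\left(\frac{m \pi}{r}\right)}\right|^{k-1}$ decays geometrically in $k$:

```python
# Sanity check of "the sum from k = 1 to infinity converges to 1":
# truncate at k = 200; the neglected tail is geometrically small.
r, a, b, p = 4, 1, 3, 0.3   # hypothetical values, as before
total = sum(P_k(k, p, r, a, b) for k in range(1, 201))
print(total)  # should print a value close to 1.0
```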

Does that make any sense to you? Do you know what I mean? : )
 
Is that too hard, or does someone have an idea? :)
 
You still haven't given us a complete problem from which we can compute an estimate. In addition to a probability distribution containing a parameter, we must be given a specific sample. The "maximum likelihood estimate" of the parameter is then the value that makes that sample "most likely", i.e. gives it the highest probability.
 
Sorry, but I don't see what further information is needed. In my opinion, we have everything we need to do an ML estimation for $p$. The solution should depend on the sample I get for all my $k$'s and on the values $a$ and $b$, which are fixed.

So you don't need more information. I have the ML equation, which needs to be simplified, differentiated, and solved for $p$.
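If the analytic simplification stays out of reach, the one-dimensional maximization itself is routine numerically. A sketch reusing `log_likelihood` from above, with the same hypothetical sample; the closed-form derivation the thread asks for is not attempted here:

```python
from scipy.optimize import minimize_scalar

# Maximise log L(p) on (0, 1) by minimising its negative.
ks, r, a, b = [3, 5, 5, 7], 4, 1, 3
res = minimize_scalar(lambda p: -log_likelihood(p, ks, r, a, b),
                      bounds=(1e-6, 1 - 1e-6), method='bounded')
print(res.x)  # numerical maximum likelihood estimate of p
```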
 
