Finding MLE for $\theta$ of Statistical Product Model

In summary, the maximum likelihood estimator for the parameter $\theta$ derived below is $\hat{\theta} = \frac{n}{\sum_{i=1}^n x_i}$, the reciprocal of the sample mean.
  • #1
mathmari
Hey! :giggle:

For $n \in \mathbb{N}$ we consider the discrete statistical product model $(X ,(P_{\theta})_{\theta \in\Theta})$ with $X = \mathbb{N}^n$, $\Theta = (0, 1)$ and $p_{\theta}(x_i) = \theta(1 -\theta)^{x_i−1}$
for all $x_i \in \mathbb{N}, \theta \in \Theta$. So $n$ independent, identical experiments are carried out, the outcomes of which are modeled by independent, geometrically distributed random variables with an unknown probability of success $\theta$.
(a) Give the corresponding likelihood function.
(b) Determine an estimator for the parameter $\theta$ using the maximum likelihood method (for samples $x$ that do not consist only of ones). Note intermediate steps.
(c) You observe the following sample $x$: $$4 \ \ \ \ \ 2 \ \ \ \ \ 7 \ \ \ \ \ 4 \ \ \ \ \ 3 \ \ \ \ \ 1 \ \ \ \ \ 8 \ \ \ \ \ 2 \ \ \ \ \ 4 \ \ \ \ \ 5$$ Give the concrete estimated value for $\theta$ for $x$ using the estimator of part (b).

I have done the following:

(a) The likelihood function is $$L_x(\theta)=\prod_{i\in \mathbb{N}}p_{\theta}(x_i)$$ or not? Can we calculate that further, or do we leave it as it is?

(b) We have to calculate the supremum of $L_x(\theta)$ with respect to $\theta$, right?

:unsure:
 
  • #2
Hi mathmari,

Nice job so far. Here are a few ideas to keep things moving along.

(a) The product should be for $1\leq i\leq n$. Using this fact and the formula for $p_{\theta}(x_{i})$, we can take your work a step further to obtain a closed-form expression for the likelihood.

(b) You will need to calculate the value of $\theta$ that maximizes the likelihood function from part (a), given the data $x_{i}$. This means the value of $\theta$ you calculate will depend on (i.e., be a function of) $x_{i}$. This can be done by differentiating with respect to $\theta$, setting this equal to zero and solving for $\theta.$ However, to make your life easier, I would strongly suggest taking the logarithm of the likelihood function before you calculate the value of $\theta$. Since the logarithm function is monotonic/order-preserving, the so-called "log-likelihood" function and the original likelihood function are maximized for the same value of $\theta$.

Feel free to let me know if anything remains unclear.
 
  • #3
GJA said:
(a) The product should be for $1\leq i\leq n$. Using this fact and the formula for $p_{\theta}(x_{i})$, we can take your work a step further to obtain a closed-form expression for the likelihood.

We have that $$L_x(\theta)=\prod_{i=1}^np_{\theta}(x_i)=\prod_{i=1}^n \theta(1 -\theta)^{x_i−1}= \theta^n(1 -\theta)^{\sum_{i=1}^nx_i−n} $$

Is that correct? :unsure:
 
  • #4
mathmari said:
We have that $$L_x(\theta)=\prod_{i=1}^np_{\theta}(x_i)=\prod_{i=1}^n \theta(1 -\theta)^{x_i−1}= \theta^n(1 -\theta)^{\sum_{i=1}^nx_i−n} $$

Is that correct? :unsure:

Exactly, nice job!

For part (b) you're trying to determine the value for $\theta$ that maximizes $L_{x}(\theta)$ for the given data $x_{i}$; i.e., you're thinking of the $x_{i}$ as fixed so that $L_{x}(\theta)$ is being considered as a function of $\theta$. Using the fact that $\ln\left[L_{x}(\theta)\right]$ and $L_{x}(\theta)$ are maximized for the same value of $\theta$ will make solving for $\theta$ less laborious. However you choose to proceed, taking a derivative, setting it equal to zero and solving for $\theta$ will produce the desired result.
 
  • #5
GJA said:
For part (b) you're trying to determine the value for $\theta$ that maximizes $L_{x}(\theta)$ for the given data $x_{i}$; i.e., you're thinking of the $x_{i}$ as fixed so that $L_{x}(\theta)$ is being considered as a function of $\theta$. Using the fact that $\ln\left[L_{x}(\theta)\right]$ and $L_{x}(\theta)$ are maximized for the same value of $\theta$ will make solving for $\theta$ less laborious. However you choose to proceed, taking a derivative, setting it equal to zero and solving for $\theta$ will produce the desired result.

So we have \begin{align*}&g(\theta) = \ln \left (L_x(\theta)\right )=\ln \left (\theta^n(1 -\theta)^{\sum_{i=1}^nx_i−n}\right ) =\ln \left (\theta^n\right )+\ln \left ((1 -\theta)^{\sum_{i=1}^nx_i−n}\right )=n\cdot \ln \left (\theta\right )+\left ({\sum_{i=1}^nx_i−n}\right )\cdot \ln \left (1 -\theta\right )\\ &g'(\theta)=n\cdot \frac{1}{\theta}-\left ({\sum_{i=1}^nx_i−n}\right )\cdot \frac{1}{1 -\theta} \\ &g'(\theta)=0 \Rightarrow n\cdot \frac{1}{\theta}-\left ({\sum_{i=1}^nx_i−n}\right )\cdot \frac{1}{1 -\theta} =0 \Rightarrow n\cdot \frac{1}{\theta}=\left ({\sum_{i=1}^nx_i−n}\right )\cdot \frac{1}{1 -\theta} \Rightarrow n\cdot (1-\theta)=\left ({\sum_{i=1}^nx_i−n}\right )\cdot \theta\\ &\Rightarrow n-n\cdot \theta=\theta\cdot \sum_{i=1}^nx_i−n\cdot \theta\Rightarrow n=\theta\cdot \sum_{i=1}^nx_i \Rightarrow \theta= \frac{n}{\sum_{i=1}^nx_i}\end{align*} Is that correct and complete ? :unsure:

At (c) do we use $\theta= \frac{n}{\sum_{i=1}^nx_i}$ for $n=10$ and $\sum_{i=1}^nx_i=4 + 2 + 7 + 4 + 3 + 1 + 8+ 2 + 4 + 5=40$ ? :unsure:
 
  • #6
mathmari said:
So we have \begin{align*}&g(\theta) = \ln \left (L_x(\theta)\right )=\ln \left (\theta^n(1 -\theta)^{\sum_{i=1}^nx_i−n}\right ) =\ln \left (\theta^n\right )+\ln \left ((1 -\theta)^{\sum_{i=1}^nx_i−n}\right )=n\cdot \ln \left (\theta\right )+\left ({\sum_{i=1}^nx_i−n}\right )\cdot \ln \left (1 -\theta\right )\\ &g'(\theta)=n\cdot \frac{1}{\theta}-\left ({\sum_{i=1}^nx_i−n}\right )\cdot \frac{1}{1 -\theta} \\ &g'(\theta)=0 \Rightarrow n\cdot \frac{1}{\theta}-\left ({\sum_{i=1}^nx_i−n}\right )\cdot \frac{1}{1 -\theta} =0 \Rightarrow n\cdot \frac{1}{\theta}=\left ({\sum_{i=1}^nx_i−n}\right )\cdot \frac{1}{1 -\theta} \Rightarrow n\cdot (1-\theta)=\left ({\sum_{i=1}^nx_i−n}\right )\cdot \theta\\ &\Rightarrow n-n\cdot \theta=\theta\cdot \sum_{i=1}^nx_i−n\cdot \theta\Rightarrow n=\theta\cdot \sum_{i=1}^nx_i \Rightarrow \theta= \frac{n}{\sum_{i=1}^nx_i}\end{align*} Is that correct and complete ? :unsure:

At (c) do we use $\theta= \frac{n}{\sum_{i=1}^nx_i}$ for $n=10$ and $\sum_{i=1}^nx_i=4 + 2 + 7 + 4 + 3 + 1 + 8+ 2 + 4 + 5=40$ ? :unsure:

You're correct on everything; great job!
 
  • #7
Here is a plot of the likelihood function for part (c), where $n=10$ and $\sum_{i}x_{i} = 40$. As you can see, it's maximized for $\theta=0.25$, as it should be from your derivation in part (b).

[Attachment: Likelihood_Function_Plot.png — plot of $L_{x}(\theta)$ for $n=10$, $\sum_{i}x_{i}=40$, peaking at $\theta=0.25$]
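
If anyone wants to reproduce the figure, a minimal Python sketch along these lines (using the sample from part (c)) plots $L_x(\theta)$ and marks the analytic MLE $n/\sum_i x_i = 0.25$:

```python
import numpy as np
import matplotlib.pyplot as plt

# Sample from part (c): n = 10, sum(x) = 40
x = np.array([4, 2, 7, 4, 3, 1, 8, 2, 4, 5])
n, s = len(x), x.sum()

# Likelihood L_x(theta) = theta^n * (1 - theta)^(sum(x) - n)
theta = np.linspace(0.001, 0.999, 1000)
L = theta**n * (1 - theta)**(s - n)

plt.plot(theta, L)
plt.axvline(n / s, linestyle="--")   # analytic MLE: 10/40 = 0.25
plt.xlabel(r"$\theta$")
plt.ylabel(r"$L_x(\theta)$")
plt.show()
```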
 

1. What is the Statistical Product Model?

A statistical product model describes $n$ independent, identically distributed observations: the joint probability of a sample is the product of the probabilities of the individual observations, $\prod_{i=1}^n p_{\theta}(x_i)$, where the common distribution $p_{\theta}$ depends on an unknown parameter $\theta$. In this thread, each observation is geometrically distributed with unknown success probability $\theta$.

2. Why is it important to find the Maximum Likelihood Estimator (MLE) for $\theta$ in the SPM?

The MLE is the value of $\theta$ under which the observed sample is most probable, so it provides a principled point estimate of the unknown parameter from the data. This estimate can then be used for predictions and further inference about the population.

3. How is the MLE for $\theta$ calculated in the SPM?

The MLE for $\theta$ is calculated by finding the value of $\theta$ that maximizes the likelihood function, which is a function of the observed data and the parameter $\theta$. This can be done analytically or numerically using optimization techniques.
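
As a minimal illustration of the numerical route, one could minimize the negative log-likelihood of the geometric model from this thread with `scipy.optimize.minimize_scalar` (the data are the sample from part (c)):

```python
import numpy as np
from scipy.optimize import minimize_scalar

x = np.array([4, 2, 7, 4, 3, 1, 8, 2, 4, 5])
n, s = len(x), x.sum()

# Negative log-likelihood of the geometric product model
def neg_loglik(theta):
    return -(n * np.log(theta) + (s - n) * np.log(1 - theta))

# Numerical maximization of the likelihood on the open interval (0, 1)
res = minimize_scalar(neg_loglik, bounds=(1e-6, 1 - 1e-6), method="bounded")
print(res.x)   # ~0.25, matching the analytic MLE n / sum(x)
```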

4. What are the assumptions of the SPM when finding the MLE for $\theta$?

The SPM assumes that the observations in the sample are independent and identically distributed, and that the population follows a specific probability distribution. It also assumes that the sample is a random sample from the population.

5. Can the MLE for $\theta$ be biased?

Yes, the MLE for $\theta$ can be biased, even when all of the model assumptions hold. In the geometric model of this thread, for example, $E[\hat{\theta}] = E\!\left[n/\sum_{i=1}^n x_i\right] > \theta$ for finite $n$ by Jensen's inequality, although the bias shrinks as the sample size grows (the MLE is consistent and asymptotically unbiased). Bias can be made worse by small samples or by violations of the model assumptions.
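
As a rough illustration of this point, a small Monte Carlo sketch in Python (assuming the geometric model above with true $\theta = 0.25$ and $n = 10$) shows the average of $\hat{\theta} = n/\sum_i x_i$ landing above the true value:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true, n, reps = 0.25, 10, 100_000

# Draw geometric samples (support 1, 2, ...) and compute the MLE n / sum(x) for each
samples = rng.geometric(theta_true, size=(reps, n))
mle = n / samples.sum(axis=1)

print(mle.mean())   # comes out above 0.25, illustrating the finite-sample bias
```

Increasing $n$ in this sketch shows the average drifting back toward $0.25$.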
