Maximizing θ with Probability Mass Function and Marbles Data

In summary, the conversation works through a maximum likelihood problem: given a probability mass function for marbles of four colors and observed counts, find the maximum likelihood estimate of θ. The discussion clarifies that the data follow a multinomial (not binomial) distribution, sets up the likelihood and log-likelihood functions, and ends by outlining how to obtain the estimate for θ by differentiating the log-likelihood and setting the derivative to zero.
  • #1
icedsake
I need some help with this question.
Given the probability mass function:
x:    1         2      3         4
p(x): (θ+2)/4   θ/4    (1-θ)/4   (1-θ)/4

Marbles
1=green
2=blue
3=red
4=white

For 3839 randomly picked marbles
green=1997
blue=32
red=906
white=904

What is the maximum likelihood estimate of θ based on this data?
 
  • #2
What is the likelihood function in this case?
 
  • #3
Oops, I left out that x = 1, 2, 3, 4 each follow a binomial distribution...
Would the likelihood function be the pmf of the binomial distribution,
= (nCx) p^x (1-p)^(n-x),

and the log-likelihood function:
L(p) = log(nCx) + x log(p) + (n-x) log(1-p) ??
 
  • #4
Is it a binomial, or a multinomial distribution? Binomial has two possible outcomes; here you have four.
 
  • #5
I'm a little lost at this point. The problem statement says that, for example, the number of green marbles is modeled by a random variable N1 with a Binomial(n, (θ+2)/4) distribution, and the number of blue marbles by a random variable N2 with a Binomial(n, θ/4) distribution, where n in both cases is the total number of marbles (3839 here).

So I'm assuming red and white have similar binomial distributions.
 
  • #6
It is possible to view a multinomial random vector coordinate by coordinate: each count, taken on its own, is marginally binomial.

The likelihood function (nCx) p^x (1-p)^(n-x) captures just one of the four counts, though (e.g., green vs. not green). To use all four colors at once you need the multinomial distribution, which allows more than two outcomes.
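To make the multinomial idea concrete, here is a minimal Python sketch of the θ-dependent part of the multinomial log-likelihood for this thread's data. The multinomial coefficient does not involve θ, so it is omitted; the names `counts` and `log_likelihood` are just illustrative.

```python
import math

# Observed counts (green, blue, red, white) from the thread.
counts = [1997, 32, 906, 904]

def log_likelihood(theta):
    """Theta-dependent part of the multinomial log-likelihood; the
    multinomial coefficient is constant in theta and is omitted."""
    probs = [(theta + 2) / 4, theta / 4, (1 - theta) / 4, (1 - theta) / 4]
    return sum(k * math.log(p) for k, p in zip(counts, probs))
```

Evaluating this at a few values of θ in (0, 1) already shows that small θ fits these counts much better than, say, θ = 0.2.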
 
  • #7
Hmm, so in this case I should use the multinomial probability mass function to get the likelihood function, then take the natural log of it, correct?
Do I differentiate now, and how do I arrive at the estimate for theta?
 
  • #8
You should set up the log-likelihood function L(θ), differentiate it with respect to θ, set the derivative to zero, and solve: L'(θ̂) = 0. Then check L''(θ̂) < 0 to make sure it's a maximum and not a minimum.
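The recipe above, applied to this thread's data, reduces to solving a quadratic in θ. Here is a minimal Python sketch (variable names are illustrative) that carries out the differentiate-and-solve steps; the derivation is shown in the comments.

```python
import math

# Thread data: green, blue, red, white counts.
n1, n2, n3, n4 = 1997, 32, 906, 904
n = n1 + n2 + n3 + n4        # 3839 marbles total
n34 = n3 + n4                # red and white both have probability (1 - theta)/4

# Up to an additive constant, the log-likelihood is
#   l(theta) = n1*log(theta + 2) + n2*log(theta) + n34*log(1 - theta).
# Setting l'(theta) = n1/(theta+2) + n2/theta - n34/(1-theta) = 0 and
# clearing denominators gives the quadratic
#   n*theta**2 - (n1 - n2 - 2*n34)*theta - 2*n2 = 0.
b = n1 - n2 - 2 * n34
theta_hat = (b + math.sqrt(b * b + 8 * n * n2)) / (2 * n)  # the root in (0, 1)

# l''(theta) = -n1/(theta+2)**2 - n2/theta**2 - n34/(1-theta)**2 < 0 on (0, 1),
# so this stationary point is indeed a maximum.
print(round(theta_hat, 4))
```

For these counts the estimate works out to θ̂ ≈ 0.0357.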
 
  • #9
thanks for the clarifications =)
 

What is the maximum likelihood estimate (MLE)?

The maximum likelihood estimate is a statistical method used to estimate the parameters of a model by finding the values that make the observed data most probable. It assumes that the observed data is a random sample from a population with a known probability distribution.

How is the maximum likelihood estimate calculated?

The maximum likelihood estimate is calculated by finding the values of the model parameters that maximize the likelihood function, which is a measure of how likely the observed data is for a given set of parameter values. This can be done analytically or through numerical optimization methods.
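As a sketch of the numerical route (when no closed form is convenient), here is a stdlib-only ternary search applied to the thread's marble data. It relies on this log-likelihood being strictly concave, and hence single-peaked, on (0, 1); the names `counts`, `log_lik`, and `theta_hat` are illustrative.

```python
import math

counts = [1997, 32, 906, 904]  # the thread's marble counts

def log_lik(theta):
    # Theta-dependent part of the multinomial log-likelihood.
    probs = [(theta + 2) / 4, theta / 4, (1 - theta) / 4, (1 - theta) / 4]
    return sum(k * math.log(p) for k, p in zip(counts, probs))

# Ternary search for the maximizer: each step discards a third of the
# interval that cannot contain the peak of a single-peaked function.
lo, hi = 1e-9, 1.0 - 1e-9
for _ in range(200):
    m1 = lo + (hi - lo) / 3
    m2 = hi - (hi - lo) / 3
    if log_lik(m1) < log_lik(m2):
        lo = m1
    else:
        hi = m2
theta_hat = (lo + hi) / 2
```

The numerical answer agrees with the analytic quadratic-root solution (θ̂ ≈ 0.0357) for this data.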

What is the difference between maximum likelihood estimate and method of moments?

The maximum likelihood estimate and the method of moments are both methods for estimating the parameters of a model. The main difference is that maximum likelihood chooses the parameters that maximize the likelihood function of the observed data, while the method of moments equates sample moments with the corresponding theoretical moments and solves for the parameters.

What are the assumptions of the maximum likelihood estimate?

The maximum likelihood estimate assumes that the observed data are a random sample from a population whose probability distribution has a known functional form, with only the parameters unknown. It also assumes that the likelihood function attains a unique maximum over the parameter space.

What are the advantages of using maximum likelihood estimate?

The maximum likelihood estimate has several advantages: it is a widely used and accepted method for parameter estimation, it has good asymptotic properties (consistency and asymptotic efficiency), and it is often straightforward to compute. It also allows different models to be compared using the likelihood ratio test.
