A simple Bayesian question on likelihood

  • Context: Graduate
  • Thread starter: ghostyc
  • Tags: Likelihood
SUMMARY

This discussion centers on the likelihood function in Bayesian statistics, specifically for independent and identically distributed random variables \(X_1, \ldots, X_n\) with a parameter \(\theta > 0\). The likelihood function is expressed as \(l(\theta) = \frac{e^{\theta S}}{(1 + e^\theta)^n}\), where \(S = \sum x_i\). Participants seek clarity on deriving general expressions for the probability density function and the posterior distribution, as well as proving inequalities involving the likelihood function.

PREREQUISITES
  • Understanding of Bayesian statistics and likelihood functions
  • Familiarity with probability density functions and their properties
  • Knowledge of calculus for optimization problems
  • Experience with discrete random variables and their distributions
NEXT STEPS
  • Study the derivation of likelihood functions in Bayesian inference
  • Learn about posterior distributions and their applications in Bayesian analysis
  • Explore the use of calculus in maximizing likelihood functions
  • Investigate the differences between discrete and continuous probability distributions
USEFUL FOR

Statisticians, data scientists, and researchers involved in Bayesian analysis and those looking to deepen their understanding of likelihood functions and their applications in statistical modeling.

ghostyc
Hi all,

[Attached image: first problem statement (6.jpg); transcribed in the reply below]


In this question, I found that

[tex]\Pr(X_i = x_i|\theta)=\frac{\exp(\theta x_i)}{1+\exp(\theta)}[/tex]

and I carry on with the likelihood being [tex]\frac{\exp(\theta \sum x_i)}{(1+\exp(\theta))^n}[/tex]

and so [tex]S=\sum x_i = nT[/tex], i.e. [tex]T = S/n[/tex].
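The factorization of the likelihood can be checked numerically. Here is a quick sketch (my own, with made-up values for [itex]\theta[/itex] and [itex]n[/itex], not from the thread): the product of the per-observation probabilities should equal the closed form with [itex]S = \sum x_i[/itex].

```python
import math
import random

# Hypothetical check: the product of P(X_i = x_i | theta)
# = exp(theta * x_i) / (1 + exp(theta)) over an iid sample of 0/1 values
# should equal exp(theta * S) / (1 + exp(theta))**n with S = sum(x_i).
random.seed(0)
theta = 0.7          # arbitrary positive value for illustration
n = 12
xs = [random.randint(0, 1) for _ in range(n)]

product = 1.0
for x in xs:
    product *= math.exp(theta * x) / (1.0 + math.exp(theta))

S = sum(xs)
closed_form = math.exp(theta * S) / (1.0 + math.exp(theta)) ** n

assert abs(product - closed_form) < 1e-12
```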

I need some help with part (c).

========================================
[Attached image: second problem statement (3.jpg); transcribed below]



How do I start with this question then?

I couldn't get a general expression for [tex]\Pr(X_i|\theta)[/tex].


How do I generally deal with these "piecewise probability density function"?

Thanks!
 

Suppose that [itex]X_1,...,X_n[/itex] are independent and identically distributed conditional on a random parameter [itex]\theta > 0[/itex] such that
[tex]P(X_i = 1|\theta) = \frac{e^\theta}{1+e^\theta}[/tex] and [tex]P(X_i = 0| \theta) = 1 - P(X_i=1| \theta)[/tex].

A normal prior for [itex]\theta[/itex] is taken with mean [itex]\mu[/itex] and variance [itex]\sigma^2[/itex].

(a) Show that the likelihood function is of the form

[tex]l(\theta) = \frac{e^{\theta S}}{(1 + e^\theta)^n} = \left(\frac{e^{\theta T}}{1 + e^\theta}\right)^n[/tex] and find [itex]S[/itex] and [itex]T[/itex] in terms of [itex]\{X_1,X_2,...X_n\}[/itex].

(b) Hence, write down the posterior distribution for [itex]\theta[/itex].

(c) For any [itex]0 < t < 1[/itex] and [itex]\theta > 0[/itex], show that

[tex]\frac{e^{\theta t }}{1 + e^\theta} < t^t(1-t)^{1-t}[/tex] .


One thought that comes to mind is to use calculus to see if the function
[tex]f(\theta) = \theta^S ( 1 - \theta)^{n-S}[/tex] takes its maximum value when [tex]\theta = S/n[/tex]. If so, then [tex](f(\theta))^{1/n}[/tex] also takes its maximum value there.
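The calculus route works because substituting [itex]p = e^\theta/(1+e^\theta)[/itex] turns the left-hand side of (c) into [itex]p^t(1-p)^{1-t}[/itex], which is maximized at [itex]p = t[/itex], giving exactly [itex]t^t(1-t)^{1-t}[/itex]. A quick numerical sanity check of the inequality (my own sketch, grid values chosen arbitrarily):

```python
import math

# Check e^{theta*t} / (1 + e^{theta}) <= t^t * (1-t)^(1-t)
# on a grid of 0 < t < 1 and theta > 0.  The right-hand side is the
# supremum of the left-hand side over theta, so the bound should hold
# everywhere on the grid.
for i in range(1, 100):            # t = 0.01, 0.02, ..., 0.99
    t = i / 100.0
    bound = t ** t * (1.0 - t) ** (1.0 - t)
    for j in range(1, 200):        # theta = 0.05, 0.10, ..., 9.95
        theta = j * 0.05
        lhs = math.exp(theta * t) / (1.0 + math.exp(theta))
        assert lhs <= bound + 1e-12, (t, theta)
```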
 
---------------
Suppose that [itex]X_1,...,X_n[/itex] are independent and identically distributed conditional on a random parameter [itex]\theta > 0[/itex] such that
for each [itex]i = 1,2,...n[/itex]

[tex]P(X_i = -1|\theta) = \theta[/tex] and [tex]P(X_i = 1| \theta) = 1 - \theta[/tex].

(a) Show that the likelihood function is given by the form

[tex]l(\theta) = \theta^{n/2 - S} (1-\theta)^{n/2 + S}[/tex] and find [itex]S[/itex] in terms of [itex]\{X_1,X_2,...X_n\}[/itex].
---------------

Let [itex]k[/itex] be the number of the [itex]X_i[/itex] that are +1 and let [itex]S = k - n/2[/itex] (equivalently [itex]S = \tfrac{1}{2}\sum X_i[/itex]); then [itex]n/2 - S[/itex] counts the [itex]-1[/itex] outcomes and [itex]n/2 + S[/itex] counts the [itex]+1[/itex] outcomes.

Isn't "discrete" a better term than "piecewise"?
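With [itex]S = \tfrac{1}{2}\sum X_i[/itex], the stated form can again be checked numerically. A minimal sketch (my own, with arbitrary values of [itex]\theta[/itex] and [itex]n[/itex]):

```python
import random

# Check that with S = (1/2) * sum(x_i), the product of
# P(X_i = -1 | theta) = theta and P(X_i = +1 | theta) = 1 - theta
# equals theta^(n/2 - S) * (1 - theta)^(n/2 + S).
random.seed(1)
theta = 0.3          # arbitrary value in (0, 1) for illustration
n = 10
xs = [random.choice([-1, 1]) for _ in range(n)]

product = 1.0
for x in xs:
    product *= theta if x == -1 else (1.0 - theta)

S = sum(xs) / 2.0                  # = k - n/2, k = number of +1 outcomes
closed_form = theta ** (n / 2 - S) * (1.0 - theta) ** (n / 2 + S)

assert abs(product - closed_form) < 1e-12
```

The exponent [itex]n/2 - S[/itex] reduces to the count of [itex]-1[/itex] outcomes and [itex]n/2 + S[/itex] to the count of [itex]+1[/itex] outcomes, which is why the product matches.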
 