Coolster7

_________________________________________________________________________________________

The factorisation theorem being:

Under certain regularity conditions,

**T(X)** is sufficient for θ ⇔ f(**x**|θ) = h(**x**)g(**t**(**x**), θ)

for some functions h and g.

__________________________________________________________________________________________

The main problem I'm having is when to allow h(x) = 1.

For example, in the geometric distribution you get a likelihood function: f(**x**|θ) = θ^{n}(1-θ)^{[itex]\sum[/itex]x_{i} - n}.

You set h(**x**) = 1 here and g(t, θ) = θ^{n}(1-θ)^{t - n}, where t = [itex]\sum[/itex]x_{i}.
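To see why h(**x**) = 1 works here, note that the likelihood depends on the data only through t = [itex]\sum[/itex]x_{i}: any two samples with the same sum give the same likelihood at every θ. A quick numerical sketch of that check (my own illustration, the function name is made up):

```python
def geom_likelihood(xs, theta):
    # Likelihood theta^n * (1 - theta)^(sum(xs) - n) for a sample xs
    n = len(xs)
    return theta**n * (1 - theta)**(sum(xs) - n)

# Two different samples with the same sufficient statistic t = sum(x_i) = 10
xs1 = [2, 5, 3]
xs2 = [4, 4, 2]

for theta in (0.2, 0.5, 0.8):
    # With h(x) = 1, equal t forces equal likelihoods at every theta
    assert abs(geom_likelihood(xs1, theta) - geom_likelihood(xs2, theta)) < 1e-12
```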

-----------------------------------------------------------------------------------------------------------------------------

However, in the Poisson distribution (with n = 2 observations) you get a likelihood function: f(**x**|θ) = 1/(x_{1}!x_{2}!) × e^{-2θ}θ^{x_{1}+x_{2}}.

Here you set h(**x**) = 1/(x_{1}!x_{2}!) and not h(**x**) = 1.

Is this because h(**x**) has to be a constant, or a function of x alone that doesn't involve θ?
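In the Poisson case h(**x**) isn't 1, but the same diagnostic still applies: for two samples with the same t = x_{1} + x_{2}, the likelihood ratio is constant in θ, and equals the ratio of the h values. A hedged sketch (my own illustration, the function name is made up):

```python
from math import exp, factorial

def pois_likelihood(xs, theta):
    # h(x) * e^{-n*theta} * theta^{sum(xs)}, with h(x) = 1 / (x_1! x_2! ...)
    h = 1.0
    for x in xs:
        h /= factorial(x)
    return h * exp(-len(xs) * theta) * theta**sum(xs)

# Same sufficient statistic t = x1 + x2 = 6, but different h(x)
xs1, xs2 = [1, 5], [3, 3]

ratios = [pois_likelihood(xs1, t) / pois_likelihood(xs2, t) for t in (0.5, 1.0, 2.0)]
# The ratio is the same at every theta: h(xs1)/h(xs2) = (3!3!)/(1!5!) = 0.3
assert all(abs(r - 0.3) < 1e-9 for r in ratios)
```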

------------------------------------------------------------------------------------------------------------------------------

So say, for example, you had a likelihood function:

f(**x**|σ) = x^{n}σ^{-2n}e^{-0.5nx^{2}σ^{-2}}

Using the factorisation method, would you let h(**x**) = 1 with g(**x**, σ) = f(**x**|σ) and say x is sufficient for σ,

OR would you let h(**x**) = x^{n} and g(**x**, σ) = σ^{-2n}e^{-0.5nx^{2}σ^{-2}} and say x is sufficient for σ?

Note: Obviously the sufficient statistic comes out the same either way here, but in a different problem this may not be the case.
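Both candidate factorisations can be checked numerically: each choice of h and g multiplies back to the same f, which is why they lead to the same sufficient statistic. A minimal sketch of that check (my own illustration, names are made up):

```python
from math import exp

def f(x, sigma, n):
    # The likelihood above: x^n * sigma^(-2n) * exp(-0.5 * n * x^2 / sigma^2)
    return x**n * sigma**(-2 * n) * exp(-0.5 * n * x**2 / sigma**2)

def g_choice_b(x, sigma, n):
    # g(x, sigma) left over after pulling h(x) = x^n out front
    return sigma**(-2 * n) * exp(-0.5 * n * x**2 / sigma**2)

n = 4
for x in (0.5, 1.0, 2.0):
    for sigma in (0.7, 1.3):
        # Choice A: h(x) = 1, g = f itself -- trivially f = 1 * f
        assert abs(f(x, sigma, n) - 1.0 * f(x, sigma, n)) < 1e-12
        # Choice B: h(x) = x^n -- f = x^n * g
        assert abs(f(x, sigma, n) - x**n * g_choice_b(x, sigma, n)) < 1e-12
```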

Can anyone help me please?