Factorisation Theorem for Sufficiency

Coolster7
I'm having trouble applying this theorem to likelihood functions in order to obtain a sufficient statistic for the relevant parameter.

_________________________________________________________________________________________

The factorisation theorem being:

Under certain regularity conditions;

T(X) is sufficient for θ ⇔ f(x|θ) = h(x)g(t(x),θ)

for some functions h and g.

__________________________________________________________________________________________

The main problem I'm having is knowing when to set h(x) = 1.

For example, in the geometric distribution you get a likelihood function: f(x|θ) = θ^n (1−θ)^(Σxᵢ − n).

You set h(x) = 1 here and g(t, θ) = θ^n (1−θ)^(t − n), where t = Σxᵢ.
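As a sanity check, here is a small numerical sketch (sample values made up for illustration, reading each observation as geometric on {1, 2, ...} so the likelihood is θ^n (1−θ)^(Σxᵢ − n)): two samples with the same Σxᵢ give the same likelihood for every θ, which is exactly what sufficiency of T = Σxᵢ means.

```python
import math

def likelihood(x, theta):
    # Product of geometric pmfs theta * (1 - theta)^(xi - 1) over the sample
    return math.prod(theta * (1 - theta) ** (xi - 1) for xi in x)

def g(t, n, theta):
    # The theta-carrying factor, depending on the data only through t = sum(x)
    return theta ** n * (1 - theta) ** (t - n)

x = [3, 1, 4, 2]   # made-up sample
theta = 0.3
h = 1              # h(x) = 1: no x-only factor is left over here

assert math.isclose(likelihood(x, theta), h * g(sum(x), len(x), theta))

# A different sample with the same sum gives the same likelihood for every theta
y = [2, 2, 3, 3]   # sum(y) == sum(x) == 10
assert math.isclose(likelihood(x, theta), likelihood(y, theta))
print("factorisation with h(x) = 1 checks out")
```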

-----------------------------------------------------------------------------------------------------------------------------

However, in the Poisson distribution (with two observations) you get a likelihood function: f(x|θ) = (1/(x₁! x₂!)) · e^(−2θ) θ^(x₁+x₂).

Here you set h(x) = 1/(x₁! x₂!) and not h(x) = 1.

Is this because h(x) has to be a constant, or can it be any function involving just x?

------------------------------------------------------------------------------------------------------------------------------

So say for example you had a likelihood function:

f(x|σ) = x^n σ^(−2n) e^(−n x² / (2σ²))

using the factorisation method, would you let h(x) = 1 with g(x, σ) = f(x|σ)

and say x is sufficient for σ,

OR would you let h(x) = x^n and g(x, σ) = σ^(−2n) e^(−n x² / (2σ²))

and say x is sufficient for σ?

Note: obviously both choices give the same sufficient statistic here, but in a different problem this may not be the case.
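A quick numerical check of this example (the values of n, x and σ below are made up purely for illustration) showing that both splits reproduce the same likelihood, so either way x is the sufficient statistic:

```python
import math

# Illustrative values only
n, x, sigma = 4, 1.3, 0.8

# Full likelihood f(x | sigma) = x^n sigma^(-2n) e^(-n x^2 / (2 sigma^2))
f = x ** n * sigma ** (-2 * n) * math.exp(-0.5 * n * x ** 2 / sigma ** 2)

# Split 1: h(x) = 1, g carries everything (x enters g only through t(x) = x)
h1, g1 = 1, x ** n * sigma ** (-2 * n) * math.exp(-0.5 * n * x ** 2 / sigma ** 2)

# Split 2: h(x) = x^n, g keeps only the sigma-dependent factors
h2, g2 = x ** n, sigma ** (-2 * n) * math.exp(-0.5 * n * x ** 2 / sigma ** 2)

assert math.isclose(f, h1 * g1)
assert math.isclose(f, h2 * g2)
print("both factorisations recover f")
```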

Can anyone help me please?
 
Coolster7 said:
However, in the Poisson distribution (with two observations) you get a likelihood function: f(x|θ) = (1/(x₁! x₂!)) · e^(−2θ) θ^(x₁+x₂).

Here you set h(x) = 1/(x₁! x₂!) and not h(x) = 1.

Is this because h(x) has to be a constant, or can it be any function involving just x?

The idea is that h(x) must not depend on the parameter θ, whether it is a constant or not.
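To see this numerically (the values below are illustrative, not from the thread): in the Poisson example, h(x) = 1/(x₁! x₂!) depends on the data but not on θ, and it cannot move into g because it is not a function of t = x₁ + x₂ alone.

```python
import math

def poisson_pmf(k, theta):
    # P(X = k) for a Poisson(theta) observation
    return math.exp(-theta) * theta ** k / math.factorial(k)

x1, x2, theta = 3, 5, 1.7  # illustrative values

likelihood = poisson_pmf(x1, theta) * poisson_pmf(x2, theta)

h = 1 / (math.factorial(x1) * math.factorial(x2))  # depends on x only, never on theta
g = math.exp(-2 * theta) * theta ** (x1 + x2)      # depends on theta and t = x1 + x2 only

assert math.isclose(likelihood, h * g)
print("h(x) * g(t, theta) matches the likelihood")
```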
 
h6ss said:
The idea is that h(x) must not depend on the parameter θ, whether it is a constant or not.

Ah I see, thanks for this, I understand now... it's simple really.
 
h6ss said:
The idea is that h(x) must not depend on the parameter θ, whether it is a constant or not.

Actually, just one more question: what about n? Does n need to be part of the sufficient statistic for the parameter, or is it treated as a known constant?
 
Coolster7 said:
Actually, just one more question: what about n? Does n need to be part of the sufficient statistic for the parameter, or is it treated as a known constant?
Anyone?
 