Factorisation Theorem for Sufficiency

  • Thread starter: Coolster7
  • Tags: Theorem
Coolster7
I'm having trouble applying this theorem to likelihood functions in order to obtain a sufficient statistic for the relevant parameter.

_________________________________________________________________________________________

The factorisation theorem being:

Under certain regularity conditions,

T(X) is sufficient for θ ⇔ f(x|θ) = h(x)g(t(x),θ)

for some functions h and g.
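
For instance (a quick worked example of the theorem, not from the original post): for a Bernoulli(θ) sample x_1, ..., x_n,

f(x|θ) = \prod θ^(x_i) (1-θ)^(1-x_i) = θ^(\sum x_i) (1-θ)^(n - \sum x_i)

so with t(x) = \sum x_i you can take h(x) = 1 and g(t, θ) = θ^t (1-θ)^(n-t), and T(X) = \sum X_i is sufficient for θ.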

__________________________________________________________________________________________

The main problem I'm having is knowing when you can take h(x) = 1.

For example, for a geometric sample you get the likelihood function: f(x|θ) = θ^n (1-θ)^(\sum x_i - n)

you set h(x) = 1 here and g(t, θ) = θ^n (1-θ)^(t - n) where t = \sum x_i
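
To see where this factorisation comes from (a short derivation, assuming each x_i is geometric on {1, 2, ...} with success probability θ):

f(x|θ) = \prod_{i=1}^n θ (1-θ)^(x_i - 1) = θ^n (1-θ)^(\sum x_i - n)

Nothing θ-free is left over, so h(x) = 1 works and the whole likelihood sits inside g(t(x), θ).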

-----------------------------------------------------------------------------------------------------------------------------

However, for the Poisson distribution (with a sample of n = 2 observations) you get the likelihood function: f(x|θ) = (1/(x_1! x_2!)) × e^(-2θ) θ^(x_1 + x_2).

here you set h(x) = 1/(x_1! x_2!) and not h(x) = 1.
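
Written out in full (a quick check, using the Poisson pmf e^(-θ) θ^x / x! for each of the two observations):

f(x|θ) = [e^(-θ) θ^(x_1) / x_1!] × [e^(-θ) θ^(x_2) / x_2!] = (1/(x_1! x_2!)) × e^(-2θ) θ^(x_1 + x_2)

With t = x_1 + x_2, the factor 1/(x_1! x_2!) involves only the data, so it goes into h(x), while g(t, θ) = e^(-2θ) θ^t; hence T(X) = X_1 + X_2 is sufficient for θ.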

Is this because h(x) has to be a constant, or just something involving only x?

------------------------------------------------------------------------------------------------------------------------------

So say for example you had a likelihood function:

f(x|σ) = x^n σ^(-2n) e^(-n x^2 / (2σ^2))

using the factorisation theorem, would you let h(x) = 1 with g(x, σ) = f(x|σ)

and say x is sufficient for σ,

OR would you let h(x) = x^n and g(x, σ) = σ^(-2n) e^(-n x^2 / (2σ^2))

and say x is sufficient for σ?

Note: Obviously both factorisations give the same sufficient statistic here, but in a different problem this may not be the case.
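
Put side by side (a quick check using the likelihood above), the two splits are:

h(x) = 1,   g(x, σ) = x^n σ^(-2n) e^(-n x^2 / (2σ^2))
h(x) = x^n, g(x, σ) = σ^(-2n) e^(-n x^2 / (2σ^2))

In both cases g depends on the data only through t(x) = x, so the theorem yields the same sufficient statistic either way; the second split is tidier because it pulls the σ-free factor x^n out of g.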

Can anyone help me please?
 
Coolster7 said:
However, for the Poisson distribution (with a sample of n = 2 observations) you get the likelihood function: f(x|θ) = (1/(x_1! x_2!)) × e^(-2θ) θ^(x_1 + x_2).

here you set h(x) = 1/(x_1! x_2!) and not h(x) = 1.

Is this because h(x) has to be a constant, or just something involving only x?

The idea is that h(x) must not depend on the parameter, whether it's a constant or not.
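
For instance, for a sample from N(θ, 1):

f(x|θ) = (2π)^(-n/2) e^(-\sum (x_i - θ)^2 / 2) = [(2π)^(-n/2) e^(-\sum x_i^2 / 2)] × [e^(θ \sum x_i - nθ^2 / 2)]

The first bracket is h(x): far from constant, but it involves only the data. The second is g(t, θ) with t = \sum x_i, so T(X) = \sum X_i is sufficient for θ.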
 
h6ss said:
The idea is that h(x) must not depend on the parameter, whether it's a constant or not.

Ah I see, thanks for this, I understand now. It's simple really.
 
h6ss said:
The idea is that h(x) must not depend on the parameter, whether it's a constant or not.

Actually, just one more question: what about n? Would n need to be part of the sufficient statistic, or is it treated as a known constant?
 
Coolster7 said:
Actually, just one more question: what about n? Would n need to be part of the sufficient statistic, or is it treated as a known constant?
Anyone?
 