Factorisation Theorem for Sufficiency

  • Thread starter Coolster7
  • Start date
  • Tags
    Theorem
In summary, the factorisation theorem states that, under certain regularity conditions, T(X) is sufficient for θ ⇔ f(x|θ) = h(x)g(t(x),θ) for some functions h and g. For a Poisson sample of size two the likelihood is [itex]\frac{1}{x_1!x_2!}e^{-2\theta}\theta^{x_1+x_2}[/itex], so h(x) = 1/(x₁!x₂!) and T = x₁ + x₂ is sufficient, whereas for the likelihood [itex]f(x|\theta) = \theta^n(1-\theta)^{\sum x_i - n}[/itex] one can simply take h(x) = 1.
  • #1
Coolster7
I'm having trouble applying this theorem to likelihood functions in order to obtain a sufficient statistic for the relevant variables.

_________________________________________________________________________________________

The factorisation theorem being:

Under certain regularity conditions;

T(X) is sufficient for θ ⇔ f(x|θ) = h(x)g(t(x),θ)

for some functions h and g.

__________________________________________________________________________________________

The main problem I'm having is when to allow h(x) = 1.

For example, in the exponential distribution you get a likelihood function: [itex]f(x|\theta) = \theta^n(1-\theta)^{\sum x_i - n}[/itex]

You set h(x) = 1 here and [itex]g(t,\theta) = \theta^n(1-\theta)^{t - n}[/itex] where [itex]t = \sum x_i[/itex]
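As a quick numerical sanity check of this factorisation (a sketch only, assuming the underlying pmf is f(x|θ) = θ(1−θ)^(x−1), which is what produces the likelihood above), the product of the individual pmfs should equal h(x)·g(t,θ) with h(x) = 1:

```python
import math

def likelihood(xs, theta):
    # Product of pmfs f(x|theta) = theta * (1 - theta)**(x - 1)
    p = 1.0
    for x in xs:
        p *= theta * (1 - theta) ** (x - 1)
    return p

def factored(xs, theta):
    # h(x) = 1, g(t, theta) = theta**n * (1 - theta)**(t - n), with t = sum(xs)
    n, t = len(xs), sum(xs)
    return 1.0 * theta ** n * (1 - theta) ** (t - n)

xs = [3, 1, 4, 2]
for theta in (0.2, 0.5, 0.8):
    # The two expressions agree for every theta, confirming the factorisation
    assert math.isclose(likelihood(xs, theta), factored(xs, theta))
```

Since h(x) = 1 absorbs nothing, the entire likelihood sits inside g, which depends on the data only through t = Σxᵢ.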

-----------------------------------------------------------------------------------------------------------------------------

However, in the Poisson distribution you get a likelihood function: [itex]f(x|\theta) = \frac{1}{x_1!x_2!}e^{-2\theta}\theta^{x_1+x_2}[/itex]

Here you set [itex]h(x) = \frac{1}{x_1!x_2!}[/itex] and not h(x) = 1.

Is this because h(x) has to be a constant or involving just x?
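The Poisson factorisation above can also be checked numerically (a sketch for a sample of size two, as in the likelihood given):

```python
import math

def poisson_likelihood(x1, x2, theta):
    # Product of two Poisson pmfs: e^{-theta} * theta^x / x!
    def pmf(x):
        return math.exp(-theta) * theta ** x / math.factorial(x)
    return pmf(x1) * pmf(x2)

def factored(x1, x2, theta):
    h = 1.0 / (math.factorial(x1) * math.factorial(x2))  # depends on the data only
    g = math.exp(-2 * theta) * theta ** (x1 + x2)        # depends on t = x1 + x2 and theta
    return h * g

for theta in (0.5, 1.5, 3.0):
    # h(x) * g(t, theta) reproduces the full likelihood for every theta
    assert math.isclose(poisson_likelihood(2, 5, theta), factored(2, 5, theta))
```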

------------------------------------------------------------------------------------------------------------------------------

So say for example you had a likelihood function:

[itex]f(x|\sigma) = x^n\sigma^{-2n}e^{-\frac{nx^2}{2\sigma^2}}[/itex]

using the factorisation method, would you let h(x) = 1 with g(x, σ) = f(x|σ)

and say x is sufficient for σ,

OR would you let [itex]h(x) = x^n[/itex] and [itex]g(x,\sigma) = \sigma^{-2n}e^{-\frac{nx^2}{2\sigma^2}}[/itex]

and say x is sufficient for σ?

Note: Obviously the sufficient statistic comes out the same either way here, but in a different problem this may not be the case.

Can anyone help me please?
 
  • #2
Coolster7 said:
However, in the Poisson distribution you get a likelihood function: [itex]f(x|\theta) = \frac{1}{x_1!x_2!}e^{-2\theta}\theta^{x_1+x_2}[/itex]

Here you set [itex]h(x) = \frac{1}{x_1!x_2!}[/itex] and not h(x) = 1.

Is this because h(x) has to be a constant or involving just x?

The idea is to have h(x) not depend on the parameter, whether it's a constant or not.
 
  • #3
h6ss said:
The idea is to have h(x) not depend on the parameter, whether it's a constant or not.

Ah I see, thanks for this. I understand now; it's simple really.
 
  • #4
h6ss said:
The idea is to have h(x) not depend on the parameter, whether it's a constant or not.

Actually, just one more question. What about n? Would n need to appear in the sufficient statistic, or is it treated as a known constant?
 
  • #5
Coolster7 said:
Actually, just one more question. What about n? Would n need to appear in the sufficient statistic, or is it treated as a known constant?
Anyone?
 

1. What is the Factorisation Theorem for Sufficiency?

The Factorisation Theorem for Sufficiency is a fundamental principle in statistics that states that the joint probability distribution of a set of random variables can be factored into the product of a function of the sufficient statistic and a function of the remaining variables. It provides a way to simplify complex probability distributions and identify the most relevant information for making statistical inferences.

2. How does the Factorisation Theorem for Sufficiency relate to statistical inference?

The Factorisation Theorem for Sufficiency is closely related to statistical inference because it allows us to identify the most relevant information from a set of data. By reducing a complex probability distribution to a simpler form, we can focus on the key variables and make more accurate inferences about the population from which the data was collected.

3. What is a sufficient statistic?

A sufficient statistic is a function of a random sample that contains all the information necessary for making statistical inferences about a population. It summarizes the essential features of the data in a concise manner, and the conditional distribution of the data given the statistic does not depend on any unknown parameters. In other words, knowing the value of a sufficient statistic is enough to make accurate inferences about the parameter without needing to know the individual data points.
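This defining property can be illustrated numerically with the Poisson example from the thread: for X₁, X₂ iid Poisson(θ), the conditional distribution of X₁ given X₁ + X₂ = t is Binomial(t, 1/2), which is free of θ (a sketch, using the standard fact that X₁ + X₂ ~ Poisson(2θ)):

```python
import math

def cond_prob(x1, t, theta):
    # P(X1 = x1 | X1 + X2 = t) for X1, X2 iid Poisson(theta)
    num = (math.exp(-theta) * theta ** x1 / math.factorial(x1)) * \
          (math.exp(-theta) * theta ** (t - x1) / math.factorial(t - x1))
    den = math.exp(-2 * theta) * (2 * theta) ** t / math.factorial(t)  # X1 + X2 ~ Poisson(2*theta)
    return num / den

# The conditional probability equals the Binomial(t, 1/2) pmf for every theta
for theta in (0.5, 1.0, 4.0):
    assert math.isclose(cond_prob(3, 7, theta), math.comb(7, 3) * 0.5 ** 7)
```

Once t is known, θ drops out entirely, which is exactly what it means for T = X₁ + X₂ to carry all the information about θ.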

4. How is the Factorisation Theorem for Sufficiency used in practice?

In practice, the Factorisation Theorem for Sufficiency is used to determine which variables are most relevant for making statistical inferences. By identifying a sufficient statistic, we can reduce the dimensionality of the data and simplify complex probability distributions. This makes it easier to analyze the data, perform hypothesis tests, and make predictions about the population.

5. Can the Factorisation Theorem for Sufficiency be applied to any type of data?

Yes, the Factorisation Theorem for Sufficiency can be applied to any type of data, as long as the data follows a probability distribution. It is a general principle in statistics and is commonly used in a wide range of fields, including biology, physics, economics, and social sciences. However, the application of the theorem may differ depending on the type of data and the specific problem at hand.
