[Statistics] Factorisation theorem proof

SUMMARY

The discussion centers on the justification of the first equality in the factorization theorem, specifically the expression f_{T(X)}(t|\theta) = \sum_{\tilde{x}: T(\tilde{x}) = t} f_X(\tilde{x}|\theta). Participants clarify that this equality arises from the definitions of the notation and the principle that the probability of an event can be expressed as the sum of probabilities of mutually exclusive events. The explanation emphasizes the relationship between the sufficient statistic T(X) and the underlying random variable X, reinforcing the concept of partitioning events in probability theory.

PREREQUISITES
  • Understanding of the Factorization Theorem in statistics
  • Familiarity with probability density functions (PDFs)
  • Knowledge of sufficient statistics
  • Basic concepts of partitioning events in probability theory
NEXT STEPS
  • Study the Factorization Theorem in detail, focusing on its applications in statistical inference
  • Learn about probability density functions and their properties
  • Explore the concept of sufficient statistics and their role in parameter estimation
  • Review examples of partitioning events in probability to solidify understanding
USEFUL FOR

Statisticians, data scientists, and students studying advanced probability and statistical inference who seek to deepen their understanding of the Factorization Theorem and its implications in statistical analysis.

Heidrun
Hello. I have a question about a step in the proof of the factorization theorem.

1. Homework Statement

Here is the theorem (it begins at the end of page 1). It is not from my course, but my course has almost the same proof: http://math.arizona.edu/~jwatkins/sufficiency.pdf
[Screenshot of the theorem: 591799factorization.png]


Homework Equations


Could someone please explain to me how to justify the first equality of that step?
[Screenshot of the step: 891254factorization2.png]


The Attempt at a Solution


I think a possible justification is that the sample is a sufficient statistic, but it feels like that's not enough / not the right justification.
 
Heidrun said:
Could someone please explain to me how to justify the first equality of that step?
[Screenshot of the step: 891254factorization2.png]

The first equality is f_{T(X)}(t|\theta) = \sum_{\tilde{x}: T(\tilde{x}) = t} f_X(\tilde{x}|\theta). Is that what you're asking about?
 
Stephen Tashi said:
The first equality is f_{T(X)}(t|\theta) = \sum_{\tilde{x}: T(\tilde{x}) = t} f_X(\tilde{x}|\theta). Is that what you're asking about?

Yes exactly. I don't see why we can write it.
 
That equality follows from the definitions associated with the notation being used, together with the fact that the probability of an event can be expressed as the sum of the probabilities of the mutually exclusive events that partition it.

f_{T(X)}(t|\theta) denotes the probability density of the discrete random variable T(X) on the probability space "X given \theta".

The event T(X) = t in that probability space is exactly the event that X takes on some value that makes T(X) = t. The notation "\tilde{x}: T(\tilde{x}) = t" denotes that event expressed in terms of a variable \tilde{x}. The notation f_X(\tilde{x}|\theta) says we are assigning probability to that event using the probability density function defined on the probability space of "X given \theta".
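
Spelled out, the reasoning is just this partition made explicit: the event \{T(X) = t\} is the disjoint union of the events \{X = \tilde{x}\} over all \tilde{x} with T(\tilde{x}) = t, so

f_{T(X)}(t|\theta) = P(T(X) = t \,|\, \theta) = \sum_{\tilde{x}: T(\tilde{x}) = t} P(X = \tilde{x} \,|\, \theta) = \sum_{\tilde{x}: T(\tilde{x}) = t} f_X(\tilde{x}|\theta).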

For example, if event A can be partitioned into mutually exclusive events A1, A2 then p(A) = p(A1) + p(A2). If A is the event T(X) = 4 and this can be partitioned into the mutually exclusive events X = 2 and X = -2 (say, with T(X) = X^2), then f_{T(X)}(4) = f_X(2) + f_X(-2).
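
Here is a tiny numerical sketch of that last example (the uniform distribution on \{-2, -1, 0, 1, 2\} and T(x) = x^2 are just for illustration, not from the thread):

Code:
from fractions import Fraction

# Illustrative pmf of X: uniform on {-2, -1, 0, 1, 2}, with T(x) = x^2
f_X = {x: Fraction(1, 5) for x in (-2, -1, 0, 1, 2)}

def T(x):
    return x ** 2

# pmf of T(X): sum f_X over each cell of the partition {x_tilde : T(x_tilde) = t}
f_T = {}
for x, p in f_X.items():
    f_T[T(x)] = f_T.get(T(x), Fraction(0)) + p

print(f_T[4])            # 2/5
print(f_X[2] + f_X[-2])  # 2/5 -- the same value: f_{T(X)}(4) = f_X(2) + f_X(-2)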
 
Alright, thanks a lot!
 
