[Statistics] Factorisation theorem proof

Homework Help Overview

The discussion revolves around a step in the proof of the factorization theorem in statistics, specifically focusing on justifying a particular equality involving probability densities of random variables.

Discussion Character

  • Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants explore the justification for the first equality in the theorem, questioning how it relates to sufficient statistics and the definitions of the involved notation.

Discussion Status

Some participants have provided explanations regarding the notation and the concept of partitioning events into mutually exclusive sets. The original poster expresses uncertainty about the adequacy of the justification provided.

Contextual Notes

The discussion references a specific theorem and its demonstration, indicating that the original poster is working from a source not directly related to their course material.

Heidrun
Hello. I have a question about a step in the proof of the factorization theorem.

1. Homework Statement

Here is the theorem (it begins at the end of page 1). It is not from my course, but my course's proof is almost the same: http://math.arizona.edu/~jwatkins/sufficiency.pdf
[Screenshot of the theorem statement]


Homework Equations


Could someone please explain to me how to justify the first equality in that step?
[Screenshot of the step]


The Attempt at a Solution


I think a possible justification is that the sample is a sufficient statistic, but it feels like that's not enough, or not the right justification.
 
Heidrun said:
Could someone please explain to me how to justify the first equality in that step?
[Screenshot of the step]

The first equality is ##f_{T(X)}(t\mid\theta) = \sum_{\tilde{x}\,:\,T(\tilde{x}) = t} f_X(\tilde{x}\mid\theta)##. Is that what you're asking about?
 
Stephen Tashi said:
The first equality is ##f_{T(X)}(t\mid\theta) = \sum_{\tilde{x}\,:\,T(\tilde{x}) = t} f_X(\tilde{x}\mid\theta)##. Is that what you're asking about?

Yes exactly. I don't see why we can write it.
 
That equality follows from the definitions associated with the notation being used, plus the fact that the probability of an event can be expressed as the sum of the probabilities of the mutually exclusive events that partition it.

##f_{T(X)}(t\mid\theta)## denotes the probability density of the discrete random variable ##T(X)## on the probability space "##X## given ##\theta##".

The event ##T(X) = t## in that probability space is exactly the event that ##X## takes on some value that makes ##T(X) = t##. The notation ##\{\tilde{x} : T(\tilde{x}) = t\}## denotes that event expressed in terms of a variable ##\tilde{x}##. The notation ##f_X(\tilde{x}\mid\theta)## says we are assigning probability to that event using the probability density function defined on the probability space of "##X## given ##\theta##".

For example, if event ##A## can be partitioned into mutually exclusive events ##A_1, A_2##, then ##p(A) = p(A_1) + p(A_2)##. If ##A## is the event ##T(X) = 4## and this can be partitioned into the mutually exclusive events ##X = 2## and ##X = -2##, then ##f_{T(X)}(4) = f_X(2) + f_X(-2)##.
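The partition argument above can be checked numerically. Here is a minimal sketch with a made-up pmf for ##X## (the values and probabilities are assumptions chosen for illustration, not from the linked notes), where ##T(X) = X^2## so the event ##T(X) = 4## partitions into ##X = 2## and ##X = -2##:

```python
# Hypothetical pmf of a discrete random variable X (chosen for illustration).
f_X = {-2: 0.1, -1: 0.4, 1: 0.3, 2: 0.2}

def T(x):
    """The statistic: here T(X) = X^2, so T is not one-to-one."""
    return x * x

def f_T(t):
    """pmf of T(X): sum f_X over all x~ in the event {x~ : T(x~) = t}."""
    return sum(p for x, p in f_X.items() if T(x) == t)

# f_T(4) = f_X(2) + f_X(-2), i.e. the probabilities of the mutually
# exclusive events X = 2 and X = -2 that partition {T(X) = 4}.
print(f_T(4))
```

Summing the pmf of ##X## over the level set of ##T## is exactly the first equality in the step being asked about, just written for a concrete finite example.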
 
Alright, thanks a lot!
 
