Prove Chi Square is Stochastically Increasing

Mogarrr

Homework Statement


Prove that the X^2 distribution is stochastically increasing in its degrees of freedom; that is, if p>q, then for any a, P(X^2_{p} > a) \geq P(X^2_{q} > a), with strict inequality for some a.

Homework Equations


1.(n-1)S^2/\sigma^2 \sim X^2_{n-1}
2.The Chi squared(p) pdf is
f(x|p)= \frac 1{\Gamma(p/2) 2^{p/2}}x^{(p/2) - 1}e^{-x/2}

The Attempt at a Solution


Since p>q, this implies \forall a, \frac{\sigma^2 a}{p}< \frac{\sigma^2 a}{q}.
Also, X^2_{k} \sim kS^2/\sigma^2.

Therefore \forall a, P(X^2_{p}>a) = P(S^2 > \sigma^2 a/p) \geq P(S^2 > \sigma^2 a/q) = P(X^2_{q}>a).

If a>0, we observe strict inequality, as the support of S^2 is [0,\infty)...

What do you think? If I am going in the wrong direction, please steer me in the right one.
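
As a quick numerical sanity check of the claim itself (not a proof, and assuming SciPy is available), one can compare the two survival functions on a grid of a values:

# Sanity check of P(X^2_p > a) >= P(X^2_q > a) for p > q, on a grid of a.
# This only checks numbers; it is not a proof.  Requires NumPy and SciPy.
import numpy as np
from scipy.stats import chi2

p, q = 7, 3                      # any p > q will do
a_grid = np.linspace(0.0, 30.0, 301)

sf_p = chi2.sf(a_grid, df=p)     # P(X^2_p > a)
sf_q = chi2.sf(a_grid, df=q)     # P(X^2_q > a)

assert np.all(sf_p >= sf_q - 1e-12)   # weak inequality everywhere
assert np.any(sf_p > sf_q + 1e-6)     # strict inequality for some a > 0
print("largest gap:", np.max(sf_p - sf_q))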
 
Mogarrr said:

Homework Statement


Prove that the X^2 distribution is stochastically increasing in its degrees of freedom; that is, if p>q, then for any a, P(X^2_{p} > a) \geq P(X^2_{q} > a), with strict inequality for some a.

Homework Equations


1.(n-1)S^2/\sigma^2 \sim X^2_{n-1}
2.The Chi squared(p) pdf is
f(x|p)= \frac 1{\Gamma(p/2) 2^{p/2}}x^{(p/2) - 1}e^{-x/2}

The Attempt at a Solution


Since p>q, this implies \forall a, \frac{\sigma^2 a}{p}< \frac{\sigma^2 a}{q}.
Also, X^2_{k} \sim kS^2/\sigma^2.

Therefore \forall a, P(X^2_{p}>a) = P(S^2 > \sigma^2 a/p) \geq P(S^2 > \sigma^2 a/q) = P(X^2_{q}>a).

If a>0, we observe strict inequality, as the support of S^2 is [0,\infty)...

What do you think? If I am going in the wrong direction, please steer me in the right one.

I cannot follow your argument. You need two different ##S^2## random variables---one for ##\chi^2_p## and a different one for ##\chi^2_q##. Basically, though, you need to know what ##\chi^2## really is, in simple, intuitive terms: if ##Y_r## has the distribution ##\chi^2_r## then
Y_r = Z_1^2 + Z_2^2 + \cdots + Z_r^2,
where ##Z_1, Z_2, \ldots, Z_r## are iid standard normal random variables.

When looking at stochastic ordering, you are entitled to use a common sample space ##\Omega##, since all that matters is how the distribution functions compare (not, for example, whether the two random variables are independent or not). We can always "construct" a sample space such that the iid N(0,1) random variables ##Z_1, Z_2, \ldots, Z_p## are functions on ##\Omega##, so their values observed at a sample point ##\omega \in \Omega## are ##Z_1(\omega), Z_2(\omega), \ldots, Z_p(\omega)##. Then
Y_q(\omega) = \sum_{j=1}^q Z_j(\omega)^2 \; \text{and} \;Y_p(\omega) = \sum_{j=1}^p Z_j(\omega)^2
For ##q < p##, what does that tell you?
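
A small simulation of that construction (a sketch only, assuming NumPy is available) shows the pointwise comparison directly: both variables are built from the same normal draws, so the larger sum dominates at every sample point.

# Simulate the coupling above: Y_q and Y_p are built from the SAME draws
# Z_1,...,Z_p, so Y_p(w) >= Y_q(w) at every sample point w.  Requires NumPy.
import numpy as np

rng = np.random.default_rng(0)
p, q, n = 7, 3, 100_000
Z = rng.standard_normal((n, p))       # each row is one sample point w

Y_q = (Z[:, :q] ** 2).sum(axis=1)     # sum of the first q squares
Y_p = (Z ** 2).sum(axis=1)            # sum of all p squares

assert np.all(Y_p >= Y_q)             # pointwise domination
a = 5.0
print((Y_p > a).mean(), (Y_q > a).mean())   # empirical P(Y_p > a) >= P(Y_q > a)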
 
Based on your hint, here's what I'm thinking...

Well, then, if Z_{i} \sim N(0,1) (and i.i.d.), and X^2_{n} = \sum_{i=1}^{n} Z_{i}, then X^2_{p} = pZ^2_1 and X^2_{q} = qZ^2_1.

Thus for all a, P(X^2_{p} > a) = P(pZ^2_1 > a) = P(Z^2_1 > \frac {a}{p}) \geq P(Z^2_1 > \frac {a}{q}) = P(X^2_{q} > a), since p>q. If a>0, then there should be strict inequality (I think).

Whatdya think?
 
Mogarrr said:
Based on your hint, here's what I'm thinking...

Well, then, if Z_{i} \sim N(0,1) (and i.i.d.), and X^2_{n} = \sum_{i=1}^{n} Z_{i}, then X^2_{p} = pZ^2_1 and X^2_{q} = qZ^2_1.

Thus for all a, P(X^2_{p} > a) = P(pZ^2_1 > a) = P(Z^2_1 > \frac {a}{p}) \geq P(Z^2_1 > \frac {a}{q}) = P(X^2_{q} > a), since p>q. If a>0, then there should be strict inequality (I think).

Whatdya think?

I think you are writing down a bunch of material that makes no sense at all. Chi-squared = a sum of squares of iid N(0,1) random variables, not just a sum. I suggest you sit down, relax, and proceed slowly and carefully. Think about the actual definitions, and think about what I wrote in my previous response.

Alternatively, you can try to proceed directly: ##X## (with cdf ##F##) stochastically dominates ##Y## (with cdf ##G##) if ##G(x) \geq F(x)## for all ##x##, and ##>## holds for some ##x##. In other words, the stochastically larger random variable has the smaller cdf; that is, for each ##x## it lies below the other one, so ##F## describes a distribution that lies to the "right" of ##G##. That means the density functions ##f, g## must satisfy ##\int_0^x [g(t) - f(t)] \, dt \geq 0## for all ##x > 0##. You have formulas for ##f(t)## and ##g(t)##, so you can try to verify the cumulative ordering. That might be hard to do, which is why I suggested an alternative approach.
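
A rough numerical check of this cdf ordering (a sketch assuming SciPy is available, not a substitute for the integral argument): the chi-square cdf with more degrees of freedom should lie below the one with fewer, everywhere on (0, \infty).

# Check that G(x) >= F(x) for all x, where F is the cdf of chi^2_p and G the
# cdf of chi^2_q with p > q.  A numerical check only; requires NumPy and SciPy.
import numpy as np
from scipy.stats import chi2

p, q = 8, 5
x = np.linspace(0.01, 40.0, 400)
F = chi2.cdf(x, df=p)
G = chi2.cdf(x, df=q)
assert np.all(G >= F - 1e-12)
print("min of G - F:", np.min(G - F))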
 
Oops. Should have written X^2_{n} = \sum^{n}_{i=1} Z^2_{i}. I was thinking that since the Z_{i}'s are identically distributed, I could just sum n copies of Z^2_1 instead. Why am I wrong here, if they're identically distributed?
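
One quick way to see the difference (a simulation sketch, assuming NumPy is available): the two quantities have the same mean but very different variances, because \sum_{i} Z^2_{i} uses n independent draws while nZ^2_1 reuses a single draw n times.

# Same mean, different variance: Var(sum Z_i^2) = 2n but Var(n Z_1^2) = 2n^2,
# so the two are not the same random variable.  Requires NumPy.
import numpy as np

rng = np.random.default_rng(1)
n, reps = 5, 200_000
Z = rng.standard_normal((reps, n))

sum_of_squares = (Z ** 2).sum(axis=1)   # chi^2_n: n independent draws
n_times_first  = n * Z[:, 0] ** 2       # n * Z_1^2: one draw reused

print(sum_of_squares.mean(), n_times_first.mean())   # both approximately n = 5
print(sum_of_squares.var(),  n_times_first.var())    # roughly 2n = 10 vs 2n^2 = 50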

I will try the direct method, since X^2_{p} \sim \Gamma(p/2, 2).
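
A quick check of that identity (assuming SciPy is available): the chi-square cdf with p degrees of freedom should agree with the Gamma cdf with shape p/2 and scale 2.

# Check X^2_p ~ Gamma(p/2, scale 2) by comparing cdfs.  Requires NumPy and SciPy.
import numpy as np
from scipy.stats import chi2, gamma

p = 6
x = np.linspace(0.1, 30.0, 200)
assert np.allclose(chi2.cdf(x, df=p), gamma.cdf(x, a=p/2, scale=2))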

Update: Ok, I believe I've found a solid direct proof using the Chi square, Gamma and Poisson distribution. I'm excited, but busy now, so check tomorrow for my reply! And thanks for the help.
 
Ok, there is a relationship between the Gamma and Poisson distributions. Let X \sim Gamma(\alpha, \beta) with \alpha a positive integer; then
P(X \leq x) = P(Y \geq \alpha), where Y \sim Poisson(x/\beta).

Now let p > q with p/2, q/2 \in \mathbb{Z}^{+} (so the shape parameters p/2 and q/2 are integers), let a>0, and let Y \sim Poisson(a/2). Since X^2_{p} \sim Gamma(p/2, 2) and X^2_{q} \sim Gamma(q/2, 2), the relation above gives P(X^2_{p} > a) = 1 - P(Y \geq p/2), and likewise for q. Then
P(X^2_{p} > a) > P(X^2_{q} > a)
\Leftrightarrow P(Y \geq p/2) < P(Y \geq q/2)
\Leftrightarrow 0 < P(Y \geq q/2) - P(Y \geq p/2)
\Leftrightarrow 0 < \sum^{\infty}_{y=q/2} \frac {e^{-a/2}(a/2)^{y}}{y!} - \sum^{\infty}_{y=p/2} \frac{e^{-a/2}(a/2)^{y}}{y!}
= \sum^{p/2 - 1}_{y=q/2} \frac {e^{-a/2} (a/2)^{y}}{y!}.
Every term in the last sum is positive, so the inequality is strict for every a>0 (in particular, for some a).

If a \leq 0, then
P(X^2_{p} > a) = 1 = P(X^2_{q} > a),
hence \forall a \in \mathbb{R}, P(X^2_{p} > a) \geq P(X^2_{q} > a) whenever p>q.

I think this is right.
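
As a final numerical check of the Gamma-Poisson step and the resulting inequality (assuming SciPy is available, and taking p, q even so that p/2 and q/2 are integers):

# Check P(X^2_p > a) = P(Y <= p/2 - 1) with Y ~ Poisson(a/2), for even p,
# and the inequality P(X^2_p > a) > P(X^2_q > a) for p > q and a > 0.
# A numerical check only; requires NumPy and SciPy.
import numpy as np
from scipy.stats import chi2, poisson

p, q, a = 8, 4, 3.7
assert np.isclose(chi2.sf(a, df=p), poisson.cdf(p // 2 - 1, mu=a / 2))
assert np.isclose(chi2.sf(a, df=q), poisson.cdf(q // 2 - 1, mu=a / 2))
assert chi2.sf(a, df=p) > chi2.sf(a, df=q)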
 