Distribution of Difference of 2 2nd Degree Non-Central Chi Squared RVs

Discussion Overview

The discussion centers around the distribution of the difference between two second degree non-central chi-squared random variables, particularly in the context of hypothesis testing involving multivariate normal distributions. Participants explore various mathematical formulations and implications of this distribution, including its representation as a quadratic form and its relationship to likelihood ratio tests.

Discussion Character

  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant introduces the problem as a quadratic form and questions potential simplifications when considering the difference of two non-central chi-squared distributions.
  • Another participant corrects the initial formulation, clarifying that the random vector is real rather than complex, and expresses interest in the distribution of the difference of two chi-squared variates with one degree of freedom.
  • Several participants note the ambiguity in the original post regarding the independence of the variables involved and the implications for their joint distribution.
  • One participant suggests that numerical approximation methods, such as random number generation, could be used to estimate the cumulative distribution function (CDF) of the difference.
  • Another participant discusses the generalized likelihood ratio test (LRT) and its formulation, questioning the relationship between the quadratic form and the distribution of chi-squared variables.
  • There is a discussion about the complexity introduced when maximum likelihood estimates depend on the data, raising questions about the nature of the likelihood function.

Areas of Agreement / Disagreement

Participants express varying degrees of understanding and interpretation of the problem, with some clarifying points while others raise questions about the assumptions and formulations presented. No consensus is reached regarding the implications of the generalized likelihood ratio test or the independence of the random variables.

Contextual Notes

Participants highlight limitations in the clarity of the original problem statement, particularly regarding the definitions and relationships between the random variables involved. The discussion remains open-ended with unresolved mathematical steps and assumptions.

rhz:
Distribution of the difference of two second-degree non-central chi-squared random variables.

This problem can be cast as an indefinite quadratic form for which there are a number of general numerical techniques to determine the CDF. Alternatively, it may be written as a linear combination of independent chi squared random variables.

I'm wondering if there are any simplifications when the linear combination takes the form
of a simple difference of two second degree non-central chi squared distributions.

Context: Consider a two dimensional complex normal random vector x = [x1 x2]' ~ CN(u,R).

I am interested in the distribution of:

|x1|^2 - |x2|^2


Thanks!
 
Correction: sorry, I had an error in my problem formulation. The random vector x is real in my case, not complex. So I'm interested in the distribution of the difference of two chi-squared variates, each with one degree of freedom.
 


Anyone?
 


Ummm...5.

It's a difficult question; wait for an expert to get to it. Don't bump your post for 24 hours; give people a chance to get around to answering it.
 


The original post has a misleading title vis-a-vis the problem that is stated in the last few lines. It also isn't clear what is meant by a "CN" distribution. For example, a Wikipedia article points out that in signal processing, "complex normal" often means "circular complex normal", i.e. that the components of the vector are independent. The subsequent correction, that the variables X1 and X2 are real, hardly clears anything up about their joint distribution. Are they independent or not?
 


Stephen Tashi said:
The original post has a misleading title vis-a-vis the problem that is stated in the last few lines. It also isn't clear what is meant by a "CN" distribution. For example, a Wikipedia article points out that in signal processing, "complex normal" often means "circular complex normal", i.e. that the components of the vector are independent. The subsequent correction, that the variables X1 and X2 are real, hardly clears anything up about their joint distribution. Are they independent or not?

Hi,

I'm really just interested in the distribution of the difference of two non-central chi-square random variables each with one degree of freedom.

Thanks.
 


rhz said:
Hi,

I'm really just interested in the distribution of the difference of two non-central chi-square random variables each with one degree of freedom.

Thanks.

Simple random number generation can give you a numerical approximation of the CDF. It may seem crude at first sight but has the advantage of being flexible if later you want to change the distributional or dependence assumptions.

There are lots of other methods that could be used (even under the very restrictive assumption that X1 and X2 are the squares of two independent normal variables with known parameters). To narrow it down, could you say a bit more about the purpose of the exercise?
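The random-number-generation suggestion above can be sketched as follows. This is a minimal version under the restrictive assumption just mentioned (X1 and X2 are squares of independent unit-variance normals); the means mu1 and mu2 are illustrative placeholders, not values from the thread:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000                       # number of Monte Carlo draws

# A noncentral chi-square(1) variate is the square of a unit-variance
# normal with nonzero mean; its noncentrality is the squared mean.
mu1, mu2 = 1.0, 0.5               # illustrative means (assumptions)
x1 = rng.normal(mu1, 1.0, n) ** 2
x2 = rng.normal(mu2, 1.0, n) ** 2
d = x1 - x2                       # the difference of interest

# Empirical CDF of the difference at a few points
for t in (-2.0, 0.0, 2.0):
    print(f"P(D <= {t}): {np.mean(d <= t):.3f}")
```

Changing the distributional or dependence assumptions then only means changing how x1 and x2 are drawn, which is the flexibility referred to above.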
 


There are lots of other methods that could be used (even under the very restrictive assumption that X1 and X2 are the squares of two independent normal variables with known parameters). To narrow it down, could you say a bit more about the purpose of the exercise?


Hi,

Thanks for your interest. Here is the full context. Consider a hypothesis testing problem. I have a vector of data x which is drawn from one of two families of normal distributions:


H_i: x ~ N(m_i, I)

where N(m,R) is a multivariate normal distribution with mean m and covariance R, and I is the identity matrix.

Under each hypothesis, the mean is known to within a multiplicative factor:

m_i = a*mm_i, where a is an unknown deterministic scale and mm_i is a known vector.

The generalized likelihood ratio test for this problem takes the following form:

L(x) = x'*A*x

A = m_0*m_0'/(m_0'*m_0) - m_1*m_1'/(m_1'*m_1)

which is the difference of two (dependent) non-central chi-square random variables, each with one degree of freedom. It is straightforward to transform this into the difference of two independent non-central chi-square random variables, each with one degree of freedom.

I hope that I've explained this well. If any additional info would be useful, just let me know.

Thanks again.
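The quadratic-form claim above can be checked numerically. The sketch below is illustrative only: the dimension and the direction vectors mm0, mm1 are arbitrary assumptions, and the covariance is the identity as stated. It verifies that twice the plug-in log-likelihood ratio equals x'*A*x, and that the nonzero eigenvalues of A come in a +/- pair, which is what makes x'*A*x a scaled difference of two chi-square(1) variates after rotating to the eigenbasis:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4                              # illustrative dimension (assumption)
mm0 = rng.normal(size=n)           # hypothetical known mean directions
mm1 = rng.normal(size=n)

# GLRT statistic matrix, with the divisions parenthesized:
# A = mm0 mm0' / (mm0' mm0) - mm1 mm1' / (mm1' mm1)
A = np.outer(mm0, mm0) / (mm0 @ mm0) - np.outer(mm1, mm1) / (mm1 @ mm1)

def max_loglik(mm, x):
    # With unit covariance, the ML estimate of the scale a under
    # H_i is a = (mm' x) / (mm' mm); plug it back into the log-likelihood
    # (dropping constants that cancel in the ratio).
    a = (mm @ x) / (mm @ mm)
    r = x - a * mm
    return -0.5 * (r @ r)

x = rng.normal(size=n)
lhs = 2 * (max_loglik(mm0, x) - max_loglik(mm1, x))
rhs = x @ A @ x
print(np.isclose(lhs, rhs))        # True: 2*log-GLR equals x'Ax

# A is a difference of two rank-1 projections, so its two nonzero
# eigenvalues are +lambda and -lambda for some lambda > 0.
w = np.sort(np.linalg.eigvalsh(A))
print(np.isclose(w[0], -w[-1]))    # True: eigenvalues pair up as +/-
```

Rotating x into the eigenbasis of A then gives x'*A*x = lambda*(z1^2 - z2^2) with z1, z2 independent normals, i.e. the transformation to independent chi-squares mentioned above.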
 


rhz said:
...
L(x) = x'*A*x

A = m_0*m_0'/(m_0'*m_0) - m_1*m_1'/(m_1'*m_1)

which is the difference of two (dependent) non-central chi-square random variables, each with one degree of freedom.
...

Not sure I fully understand this step - I would've thought that (1/2)*x'*A*x is the difference of two central chi-squares with n degrees of freedom (if we express the log of the likelihood ratio in terms of the residuals e=x-a*mm ) ?
 


bpet said:
Not sure I fully understand this step - I would've thought that (1/2)*x'*A*x is the difference of two central chi-squares with n degrees of freedom (if we express the log of the likelihood ratio in terms of the residuals e=x-a*mm ) ?

The _generalized_ LRT re-inserts maximum likelihood estimates of the unknown parameters back into the likelihood function under each hypothesis, and then takes the ratio of these two functions. Even if all parameters were known, the residuals would be zero-mean under only one of the two hypotheses.

Thanks
 


rhz said:
The _generalized_ LRT re-inserts maximum likelihood estimates of the unknown parameters back into the likelihood function under each hypothesis, and then takes the ratio of these two functions. Even if all parameters were known, the residuals would be zero-mean under only one of the two hypotheses.

Thanks

Agreed. So if the terms m_0 and m_1 themselves depend on x (being ML estimates), does that make L more complicated than just a quadratic function of x?
 
