Characteristic function of joint distribution


Discussion Overview

The discussion revolves around the joint characteristic function of two dependent random variables: one normally distributed with mean 0 and variance n, the other chi-squared with n degrees of freedom. Participants explore how to derive the joint characteristic function and clarify the relationships between the variables.

Discussion Character

  • Exploratory, Technical explanation, Debate/contested

Main Points Raised

  • Some participants seek clarification on what constitutes a "joint characteristic function" and how to compute it for the specified distributions.
  • One participant explains that the joint characteristic function can be expressed as an integral involving the joint distribution's density function, noting that for independent variables, the joint characteristic function is the product of the marginal characteristic functions.
  • Another participant suggests that if the variables are not independent, the joint density can be expressed in terms of conditional densities, but raises concerns about the lack of information regarding correlation.
  • Some participants discuss the implications of the chi-squared distribution being central or non-central and how this affects the integration process.
  • A later reply clarifies that the original question involves independent standard normal variables, with X being their sum and Y being their square-sum, which alters the approach to finding the joint characteristic function.
  • Participants express uncertainty about the implications of the variance of the normal distribution equating to the degrees of freedom of the chi-squared distribution and whether this indicates a specific relationship between the two distributions.

Areas of Agreement / Disagreement

There is no consensus on the correct approach to deriving the joint characteristic function due to differing interpretations of the original problem and the relationships between the distributions. Multiple competing views and uncertainties remain throughout the discussion.

Contextual Notes

Participants note limitations in the information provided, particularly regarding correlation and the nature of the chi-squared distribution. The discussion also reflects on the assumptions made about the independence of the variables and the implications of those assumptions on the calculations.

shoplifter
What exactly is a "joint characteristic function"? I want the characteristic function of the joint distribution of two (non-independent) probability distributions. I'll state the problem below for clarity. So my two distributions are the normal distribution with mean 0 and variance n, and the chi squared distribution with n degrees of freedom. I know their individual characteristic functions, but how do I proceed?
 
shoplifter said:
What exactly is a "joint characteristic function"? I want the characteristic function of the joint distribution of two (non-independent) probability distributions. I'll state the problem below for clarity. So my two distributions are the normal distribution with mean 0 and variance n, and the chi squared distribution with n degrees of freedom. I know their individual characteristic functions, but how do I proceed?


The characteristic function is the Fourier transform of the PDF (it can also be defined when no PDF exists). The distribution of a sum of independent random variables can be obtained through the product of their respective CFs.

You could just multiply the relevant CFs, but I'm not sure this is correct in your case. For two non-independent Gaussian distributions, the product formula includes the correlation coefficient. Since the \chi^{2} approaches the normal for large n, I would recommend adding the smaller sample to the larger if possible and treating it as coming from a single univariate Gaussian population. If you can't do this, I would question why you would want to evaluate a bivariate distribution where the samples are apparently not compatible.
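As a quick sanity check of the sum/product relationship (a sketch, not from the thread; the distributions, evaluation point, sample size and seed are arbitrary choices), a Monte Carlo estimate of the c.f. of a sum of independent variables should match the product of the individual c.f. estimates:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
x = rng.standard_normal(n)        # N(0, 1) sample
y = rng.chisquare(3, size=n)      # chi-square sample, 3 degrees of freedom
t = 0.5

# c.f. of the sum X + Y versus the product of the individual c.f.s
cf_sum = np.mean(np.exp(1j * t * (x + y)))
cf_prod = np.mean(np.exp(1j * t * x)) * np.mean(np.exp(1j * t * y))
print(abs(cf_sum - cf_prod))  # small for independent summands
```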
 
The joint characteristic function is

\phi_{X,Y}(s,t) = \iint e^{i(sx + ty)} \,dF(x,y) = \iint e^{i(sx + ty)} f(x,y) \, dx\, dy

(the latter only if the joint distribution is continuous so that there is a density). If the variables are independent, the joint c.f. is the product of the marginal c.f.s; that isn't your case.
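The independence caveat can be illustrated numerically (a sketch, not from the thread; the evaluation point, sample size and seed are arbitrary): for an independent pair the joint c.f. factors into the marginals, while for the dependent pair (A, A^2) it does not:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
s, t = 0.7, 0.4

# Independent pair: joint c.f. factors into the marginal c.f.s.
x = rng.standard_normal(n)
y = rng.standard_normal(n)
joint_indep = np.mean(np.exp(1j * (s * x + t * y)))
prod_indep = np.mean(np.exp(1j * s * x)) * np.mean(np.exp(1j * t * y))

# Dependent pair (y = x^2): the factorization fails.
y_dep = x**2
joint_dep = np.mean(np.exp(1j * (s * x + t * y_dep)))
prod_dep = np.mean(np.exp(1j * s * x)) * np.mean(np.exp(1j * t * y_dep))

print(abs(joint_indep - prod_indep))  # near zero
print(abs(joint_dep - prod_dep))      # clearly nonzero
```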

You state that X is normal with mean 0 and variance n, and Y is chi-square with n degrees of freedom. If that means this:
The distribution of X given Y is normal, \mu = 0, \sigma^2 = n , you can do this.

As noted, the joint c.f. is

\phi_{X,Y}(s,t) = \iint e^{i(sx + ty)} \,dF(x,y) = \iint e^{i(sx + ty)} f(x,y) \, dx\, dy

In your case the joint density isn't the product of the marginals, but you can write

f(x,y) = f(x|Y=y) \cdot g(y)

where f(x|Y=y) is a normal density with mean 0 and variance n, and g(y) is the density of the chi-square distribution with n degrees of freedom. Then

\phi_{X,Y}(s,t) = \iint e^{i(sx+ty)} \,f(x,y)\, dx\, dy = \int\left(\int e^{isx} f(x|Y=y) \,dx\right) e^{ity} g(y) \, dy

The expression in the inner integral is simply the c.f. for the normal distribution (mean = 0, variance = n), so you can evaluate that immediately. What's left is to take the integral of that with respect to the chi-square density.
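The inner-integral step can be checked numerically (a sketch under assumed values n = 4, s = 0.5; the grid is an arbitrary choice): integrating e^{isx} against the N(0, n) density should reproduce the normal c.f. e^{-ns^2/2}:

```python
import numpy as np

n, s = 4, 0.5
x = np.linspace(-40.0, 40.0, 400_001)
dx = x[1] - x[0]

# N(0, n) density on the grid
density = np.exp(-x**2 / (2 * n)) / np.sqrt(2 * np.pi * n)

# Inner integral of the decomposition: should equal the c.f. of N(0, n)
inner = np.sum(np.exp(1j * s * x) * density) * dx
exact = np.exp(-n * s**2 / 2)
print(inner.real, exact)
```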
 
statdad said:
\phi_{X,Y}(s,t) = \iint e^{i(sx+ty)} \,f(x,y)\, dx\, dy = \int\left(\int e^{isx} f(x|Y=y) \,dx\right) e^{ity} g(y) \, dy

The expression in the inner integral is simply the c.f. for the normal distribution (mean = 0, variance = n), so you can evaluate that immediately. What's left is to take the integral of that with respect to the chi-square density.

The reason I didn't suggest something like this was that we don't know the correlation. The correlation coefficient \rho is generally only valid for normally distributed data. We don't even know whether the chi-square is central or non-central. How would you handle this?
 
Basically I gambled. I took the OP's post as giving all relevant information - that one distribution was normal, \mu = 0, \sigma^2 = n, the other \chi^2 with n degrees of freedom.
In a sense, since we don't have two normal distributions, the n is like the correlation coefficient.
Also, if the chi-square distribution is non-central, the only thing that changes is that the second of the two integrations becomes more difficult.

I must admit one more thing in addition to my gamble: I guessed (I don't think gambling and guessing are quite the same thing here). It is rather common for questions in a similar vein to be posed when both distributions are discrete, with the only link of dependence being the item specified. I guessed the same would hold here.

So there you have it. If the details in the first post were complete, I'm okay. If they weren't, what's missing will be supplied.

If you have a different take I'd be interested. I hope I haven't overstepped bounds by doing this.
 
statdad said:
In a sense, since we don't have two normal distributions, the n is like the correlation coefficient.
Also, if the chi-square distribution is non-central, the only thing that changes is that the second of the two integrations becomes more difficult.

So there you have it. If the details in the first post were complete, I'm okay. If they weren't, what's missing will be supplied.

If you have a different take I'd be interested. I hope I haven't overstepped bounds by doing this.

No problem. I was just thinking of this as a problem in applied statistics. The chi-square would apply to a small sample. It just seemed odd to try to define a bivariate distribution in these terms, especially when the variables are termed "non-independent". An explicit expression for the joint characteristic function of two non-independent Gaussian PDFs is:

\phi(t_{1},t_{2})=\exp\left[i(t_{1}\mu_{1}+t_{2}\mu_{2})-\tfrac{1}{2}\left(\sigma_{1}^{2}t_{1}^{2}+2\rho \sigma_{1} \sigma_{2} t_{1}t_{2}+\sigma_{2}^{2} t_{2}^{2}\right)\right]

I also didn't know what the OP meant by a variance of n. Does that convey something about the relationship with the chi square?
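That bivariate-Gaussian formula can be verified by simulation (a sketch, not from the thread; the means, variances, correlation and evaluation point are made-up values):

```python
import numpy as np

rng = np.random.default_rng(2)
mu1, mu2 = 0.5, -1.0
sig1, sig2, rho = 1.0, 2.0, 0.6
cov = [[sig1**2, rho * sig1 * sig2],
       [rho * sig1 * sig2, sig2**2]]
z = rng.multivariate_normal([mu1, mu2], cov, size=300_000)

t1, t2 = 0.4, 0.3
# Monte Carlo estimate of the joint c.f. versus the closed form above
mc = np.mean(np.exp(1j * (t1 * z[:, 0] + t2 * z[:, 1])))
exact = np.exp(1j * (t1 * mu1 + t2 * mu2)
               - 0.5 * (sig1**2 * t1**2
                        + 2 * rho * sig1 * sig2 * t1 * t2
                        + sig2**2 * t2**2))
print(abs(mc - exact))  # small
```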
 
The variance of the normal distribution equals the number of degrees of freedom of the chi square distribution.
 
statdad said:
The variance of the normal distribution equals the number of degrees of freedom of the chi square distribution.

OK I understand that, but I was under the impression the OP was talking about two distinct distributions. Are we to assume that k=n in this case?

EDIT: Whoops. I see it. The OP defined k=n.
 
thank you for your detailed responses. However, the original question I am trying to solve does not say "X is normal given that Y is chi-squared". It says something like, okay, here are n identically distributed independent standard normal variables, and let X be their sum, and Y be their square-sum (which is why I stated that X is N(0, n) and Y is chi-squared with n degrees of freedom). Then find the characteristic function of the joint distribution of X and Y. That changes things quite a bit, right?
 
shoplifter said:
thank you for your detailed responses. However, the original question I am trying to solve does not say "X is normal given that Y is chi-squared". It says something like, okay, here are n identically distributed independent standard normal variables, and let X be their sum, and Y be their square-sum (which is why I stated that X is N(0, n) and Y is chi-squared with n degrees of freedom). Then find the characteristic function of the joint distribution of X and Y. That changes things quite a bit, right?

That really isn't what you wrote in your original question. Instead of "something like..." can you post the exact wording?
 
yes, I apologize. Suppose A_1, ..., A_n are iid standard normal variables, and say X = A_1 + ... + A_n, and Y = A_1^2 + ... + A_n^2. Then what's the char. func. of the joint probability distribution of X and Y?

Apologies again for not being clear before.
 
shoplifter said:
yes, I apologize. Suppose A_1, ..., A_n are iid standard normal variables, and say X = A_1 + ... + A_n, and Y = A_1^2 + ... + A_n^2. Then what's the char. func. of the joint probability distribution of X and Y?

Apologies again for not being clear before.

Here it's easier to use the definition

\phi_{X,Y}(s,t) = \mathbb{E}[e^{isX+itY}]

which reduces to

\left(\mathbb{E}[e^{isA_1+itA_1^2}]\right)^n

by independence. The latter expectation expressed as an integral can be solved by completing the square.
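Carrying this suggestion through (my own completion of the square, not stated in the thread): the Gaussian integral gives \mathbb{E}[e^{isA_1+itA_1^2}] = (1-2it)^{-1/2}\exp\!\left(-\tfrac{s^2}{2(1-2it)}\right), hence \phi_{X,Y}(s,t) = (1-2it)^{-n/2}\exp\!\left(-\tfrac{ns^2}{2(1-2it)}\right). A Monte Carlo sketch of this candidate closed form (arbitrary n, s, t and seed):

```python
import numpy as np

def phi_closed(s, t, n):
    # Candidate closed form from completing the square:
    # E[exp(isA + itA^2)] = (1 - 2it)^(-1/2) * exp(-s^2 / (2 * (1 - 2it)))
    a = 1 - 2j * t
    return a ** (-n / 2) * np.exp(-n * s**2 / (2 * a))

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((200_000, n))
X, Y = A.sum(axis=1), (A**2).sum(axis=1)  # sum and square-sum

s, t = 0.3, 0.2
mc = np.mean(np.exp(1j * (s * X + t * Y)))
print(abs(mc - phi_closed(s, t, n)))  # small if the closed form is right
```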
 
so I get the characteristic function to be \left(\mathbb{E}\,e^{-is^2/4t + i(A_1\sqrt{t} + s/(2\sqrt{t}))^2}\right)^n. I'm guessing we can take the first (constant) factor out of the expectation, as \mathbb{E}(c) = c. But I don't see an immediate way to calculate the second term, because the integral is too unwieldy. Any help would be much appreciated.

As a second small question, what exactly does this quantity measure?
 
For the answer (which is \mathbb{E}[e^{isX+itY}]), I am getting the following quantity raised to power n:

\frac{1}{\sqrt{2\pi}}\int_{-\infty}^\infty e^{isx + itx^2}e^{-x^2/2}\,dx

Is this correct? Thanks.
 
Sorry, the previous post doesn't seem to display equations correctly: I meant, I found the value of \mathbb{E}[e^{isX+itY}] to be

\frac{1}{\sqrt{2\pi}}\int_{-\infty}^\infty e^{isx + itx^2}e^{-x^2/2}\,dx.
 
shoplifter said:
Sorry, the previous post doesn't seem to display equations correctly: I meant, I found the value of \mathbb{E}[e^{isX+itY}] to be

\frac{1}{\sqrt{2\pi}}\int_{-\infty}^\infty e^{isx + itx^2}e^{-x^2/2}\,dx.

Instead of X, Y I think you need to use A_1, A_1^2.

So \left(\int e^{isa + ita^2} f(a)\, da\right)^n

following bpet's suggestion.
 
