Convolution of iid non-central chi-square and normal distributions

AI Thread Summary
The discussion focuses on the challenge of convolving independent identically distributed (iid) non-central chi-square and normal distributions. The user has attempted to use characteristic functions and inverse Fourier transforms but is struggling with the latter. Suggestions include calculating the moment-generating function (MGF) by multiplying the MGFs of both distributions and using it to derive the probability density function (PDF). Alternative approaches such as term-by-term integration and Taylor series expansion are recommended for handling complicated analytic distributions. The importance of re-normalizing any approximated PDF so that it retains the proper properties of a PDF is also emphasized.
mmmly2002
Hi, I am doing research and I am stuck at this point. I need help convolving an iid non-central chi-square distribution with a normal distribution.
 
Hey mmmly2002 and welcome to the forums.

Can you elaborate on what part you are stuck on? Have you set up the convolution equation? What approaches have you tried? Straight convolution? MGF approach?
 
Thank you for your reply, I really appreciate your help. The approach I used was to take the characteristic function of both the non-central chi-square and the normal distribution, then multiply the two CFs. After that I take the inverse Fourier transform of the result, their product. But I could not solve the inverse Fourier transform of that product, and I got stuck at this point.

Thanks again for your help.
 
Are you calculating the PDF of the sum of the two variables?

If so, what I recommend is to get the MGF of the sum by multiplying the two MGFs (assuming the variables are independent) and then inverting the corresponding characteristic function to get the PDF.
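To make that route concrete, here is a minimal numerical sketch in Python. The parameters (k = 4 degrees of freedom, noncentrality λ = 2, normal mean μ = 0, σ = 1) and the grid sizes are illustrative assumptions, not values from the thread; the CF of the sum is the product of the two CFs, and the PDF is recovered by evaluating the Fourier inversion integral numerically rather than in closed form:

```python
import numpy as np
from scipy.integrate import trapezoid

# Illustrative parameters (assumptions, not from the thread)
k, lam = 4.0, 2.0      # noncentral chi-square: degrees of freedom, noncentrality
mu, sigma = 0.0, 1.0   # normal: mean, standard deviation

def cf_ncx2(t):
    # Characteristic function of the noncentral chi-square distribution
    return np.exp(1j * lam * t / (1.0 - 2j * t)) / (1.0 - 2j * t) ** (k / 2.0)

def cf_normal(t):
    # Characteristic function of the normal distribution
    return np.exp(1j * mu * t - 0.5 * sigma**2 * t**2)

def pdf_sum(x, t_max=40.0, n=8001):
    # Numerical Fourier inversion of the product CF:
    #   f(x) = (1 / 2*pi) * integral of exp(-i*t*x) * phi(t) dt
    # The normal factor makes phi decay like exp(-sigma^2 t^2 / 2),
    # so truncating at |t| = t_max is safe.
    t = np.linspace(-t_max, t_max, n)
    phi = cf_ncx2(t) * cf_normal(t)
    integrand = np.exp(-1j * np.outer(x, t)) * phi
    return np.real(trapezoid(integrand, t, axis=1)) / (2.0 * np.pi)

x = np.linspace(-5.0, 25.0, 301)
f = pdf_sum(x)   # PDF of (noncentral chi-square + normal) on the grid
```

The inversion never has to be solved symbolically: because the Gaussian factor damps the integrand, a plain trapezoid rule over a truncated t-grid already gives a PDF whose total mass and mean match the theoretical values (k + λ + μ) closely.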

Also, don't rule out term-by-term integration as opposed to doing everything analytically.

If the analytic distribution is extremely complicated and can't easily be expressed in terms of elementary functions, then what you can do is basically look at the terms of the expanding Taylor series centred about some point and then cut off the series once the error term (in terms of its order) is small enough.

If you want to do strict calculations, then get an approximation with the right error properties over the domain of the PDF and use that.

You should be able to pick enough terms to reduce the error, and you can program a computer to calculate the first n terms and store them in an array.
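For the noncentral chi-square specifically, a natural term-by-term scheme is its standard representation as a Poisson-weighted mixture of central chi-square densities. The sketch below (function and parameter names are my own) truncates the series once the remaining Poisson mass drops below a tolerance, which is exactly the "compute the first n terms" idea:

```python
import numpy as np
from scipy.stats import chi2, poisson

def ncx2_pdf_series(x, k, lam, tol=1e-12):
    """Noncentral chi-square PDF as a Poisson-weighted sum of central
    chi-square PDFs, truncated once the remaining Poisson mass < tol."""
    x = np.asarray(x, dtype=float)
    total = np.zeros_like(x)
    mass = 0.0   # cumulative Poisson weight accounted for so far
    j = 0
    while mass < 1.0 - tol:
        w = poisson.pmf(j, lam / 2.0)          # weight of the j-th term
        total += w * chi2.pdf(x, k + 2 * j)    # central chi-square, df = k + 2j
        mass += w
        j += 1
    return total
```

Because each term is a plain central chi-square density, the same truncated sum can then be convolved with the normal term by term, with the neglected Poisson mass giving a direct bound on the truncation error.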

But if you use an approximated PDF, make sure you "re-normalize" it so that it has the proper properties of a PDF.
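A minimal sketch of that re-normalization step, using a deliberately mis-scaled Gaussian as a stand-in for a truncated-series approximation (the 0.97 factor and the grid are illustrative):

```python
import numpy as np
from scipy.integrate import trapezoid

# Hypothetical approximate PDF on a grid: a standard normal density
# scaled by 0.97 to mimic probability mass lost by truncating a series
x = np.linspace(-10.0, 10.0, 2001)
f_approx = 0.97 * np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)

z = trapezoid(f_approx, x)    # total mass of the approximation (here < 1)
f_renorm = f_approx / z       # rescaled so it integrates to 1 again
```

Dividing by the numerically computed total mass restores the defining property of a PDF without changing the shape of the approximation.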
 