Moments from the characteristic function of a geometric distribution

In summary: the characteristic function of the distribution ##p_{n}=(1-p)^{n}p## is obtained by summing a geometric series, and expanding the result in powers of ##k## gives the mean and variance; the shortcut of replacing ##e^{-ikn}## by ##(-ik)^{n}/n!## term by term inside the sum is not valid.
  • #1
binbagsss

Homework Statement



Hi,

I have the probability distribution: ##p_{n}=(1-p)^{n}p ,\; n=0,1,2,\dots##

and I am asked to find the characteristic function: ##p(k)= <e^{ikn}> ## and then use this to determine the mean and variance of the distribution.

Homework Equations


I have the general expression for the characteristic function: ##\sum\limits^{\infty}_{m=0} \frac{(-ik)^m}{m!} <x^{m}>## (*), from which one can equate coefficients of ##k## to find the moments.

The Attempt at a Solution



So I have ## <e^{-ikn}>=\sum\limits^{\infty}_{n=0} (1-p)^{n}p\, e^{-ikn} ##

I understand the solution given in my notes: after some rearranging, summing the series, and expanding with Taylor, this is equal to

## 1 + \frac{(1-p)}{p} \left(-ik + \tfrac{1}{2} (-ik)^{2} + O(k^3) \right) + \frac{(1-p)^{2}}{p^2} \left( (-ik)^{2} + O(k^3)\right) ##
and then equating coefficients according to (*).

However, my method was the following, and I'm unsure why it is wrong:

## <e^{-ikn}>=\sum\limits^{\infty}_{n=0} (1-p)^{n} p e^{-ikn} = \sum\limits^{\infty}_{n=0} (1-p)^{n}p \frac{(-ik)^n}{n!} ##

And so comparing to (*) ## \implies ##

## \sum\limits^{\infty}_{n=0} (1-p)^{n}p = \sum\limits^{\infty}_{n=0} <x^{n}> ##

Can anyone tell me what I've done wrong? Thank you, greatly appreciated.
 
  • #2
binbagsss said:

I have the probability distribution ##p_{n}=(1-p)^{n}p ,\; n=0,1,2,\dots## and am asked to find the characteristic function ##p(k)= <e^{ikn}>## and then use this to determine the mean and variance of the distribution. [...] Can anyone tell me what I've done wrong?

Simplify:
$$p (1-p)^n e^{-ikn} = p x^n, \; \text{where} \; x = (1-p)e^{-ik}$$
Now just sum the geometric series ##\sum x^n##.
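
For reference, a minimal worked sketch of that step, assuming ##|(1-p)e^{-ik}| < 1## so the series converges:
$$\sum_{n=0}^{\infty} p\,(1-p)^n e^{-ikn} = p \sum_{n=0}^{\infty} x^n = \frac{p}{1-x} = \frac{p}{1-(1-p)e^{-ik}},$$
and expanding ##e^{-ik} = 1 + (-ik) + \tfrac{1}{2}(-ik)^2 + O(k^3)## in this closed form reproduces the expansion quoted from the notes in post #1.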
 
  • #3
Ray Vickson said:
Simplify:
$$p (1-p)^n e^{-ikn} = p x^n, \; \text{where} \; x = (1-p)e^{-ik}$$
Now just sum the geometric series ##\sum x^n##.

Yeah, that's fine; in the end you get the 'solution given', which I said I understand. That's what was done before expanding out.

My problem was what's wrong with the method I posted after that...
 
  • #4
binbagsss said:
Yeah, that's fine; in the end you get the 'solution given', which I said I understand. That's what was done before expanding out.

My problem was what's wrong with the method I posted after that...

You seem to be saying that ##p(1-p)^n e^{-ikn}## is the same as ##p(1-p)^n (-ik)^n/n!##, but this is obviously false.
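
A quick numerical sketch of this point (my own check, with assumed example values ##p = 0.3## and ##k = 0.7##, not taken from the thread): the summed series matches the geometric-series closed form, while the term-by-term swap of ##e^{-ikn}## for ##(-ik)^n/n!## sums to a different function, ##p\,e^{-ik(1-p)}##.

```python
# Numerical check: sum_n p(1-p)^n e^{-ikn} vs. the closed form p / (1 - (1-p)e^{-ik}),
# and vs. the (incorrect) term-by-term replacement of e^{-ikn} by (-ik)^n / n!.
# p and k below are arbitrary example values.
from math import factorial

import numpy as np

p, k = 0.3, 0.7
n = np.arange(0, 2000)                                    # tail terms are negligible well before n = 2000

correct = np.sum(p * (1 - p) ** n * np.exp(-1j * k * n))  # the characteristic function itself
closed_form = p / (1 - (1 - p) * np.exp(-1j * k))         # geometric-series result

# The swapped sum is p * sum_m ((1-p)(-ik))^m / m! = p * exp(-ik(1-p)), a different function.
swapped = sum(p * ((1 - p) * (-1j * k)) ** m / factorial(m) for m in range(60))

print(abs(correct - closed_form))   # ~1e-16: series and closed form agree
print(abs(correct - swapped))       # clearly nonzero: the term-by-term swap changes the answer
```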
 

1. What is a characteristic function?

A characteristic function is a mathematical function that describes the probability distribution of a random variable. It is the Fourier transform of the probability density function, and it is used to uniquely identify a distribution.
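
In symbols, for a discrete variable such as the ##n## in this thread, the characteristic function is the expectation
$$\varphi(k) = \langle e^{ikn}\rangle = \sum_{n=0}^{\infty} p_n\, e^{ikn},$$
which for the geometric weights ##p_n = (1-p)^n p## is the sum evaluated in the thread (the notes there use the ##e^{-ikn}## convention, which only flips the sign of ##k##).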

2. What is a geometric distribution?

A geometric distribution is a discrete probability distribution arising from a series of independent Bernoulli trials. Depending on convention, it models either the number of trials needed to obtain the first success or, as in this thread, the number of failures before the first success (##n = 0, 1, 2, \dots##).

3. How do moments relate to the characteristic function of a geometric distribution?

Moments are a set of statistical measures that describe the shape and central tendency of a probability distribution. They can be calculated from the characteristic function of a geometric distribution to provide information about the distribution's properties, such as its mean and variance.
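
Concretely, expanding the exponential inside the expectation gives the moment series used in the thread (written here in the ##e^{+ikn}## convention; the notes use ##e^{-ikn}##, which replaces ##ik## by ##-ik##), and the low-order moments follow from derivatives at ##k = 0##:
$$\varphi(k) = \sum_{m=0}^{\infty} \frac{(ik)^m}{m!}\,\langle n^m\rangle, \qquad \langle n\rangle = -i\,\varphi'(0), \qquad \operatorname{Var}(n) = -\varphi''(0) - \langle n\rangle^2.$$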

4. What is the first moment of a characteristic function geometric distribution?

The first moment of the distribution is its mean, the expected value. For the parameterization used in this thread, ##p_n = (1-p)^n p## with ##n = 0, 1, 2, \dots## (failures before the first success), the mean is ##(1-p)/p##, where ##p## is the probability of success on each trial; for the trials-until-first-success convention it is ##1/p##.
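
As a quick check, differentiating the closed form in the ##e^{+ikn}## convention, ##\varphi(k) = p/\bigl(1-(1-p)e^{ik}\bigr)##, and evaluating at ##k = 0##:
$$\varphi'(k) = \frac{i\,p\,(1-p)\,e^{ik}}{\bigl(1-(1-p)e^{ik}\bigr)^{2}}, \qquad \langle n\rangle = -i\,\varphi'(0) = \frac{p(1-p)}{p^{2}} = \frac{1-p}{p};$$
the same route, one derivative further, gives the variance ##(1-p)/p^{2}##.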

5. How are moments from the characteristic function used in practical applications?

Moments from the characteristic function of a geometric distribution are used in various statistical analyses and modeling techniques. They can be used to estimate parameters of the distribution, make predictions, and assess the goodness of fit for a given data set.
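
As a small illustration of the parameter-estimation use, here is a minimal sketch (simulated data; all names are illustrative) of a method-of-moments estimate for this parameterization: since the mean is ##(1-p)/p##, one estimate is ##\hat p = 1/(1+\bar n)##.

```python
# Method-of-moments sketch for the failures-before-first-success geometric distribution:
# mean = (1-p)/p, so p_hat = 1 / (1 + sample_mean). Data are simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)
p_true = 0.25
samples = rng.geometric(p_true, size=10_000) - 1   # NumPy draws trials-until-success; shift to failures

p_hat = 1.0 / (1.0 + samples.mean())
print(f"true p = {p_true}, method-of-moments estimate = {p_hat:.3f}")
```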
