Characteristic function of the binomial distribution

Homework Help Overview

The discussion revolves around finding the characteristic function, moments, and cumulants of the binomial distribution with parameters n and p. The original poster expresses confusion about the definitions and applications of the characteristic function, particularly regarding the variable t and how to utilize the probability mass function in calculations.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the definition of the characteristic function and its application to discrete random variables, contrasting it with continuous variables. There are attempts to clarify the relationship between the characteristic function and the moment-generating function. Questions arise about the interpretation of variables and the process of calculating moments and cumulants from the characteristic function.

Discussion Status

The discussion is ongoing, with participants providing insights and suggestions for further exploration. Some guidance has been offered regarding the use of the moment-generating function as an alternative to the characteristic function for calculating moments. There is recognition of the complexity involved in finding moments and cumulants, with participants sharing their attempts and seeking clarification on specific points.

Contextual Notes

Participants note the potential confusion stemming from different notations and definitions encountered in various resources. There is mention of imposed homework rules and the challenge of aligning different approaches to the same problem.

fluidistic
Gold Member

Homework Statement


Hey guys, I'm self studying some probability theory and I'm stuck with the basics.
I must find the characteristic function (also the moments and the cumulants) of the binomial "variable" with parameters n and p.
I checked out wikipedia's article http://en.wikipedia.org/wiki/Characteristic_function_(probability_theory); apparently the solution is [itex](1-p+pe^{it})^n[/itex], though I didn't really understand what t stands for (the number of successes?).

Homework Equations


Characteristic function: [itex]\int e^{ikX} P(x)dx[/itex].

The Attempt at a Solution


I'm guessing that I must simply apply the given formula. The k would be wikipedia's t variable. I'm stuck at finding P(x) and X. I've searched and found the binomial distribution article on wikipedia, and [itex]P(K=k)=\frac{n!p^k (1-p)^{n-k}}{k!(n-k)!}[/itex], which is called the probability mass function. I don't know how I could "plug" this into the given formula.
Thanks for any tip.
 
fluidistic said:

Homework Equations


Characteristic function: [itex]\int e^{ikX} P(x)dx[/itex].

The definition of the characteristic function of a random variable X is

[tex]E[e^{ikX}][/tex]

How you calculate this expectation depends on what kind of random variable you are dealing with.

The equation you listed is for a continuous random variable, i.e. one that has a probability density function.

For a discrete random variable, with probability mass function P(X = n), the characteristic function would be

[tex]E[e^{ikX}] = \sum_{n} e^{ikn} P(X = n)[/tex]
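As a quick numerical sanity check of the discrete sum above (a Python sketch; the function names are mine, not from the thread), one can compare the term-by-term expectation against the closed form quoted from Wikipedia:

```python
import cmath
from math import comb

def binom_pmf(n, p, k):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def char_fn_sum(n, p, t):
    """E[e^{itX}] computed term by term from the pmf."""
    return sum(cmath.exp(1j * t * k) * binom_pmf(n, p, k) for k in range(n + 1))

def char_fn_closed(n, p, t):
    """The closed form (1 - p + p e^{it})^n."""
    return (1 - p + p * cmath.exp(1j * t)) ** n

# The two expressions agree for any n, p, t
for t in (0.0, 0.5, 1.3, 3.0):
    assert abs(char_fn_sum(10, 0.3, t) - char_fn_closed(10, 0.3, t)) < 1e-12
```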
 
fluidistic said:

Homework Statement


Hey guys, I'm self studying some probability theory and I'm stuck with the basics.
I must find the characteristic function (also the moments and the cumulants) of the binomial "variable" with parameters n and p.
I checked out wikipedia's article http://en.wikipedia.org/wiki/Characteristic_function_(probability_theory); apparently the solution is [itex](1-p+pe^{it})^n[/itex], though I didn't really understand what t stands for (the number of successes?).

Homework Equations


Characteristic function: [itex]\int e^{ikX} P(x)dx[/itex].


The Attempt at a Solution


I'm guessing that I must simply apply the given formula. The k would be wikipedia's t variable. I'm stuck at finding P(x) and X. I've searched and found the binomial distribution article on wikipedia, and [itex]P(K=k)=\frac{n!p^k (1-p)^{n-k}}{k!(n-k)!}[/itex], which is called the probability mass function. I don't know how I could "plug" this into the given formula.
Thanks for any tip.

First tip: do more than consult Wikipedia. Get a good *book*.

Anyway, the characteristic function of the Binomial random variable [itex]X[/itex] is
[tex]\text{ch}_{X}(t) \equiv E\left(e^{itX}\right) = \sum_{k=0}^n {n \choose k} e^{i t k} p^k (1-p)^{n-k} = \sum_{k=0}^n {n \choose k} (pe^{it})^k (1-p)^{n-k} = (1-p + pe^{it})^n,[/tex]
using the binomial expansion
[tex](a+b)^n = \sum_{k=0}^n {n \choose k} a^k b^{n-k}.[/tex]
The "t" has nothing to do with the number of successes; it is just the argument of the characteristic function, a dummy parameter.

RGV
 
Ok thank you very much guys.
By the way, which book(s) would you recommend? I'm totally stuck at finding the moments (I know they are the coefficients of the Taylor expansion of the characteristic function, and the cumulants are the coefficients of the expansion of its logarithm). Should I just calculate the Taylor series of [itex](1-p + pe^{it})^n[/itex] with respect to t? Is this the way to go?
 
Some news about my tries:
The nth moment around the point "a" is defined as [itex]\mu _ n (a)=\sum (x-a)^n P(x)[/itex], where P is the probability mass function.
So if I take the binomial distribution with N trials, each with probability "p" of success, then [itex]\mu _ N (a)= \sum _{k=1}^{N} (x-a)^N {N \choose k} p ^k (1-p)^{N-k}[/itex].
I'm sure I've messed up some variables here. I'm kind of confused. Any help is appreciated.
 
fluidistic said:
Some news about my tries:
The nth moment around the point "a" is defined as [itex]\mu _ n (a)=\sum (x-a)^n P(x)[/itex], where P is the probability mass function.
So if I take the binomial distribution with N trials, each with probability "p" of success, then [itex]\mu _ N (a)= \sum _{k=1}^{N} (x-a)^N {N \choose k} p ^k (1-p)^{N-k}[/itex].
I'm sure I've messed up some variables here. I'm kind of confused. Any help is appreciated.

It is a bit easier to use the moment-generating function (mgf) instead of the characteristic function. The mgf of any discrete random variable X is
[tex]m_X(t) = \sum_{x} p(x) e^{tx} = E\, e^{tX}.[/tex] To get moments of X about 'a', you need to find [itex]E(X-a)^k[/itex], and the easiest way is to use
[tex]m_{X,a} (t) \equiv E\, e^{(X-a)t} = \sum_{x} p(x) e^{(x-a)t} = e^{-at} m_X(t).[/tex]
We have
[tex]E\,(X-a)^k = \left. \left( \frac{\partial}{\partial t}\right)^k m_{X,a}(t)\right|_{t=0}.[/tex]
For the binomial B(n,p) we have [itex]m_{X,a}(t) = e^{-at} (1-p + pe^t)^n.[/itex] For 'a' different from the mean the computation of the kth moment gets complicated (but do-able) for k > 2.
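The derivative formula above can be verified symbolically (a sketch assuming sympy is available; the symbol names are mine). Taking a = 0 gives the raw moments, and the first two recover the familiar mean and variance of B(n, p):

```python
import sympy as sp

t, p, n = sp.symbols('t p n', positive=True)

# mgf of Binomial(n, p): m_X(t) = (1 - p + p e^t)^n
m = (1 - p + p * sp.exp(t)) ** n

# k-th raw moment: E[X^k] = d^k/dt^k m_X(t) evaluated at t = 0
mean = sp.simplify(sp.diff(m, t).subs(t, 0))        # n p
second = sp.simplify(sp.diff(m, t, 2).subs(t, 0))   # n p + n (n - 1) p^2
variance = sp.simplify(second - mean**2)            # n p (1 - p)

assert sp.simplify(mean - n * p) == 0
assert sp.simplify(variance - n * p * (1 - p)) == 0
```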

RGV
 
Ray Vickson said:
It is a bit easier to use the moment-generating function (mgf) instead of the characteristic function. The mgf of any discrete random variable X is
[tex]m_X(t) = \sum_{x} p(x) e^{tx} = E\, e^{tX}.[/tex] To get moments of X about 'a', you need to find [itex]E(X-a)^k[/itex], and the easiest way is to use
[tex]m_{X,a} (t) \equiv E\, e^{(X-a)t} = \sum_{x} p(x) e^{(x-a)t} = e^{-at} m_X(t).[/tex]
We have
[tex]E\,(X-a)^k = \left. \left( \frac{\partial}{\partial t}\right)^k m_{X,a}(t)\right|_{t=0}.[/tex]
For the binomial B(n,p) we have [itex]m_{X,a}(t) = e^{-at} (1-p + pe^t)^n.[/itex] For 'a' different from the mean the computation of the kth moment gets complicated (but do-able) for k > 2.

RGV
I've had my second class in this course, the professor helped us a bit for this exercise.
He uses a different notation than in this thread; to get the first moment I think he chose your way (let me know).
From the characteristic function [itex]\phi _X (k)=(pe^{ik}+1-p)^N[/itex], one gets the moments via the formula [itex]<X^n>=\frac{1}{i^n} \frac{d^n}{dk^n} \phi _X (k) \big | _{k=0}[/itex]. (*)
Here in my case [itex]X=n_1[/itex], where [itex]P_N(n_1)=\frac{N!}{n_1!(N-n_1)!}p^{n_1} (1-p)^{N-n_1}[/itex]. So I think he expects us to use formula (*), where our "a" would be 0. Is there a link with the moment-generating function you suggested using?
I'm wondering if I can get a general expression for the nth derivative of the characteristic function. I'll work on this.
 
fluidistic said:
I've had my second class in this course, the professor helped us a bit for this exercise.
He uses a different notation than in this thread; to get the first moment I think he chose your way (let me know).
From the characteristic function [itex]\phi _X (k)=(pe^{ik}+1-p)^N[/itex], one gets the moments via the formula [itex]<X^n>=\frac{1}{i^n} \frac{d^n}{dk^n} \phi _X (k) \big | _{k=0}[/itex]. (*)
Here in my case [itex]X=n_1[/itex], where [itex]P_N(n_1)=\frac{N!}{n_1!(N-n_1)!}p^{n_1} (1-p)^{N-n_1}[/itex]. So I think he expects us to use formula (*), where our "a" would be 0. Is there a link with the moment-generating function you suggested using?
I'm wondering if I can get a general expression for the nth derivative of the characteristic function. I'll work on this.

That formula is the same one I use, but in a different variable (k instead of t). The reason I use the moment-generating function instead of the characteristic function is just to avoid the annoying factor 1/i^k in front. As to a link: just Google "moment generating function", which turns up numerous articles.

RGV
 
Some news about this. I've found the first 2 moments (I used the formula in my previous post) and I'd like to find the first 2 cumulants.
What I've done so far is [itex]N \ln (pe^{ik}+1-p)=\sum _{n=1}^{\infty } C_n \frac{(ik)^n}{n!}[/itex]. I'm stuck here. I know I want [itex]C_1[/itex] and [itex]C_2[/itex], but I don't know how to proceed at all. Any tip is welcome.
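For reference, matching powers of (ik) in the expansion above gives [itex]C_n = \frac{1}{i^n}\frac{d^n}{dk^n} N\ln(pe^{ik}+1-p)\big|_{k=0}[/itex], by the same logic as formula (*) for the moments. A symbolic check of the first two cumulants (a sketch assuming sympy; symbol names are mine):

```python
import sympy as sp

k, p, N = sp.symbols('k p N', positive=True)

# cumulant generating function: K(k) = N ln(1 - p + p e^{ik})
K = N * sp.log(1 - p + p * sp.exp(sp.I * k))

# n-th cumulant: C_n = (1/i^n) d^n/dk^n K(k) evaluated at k = 0
C1 = sp.simplify(sp.diff(K, k).subs(k, 0) / sp.I)
C2 = sp.simplify(sp.diff(K, k, 2).subs(k, 0) / sp.I**2)

assert sp.simplify(C1 - N * p) == 0            # first cumulant: the mean N p
assert sp.simplify(C2 - N * p * (1 - p)) == 0  # second cumulant: the variance
```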
 
