Characteristic function of binomial distribution.

Summary:
The discussion focuses on the characteristic function of a scaled binomial distribution, specifically X = (1/n)B(n,p). The computed characteristic function is φ_X(θ) = (1-p + pe^{iθ/n})^n, and its limit as n approaches infinity is e^{ipθ}. This result suggests that the distribution converges to a delta function centered at p, indicating that as the number of trials increases, the average outcome will approach p with probability 1. The interpretation emphasizes the law of large numbers, confirming that with an infinite number of trials, the average will converge to the expected probability. The conversation concludes with a clarification on the importance of discussing limits in mathematical contexts.
mnb96
Hello,
I considered a Binomial distribution B(n,p), and a discrete random variable X=\frac{1}{n}B(n,p). I tried to compute the characteristic function of X and got the following:

\phi_X(\theta)=E[e^{i\theta X}]=(1-p+pe^{i\theta/n})^n

I tried to compute the limit for n\to +\infty. Since pe^{i\theta/n}=p+ip\theta/n+o(1/n), the base becomes 1+ip\theta/n+o(1/n), and I got the following result:

\lim_{n\to\infty}\phi_X(\theta)=\lim_{n\to\infty}\left(1+\frac{ip\theta}{n}+o(1/n)\right)^{n}=e^{ip\theta}

How should I interpret this result?
That characteristic function would correspond to a delta-function distribution centered at -p. It doesn't make much sense to me.
 
Except for the sign (it should be a delta function at p) it makes sense. By the law of large numbers the average of a sequence of random binomials will converge to p.
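As a quick numerical sanity check (a sketch only; the values p = 0.3 and θ = 2.0 are arbitrary choices, not from the thread), one can verify that the characteristic function above approaches e^{ipθ} as n grows:

```python
import numpy as np

def phi_X(theta: float, n: int, p: float) -> complex:
    """Characteristic function of X = B(n, p) / n."""
    return (1 - p + p * np.exp(1j * theta / n)) ** n

p, theta = 0.3, 2.0             # arbitrary illustrative values
limit = np.exp(1j * p * theta)  # cf of a point mass (delta) at p

for n in (10, 100, 10_000):
    err = abs(phi_X(theta, n, p) - limit)
    print(f"n={n:>6}: |phi_X(theta) - e^(ip theta)| = {err:.2e}")
```

The error shrinks roughly at a 1/n rate, consistent with the base of the power differing from 1 + ipθ/n only by o(1/n) terms.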
 
Ok thanks!
Now it's clearer.

If I got it right, that basically means that if I take an infinite number of coins (heads = 1, tails = 0, with probabilities p and 1-p), toss them all at once, and finally sum the results and divide by the number of coins, I should obtain p with probability 1.
 
mnb96 said:
Ok thanks!
Now it's clearer.

If I got it right, that basically means that if I take an infinite number of coins (heads = 1, tails = 0, with probabilities p and 1-p), toss them all at once, and finally sum the results and divide by the number of coins, I should obtain p with probability 1.
You've got the point, although mathematical precision means you talk about a limit: the average of n tosses converges to p as n\to\infty, rather than literally tossing infinitely many coins at once.
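The limiting statement can be illustrated by simulation (again a sketch, not from the thread; the head probability p = 0.3, the seed, and the sample sizes are arbitrary): the empirical mean of n Bernoulli(p) tosses gets closer to p as n grows.

```python
import numpy as np

rng = np.random.default_rng(seed=42)  # fixed seed for reproducibility
p = 0.3  # arbitrary illustrative head probability

for n in (100, 10_000, 1_000_000):
    tosses = rng.binomial(1, p, size=n)  # n coin tosses: heads = 1, tails = 0
    print(f"n={n:>9}: sample mean = {tosses.mean():.4f}")
```

By the law of large numbers the sample mean has standard deviation sqrt(p(1-p)/n), so the printed means tighten around p as n increases.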
 
