Quoted from Wiki
Proof of the central limit theorem
For a theorem of such fundamental importance to statistics and applied probability, the central limit theorem has a remarkably simple proof using characteristic functions. It is similar to the proof of the (weak) law of large numbers. For any random variable Y with zero mean and unit variance (var(Y) = 1), the characteristic function of Y is, by Taylor's theorem,
\varphi_Y(t) = 1 - {t^2 \over 2} + o(t^2), \quad t \rightarrow 0
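To spell out where this expansion comes from: the characteristic function satisfies \varphi_Y(0) = 1, \varphi_Y'(0) = i\,E[Y] = 0 (zero mean), and \varphi_Y''(0) = i^2 E[Y^2] = -1 (unit variance), so the second-order Taylor polynomial at t = 0 is

\varphi_Y(t) = \varphi_Y(0) + \varphi_Y'(0)\,t + \tfrac{1}{2}\varphi_Y''(0)\,t^2 + o(t^2) = 1 - \frac{t^2}{2} + o(t^2).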
where o(t^2) is "little-o notation" for some function of t that goes to zero more rapidly than t^2. Letting Y_i be (X_i − μ)/σ, the standardized value of X_i, [it is easy to see] that the standardized mean of the observations X_1, X_2, ..., X_n is
Z_n = \frac{n\overline{X}_n-n\mu}{\sigma\sqrt{n}} = \sum_{i=1}^n {Y_i \over \sqrt{n}}.
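The "[it is easy to see]" step is just algebra: with \overline{X}_n = \frac{1}{n}\sum_{i=1}^n X_i,

Z_n = \frac{\sqrt{n}\left(\overline{X}_n-\mu\right)}{\sigma} = \frac{n\overline{X}_n-n\mu}{\sigma\sqrt{n}} = \sum_{i=1}^n \frac{X_i-\mu}{\sigma\sqrt{n}} = \sum_{i=1}^n {Y_i \over \sqrt{n}}.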
By [simple properties] of characteristic functions, the characteristic function of Z_n is
\left[\varphi_Y\left({t \over \sqrt{n}}\right)\right]^n = \left[ 1 - {t^2 \over 2n} + o\left({t^2 \over n}\right) \right]^n \, \rightarrow \, e^{-t^2/2}, \quad n \rightarrow \infty.
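The "[simple properties]" used here are two standard facts: the characteristic function of a sum of independent variables is the product of their characteristic functions, and scaling satisfies \varphi_{aY}(t) = \varphi_Y(at). Since the Y_i are i.i.d.,

\varphi_{Z_n}(t) = E\left[e^{it\sum_{i=1}^n Y_i/\sqrt{n}}\right] = \prod_{i=1}^n E\left[e^{itY_i/\sqrt{n}}\right] = \left[\varphi_Y\left({t \over \sqrt{n}}\right)\right]^n.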
But this limit is just the characteristic function of a standard normal distribution N(0,1), and the central limit theorem follows from the Lévy continuity theorem, which states that convergence of characteristic functions implies convergence in distribution.
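Both limits can be checked numerically. Here is a minimal sketch in Python (standard library only): `phi_power` evaluates the n-th power of the second-order Taylor approximation, dropping the o(t^2/n) remainder, and the Monte Carlo part draws standardized means Z_n for Uniform(0,1) samples, an arbitrary choice of distribution with finite variance.

```python
import math
import random

def phi_power(t, n):
    """[phi_Y(t/sqrt(n))]^n using only the second-order Taylor term,
    i.e. (1 - t^2/(2n))^n; should approach exp(-t^2/2) as n grows."""
    return (1.0 - t * t / (2.0 * n)) ** n

def standardized_mean_sample(n, trials, seed=0):
    """Draw `trials` values of Z_n = sqrt(n)*(Xbar - mu)/sigma
    for X_i ~ Uniform(0, 1)."""
    rng = random.Random(seed)
    mu, sigma = 0.5, math.sqrt(1.0 / 12.0)  # moments of Uniform(0, 1)
    zs = []
    for _ in range(trials):
        xbar = sum(rng.random() for _ in range(n)) / n
        zs.append(math.sqrt(n) * (xbar - mu) / sigma)
    return zs
```

For large n, `phi_power(t, n)` is close to `math.exp(-t * t / 2)`, and the empirical mean and variance of the sampled Z_n values sit near 0 and 1, as the N(0,1) limit predicts.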
--------------------------------------------------------------------------------
http://en.wikipedia.org/wiki/Central_limit_theorem
Look at the claims in square brackets, and see whether they really are trivial.