# A full proof of Berry-Esseen inequality in the Central Limit Theorem

**[Aim of this article: I will provide a full proof of the Berry-Esseen theorem, which I worked through successfully after investing two hours. This theorem gives the rate of convergence to the normal distribution in the Central Limit Theorem.]**

**[Note: I want to say from the very beginning that this post is a bit technical, but I'm hoping it will be very helpful to whoever needs it. This proof is based on the book by W. Feller, "An Introduction to Probability Theory and Its Applications", and covers only independent, identically distributed summands. Thus, I won't prove the non-identical case, because this post is already quite long; please find its proof in Feller's book.]**

The central limit theorem concerns the situation where the limiting distribution of the normalized sum is normal as the sample size goes to infinity. But the question you may raise is, "*What is the rate of convergence of the distribution of the normalized sum to the standard normal distribution?*" Let's answer this question by considering the case where the samples are identically distributed. To be more precise, let's state it like this:

*Let $X_1, X_2, \ldots$ be independent variables with an identical (or common) distribution such that $E[X_i] = 0$, $E[X_i^2] = \sigma^2 > 0$ and $E[|X_i|^3] = \rho < \infty$, and let $F_n$ stand for the distribution of the normalized sum $(X_1 + \cdots + X_n)/(\sigma\sqrt{n})$. Then for all $x$ and $n$, the supremum of the distance between $F_n$ and $\Phi$, i.e. the standard normal distribution, is bounded by $|F_n(x) - \Phi(x)| \le \dfrac{3\rho}{\sigma^3\sqrt{n}}$.*
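Before wading into the proof, here is a quick numerical sanity check of the statement. This sketch is my own illustration, not from Feller: it takes Rademacher summands $X_i = \pm 1$ (for which $\sigma = 1$ and $\rho = E|X_i|^3 = 1$), computes $\sup_x |F_n(x) - \Phi(x)|$ exactly from the binomial distribution, and compares it with the bound using Feller's constant $3$.

```python
import math

def phi_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def berry_esseen_gap(n):
    """Exact sup_x |F_n(x) - Phi(x)| for the normalized sum of n
    Rademacher (+1/-1, fair) variables, where sigma = 1 and rho = 1.
    F_n is a step function, so the supremum is attained at its jumps."""
    pmf = [math.comb(n, k) / 2 ** n for k in range(n + 1)]
    cdf, gap = 0.0, 0.0
    for k in range(n + 1):
        x = (2 * k - n) / math.sqrt(n)  # atom of the normalized sum
        left = cdf                       # F_n just below the jump
        cdf += pmf[k]                    # F_n at the jump
        gap = max(gap, abs(cdf - phi_cdf(x)), abs(phi_cdf(x) - left))
    return gap

for n in (10, 100, 1000):
    bound = 3.0 / math.sqrt(n)  # 3 * rho / (sigma^3 * sqrt(n))
    print(f"n={n:5d}  gap={berry_esseen_gap(n):.4f}  bound={bound:.4f}")
```

The measured gap decays like $1/\sqrt{n}$, exactly the rate the theorem predicts, and stays comfortably below the bound.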

Looks very boring! Right? Okay, let’s start with the history of the Central limit theorem(CLT).

The first proof of the CLT was given by the French mathematician Pierre-Simon Laplace in 1810. Fourteen years later, the French mathematician Siméon-Denis Poisson improved it and provided a more general form of the proof. Laplace and his contemporaries were very interested in this theorem because they saw its importance for repeated measurements of the same quantity. They realized that if the individual measurements could be viewed as approximately independent and identically distributed, then their mean could be approximated by a normal distribution. Indeed, the theorem states that, for a sufficiently large sample from a population with finite variance, the distribution of the sample mean is approximately normal, with mean equal to the population mean, regardless of the shape of the underlying distribution.

Then, the first convergence rate for the CLT was estimated by the Russian mathematician Aleksandr M. Lyapunov. A more refined version was discovered independently by two mathematicians, Andrew C. Berry (in 1941) and Carl-Gustav Esseen (in 1942), who then continued refining the convergence estimate; the result is now named the "Berry-Esseen theorem". The best thing about this theorem is that it only requires the first three moments.

Now, I think you are very eager to know about the proof of this theorem. Right? Let’s get started without any further delay!

From Feller's book (Lemma 1, Section 3.13, page 538), the upper bound between $F_n$ and $\Phi$ is

$$|F_n(x) - \Phi(x)| \le \frac{1}{\pi}\int_{-T}^{T}\left|\frac{\varphi_n(t) - \gamma(t)}{t}\right|dt + \frac{24m}{\pi T}$$

where,

$\varphi_n(t)$ = the characteristic function of $F_n$, which equals $\varphi^n\!\left(t/(\sigma\sqrt{n})\right)$,

$\gamma(t)$ = the characteristic function of $\Phi$, which equals $e^{-t^2/2}$,

$m$ = the maximum growth rate of $\Phi$, such that $|\Phi'(x)| \le m$ for all $x$.

The above expression can be derived by Fourier methods, and our proposed proof is based on the smoothing inequality (refer to Feller, Section 3.5). We normalize $\sigma = 1$, under which the moment inequality gives

$\rho = E|X|^3 \ge \left(E[X^2]\right)^{3/2} = \sigma^3 = 1$.

The last inequality is a consequence of the moment inequality. And the normal density has maximum $\mathfrak{n}(0) = 1/\sqrt{2\pi} \approx 0.3989 < 2/5$ (I really don't know why Feller chose this bound; I would be very happy if you guys could help me in this quest!).

So the equation becomes

$$|F_n(x) - \Phi(x)| \le \frac{1}{\pi}\int_{-T}^{T}\left|\frac{\varphi^n(t/\sqrt{n}) - e^{-t^2/2}}{t}\right|dt + \frac{9.6}{\pi T}.$$

Now, let's find a bound for $\left|\varphi^n(t/\sqrt{n}) - e^{-t^2/2}\right|$.

Doesn't it look like the reverse triangle inequality with exponent "$n$"? I mean this:

$|\alpha^n - \beta^n| \le n\,\gamma^{\,n-1}\,|\alpha - \beta|$ if $|\alpha| \le \gamma$ and $|\beta| \le \gamma$.

Thus, we can say

$\left|\varphi^n(t/\sqrt{n}) - e^{-t^2/2}\right| \le n\,\gamma^{\,n-1}\left|\varphi(t/\sqrt{n}) - e^{-t^2/(2n)}\right|$, if $|\varphi(t/\sqrt{n})| \le \gamma$ and $e^{-t^2/(2n)} \le \gamma$.
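This elementary inequality is easy to test numerically. The snippet below is my own spot-check, not part of Feller's argument: it draws random complex pairs inside the unit disk and verifies $|\alpha^n - \beta^n| \le n\,\gamma^{\,n-1}|\alpha - \beta|$ with $\gamma = \max(|\alpha|, |\beta|)$.

```python
import cmath
import random

# A numeric spot-check of |a^n - b^n| <= n * g^(n-1) * |a - b|
# for complex a, b, with g = max(|a|, |b|).
random.seed(0)
for _ in range(1000):
    a = cmath.rect(random.random(), random.uniform(0.0, 2.0 * cmath.pi))
    b = cmath.rect(random.random(), random.uniform(0.0, 2.0 * cmath.pi))
    g = max(abs(a), abs(b))
    for n in (2, 5, 10):
        assert abs(a ** n - b ** n) <= n * g ** (n - 1) * abs(a - b) + 1e-12
print("inequality verified on 1000 random complex pairs")
```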

Again, let's make our problem much simpler by choosing $T = \dfrac{4\sqrt{n}}{3\rho}$.

First of all, let's suppose $\sigma = 1$, so that $\rho \ge 1$. Thus $\dfrac{9.6}{\pi T} = \dfrac{7.2\,\rho}{\pi\sqrt{n}}$, so that the second term of the smoothing bound is already of the required order $\rho/\sqrt{n}$.

Look! How beautiful this looks:

$\varphi\!\left(\dfrac{t}{\sqrt{n}}\right) = 1 - \dfrac{t^2}{2n} + \theta\,\dfrac{\rho|t|^3}{6n^{3/2}}$ with $|\theta| \le 1$, putting in the Taylor series of $\varphi$,

$\left|\varphi\!\left(\dfrac{t}{\sqrt{n}}\right)\right| \le 1 - \dfrac{t^2}{2n} + \dfrac{\rho|t|^3}{6n^{3/2}} \le \exp\!\left(-\dfrac{t^2}{2n} + \dfrac{\rho|t|^3}{6n^{3/2}}\right)$, using $1 + x \le e^x$; then, for $|t| \le T$, the exponent is at most $-\dfrac{5t^2}{18n}$, so $\left|\varphi\!\left(\dfrac{t}{\sqrt{n}}\right)\right| \le e^{-5t^2/(18n)}$.

The characteristic function of the normalized sum is

$\varphi_n(t) = \varphi^n\!\left(\dfrac{t}{\sqrt{n}}\right)$.
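As a concrete illustration (my own example, using the Rademacher case where $\varphi(t) = \cos t$), the characteristic function of the normalized sum indeed approaches the normal characteristic function $e^{-t^2/2}$ as $n$ grows:

```python
import math

def phi_n(t, n):
    """Characteristic function of the normalized sum of n Rademacher
    (+1/-1, fair) variables: cos(t / sqrt(n)) ** n."""
    return math.cos(t / math.sqrt(n)) ** n

# As n grows, phi_n(t) converges to e^{-t^2/2} at every fixed t.
t = 1.5
for n in (10, 100, 10000):
    print(f"n={n:6d}  phi_n={phi_n(t, n):.6f}  limit={math.exp(-t * t / 2):.6f}")
```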

From the very first, I said that as the sample size increases, the shape of the curve of the proposed distribution tends to match the normal curve. I mean this:

Doesn't the smoothing concept look like Taylor's theorem? Exactly! As Taylor's theorem says, we can approximate any smooth curve by a series expansion. Likewise, we can approximate our proposed distribution by the standard normal distribution by taking higher-order terms into account. So, we will need to go like this:

or, .

Now, multiply by the appropriate kernel and integrate both sides from $-T$ to $T$, i.e.

.

By the properties of characteristic functions we can control the difference of the two characteristic functions term by term, and, as we supposed, the result can be approximated by keeping higher-order terms of the series. This is our trick:

, from equation .

Also, another inequality we can use is this:

$\left|e^{ix} - \displaystyle\sum_{k=0}^{n-1}\frac{(ix)^k}{k!}\right| \le \dfrac{|x|^n}{n!}$.

For $n = 3$,

$\left|e^{ix} - 1 - ix + \dfrac{x^2}{2}\right| \le \dfrac{|x|^3}{6}$; taking expectations with $x = tX$ (and using $E[X] = 0$, $E[X^2] = 1$) gives $\left|\varphi(t) - 1 + \dfrac{t^2}{2}\right| \le \dfrac{\rho|t|^3}{6}$.
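A quick grid check of the $n = 3$ case of this Taylor-remainder bound (my own verification, not part of the proof):

```python
import cmath
import math

def taylor_gap(x, n):
    """Modulus of the remainder of e^{ix} after its first n Taylor terms."""
    partial = sum((1j * x) ** k / math.factorial(k) for k in range(n))
    return abs(cmath.exp(1j * x) - partial)

# Check |e^{ix} - 1 - ix + x^2/2| <= |x|^3 / 6 on a grid of real x.
for i in range(-50, 51):
    x = i / 10
    assert taylor_gap(x, 3) <= abs(x) ** 3 / 6 + 1e-12
print("Taylor remainder bound verified for n = 3")
```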

So, the equation becomes

$\left|\varphi\!\left(\dfrac{t}{\sqrt{n}}\right) - e^{-t^2/(2n)}\right| \le \left|\varphi\!\left(\dfrac{t}{\sqrt{n}}\right) - 1 + \dfrac{t^2}{2n}\right| + \left|e^{-t^2/(2n)} - 1 + \dfrac{t^2}{2n}\right| \le \dfrac{\rho|t|^3}{6n^{3/2}} + \dfrac{t^4}{8n^2}$.
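Again taking the Rademacher example as my own assumption ($\varphi(s) = \cos s$, $\sigma = 1$, $\rho = 1$, with $s = t/\sqrt{n}$), the combined bound $|\varphi(s) - e^{-s^2/2}| \le \rho|s|^3/6 + s^4/8$ can be checked pointwise:

```python
import math

# For Rademacher summands, phi(s) = cos(s), sigma = 1, rho = 1.
# Check |phi(s) - e^{-s^2/2}| <= rho*|s|^3/6 + s^4/8 on a grid.
for i in range(-400, 401):
    s = i / 100
    lhs = abs(math.cos(s) - math.exp(-s * s / 2))
    rhs = abs(s) ** 3 / 6 + s ** 4 / 8
    assert lhs <= rhs + 1e-12
print("combined bound verified for |s| <= 4")
```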

In the left part of this inequality, we’re going to apply the Cauchy-Schwarz inequality as

.

Then, this will turn into

where the third part of the inequality dominates the others.

For our needs, we will use

, if , where the second part follows from the Cauchy-Schwarz inequality,

, applying the properties of the Riemann integral to the second part,

, applying the Cauchy-Schwarz inequality.

such that

.

Substituting the value back, we get,

.

Now, we proceed to smooth our proposed PDF so that we can use . So, the equation becomes

, converting into exponential form via $1 + x \le e^x$.

We know the assertion of the theorem is trivially true when $\dfrac{3\rho}{\sqrt{n}} \ge 1$, because the difference of two distribution functions never exceeds $1$; hence, since $\rho \ge 1$, we may assume $\sqrt{n} > 3\rho \ge 3$, i.e. $n \ge 10$.

Taking exponents on both sides of the equation, we can get,

Thus, for n = 10,

.

Let me remind you of the equation with the maximum equality, i.e.

if

So, the right part of the equation may serve as the bound, i.e.

or,

or,

Thus, we can have

.

Also, we need to formulate one more inequality. Let's start from this:

for

.

Oh! I almost forgot. We need to construct something very useful, i.e.

, where I have added and subtracted two terms and applied the triangle inequality.

First term Second term

which means,

First term = , from equation .

and,

Second term = , from equation .

Substituting the above results into the equation, we get,

.

Since , the above inequality carries over to the integrand, which means

, from equation

.

Also, we know . So,

.

From equation (13), let’s consider for then, thus,

and,

.

These integrations can be done using integration by parts and the Gamma function. But I found a difficulty when solving . If you guys solve it, please comment your solution. Thanks in advance!
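Since the exact evaluations are the sticking point here, a numerical cross-check (my own sketch) of the Gamma-function formula $\int_0^\infty t^k e^{-t^2/2}\,dt = 2^{(k-1)/2}\,\Gamma\!\left(\frac{k+1}{2}\right)$ may help for the moments that appear in such bounds:

```python
import math

def gauss_moment(k, upper=40.0, steps=200_000):
    """Trapezoidal approximation of the integral of t^k * e^{-t^2/2}
    over [0, upper]; the tail beyond `upper` is negligible."""
    h = upper / steps
    total = 0.0
    for i in range(steps + 1):
        t = i * h
        w = 0.5 if i in (0, steps) else 1.0
        total += w * t ** k * math.exp(-t * t / 2.0)
    return total * h

# Closed form 2^((k-1)/2) * Gamma((k+1)/2), from the substitution u = t^2/2.
for k in (2, 3):
    numeric = gauss_moment(k)
    closed = 2 ** ((k - 1) / 2) * math.gamma((k + 1) / 2)
    assert abs(numeric - closed) < 1e-6
    print(f"k={k}: numeric={numeric:.6f}  closed={closed:.6f}")
```

For $k = 2$ this gives $\sqrt{\pi/2} \approx 1.2533$, and for $k = 3$ it gives exactly $2$; both follow from integration by parts together with the substitution $u = t^2/2$.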

So, the equation becomes

.

For simplicity, we use the bound below as the final form:

$\sup_x \left|F_n(x) - \Phi(x)\right| \le \dfrac{3\rho}{\sigma^3\sqrt{n}}.$

**Feedback?**

If you guys have questions, comments, or insults, please don't hesitate to shoot me an email or comment below.

**Want to share this post?**