# Why is the autocorrelation equal to the expected value?

## Homework Statement

This isn't really a homework question as much as something that I just couldn't figure out.
I just noticed in an exam I was working on that at one point they converted the expected value of a signal multiplied by itself into the autocorrelation of the signal at l = 0. A quick Google search shows that this is indeed correct, but I'm having trouble understanding why. Consider the following.

Let's say we have a discrete signal x[n] given by [0 4 8 12].

The autocorrelation at l = 0 would be defined as:

$$r_{xx}(0) = \sum_{n=-\infty}^{\infty} x[n]\,x[n]$$

The sum runs from minus infinity to infinity, or in our case over the four samples n = 0 to 3, which would give:

0*0 + 4*4 + 8*8 + 12*12 = 224.
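Just to double-check my arithmetic, here's a quick sketch of that lag-0 sum (variable names are my own, not from the exam):

```python
# x[n] = [0, 4, 8, 12], the example signal above
x = [0, 4, 8, 12]

# lag-0 autocorrelation: sum over n of x[n] * x[n]
r0 = sum(v * v for v in x)
print(r0)  # 0 + 16 + 64 + 144 = 224
```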

Surely the expected value of x[n]^2 isn't 224? It may just be me misinterpreting what the expected value of a digital signal is, though; you'd think it would be the mean or something. I'm not entirely sure, so some help understanding why exactly the autocorrelation equals the expected value would be nice.

It seems more to me like the autocorrelation at l = 0 gives the total signal energy (the sum of the squared samples), not an expected value.
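If it helps clarify my confusion: dividing the lag-0 sum by the number of samples gives the average of x[n]^2, which is what I'd naively expect an "expected value" of the squared signal to look like (this is just my own sanity check, not anything from the exam):

```python
x = [0, 4, 8, 12]
N = len(x)

# lag-0 autocorrelation, i.e. the sum of squares (signal energy)
r0 = sum(v * v for v in x)

# time-average of x[n]^2 -- the sample estimate of E[x^2]
mean_square = r0 / N
print(r0, mean_square)  # 224 and 56.0
```

So the raw sum and the average differ by a factor of N, which is part of what I can't reconcile.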