#### f95toli

Science Advisor

Gold Member

- 2,932

- 425

My "in" data is a time series with n samples sampled at some frequency fs, which is then post-processed in Matlab. I often deal with quite long time series (millions of points) that take hours or days to acquire, and I am therefore interested in understanding how much there really is to gain by, say, doubling the number of acquired points.

My question is a practical one: how many samples do I need in order to estimate the shape of the distribution?

I know that the accuracy with which I can estimate the mean improves as √n, at least if one assumes a normal distribution.
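To illustrate what I mean, here is a quick numerical sketch (in numpy rather than my actual Matlab pipeline; the numbers are made up for illustration): the standard error of the sample mean of n i.i.d. samples with standard deviation sigma falls off as sigma/√n.

```python
import numpy as np

# Illustrative check of the sigma/sqrt(n) scaling of the sample mean.
# All parameters here are arbitrary, chosen just for the demonstration.
rng = np.random.default_rng(0)
sigma = 2.0
for n in [100, 400, 1600]:
    # Repeat the "experiment" 2000 times: draw n samples, take their mean.
    means = rng.normal(0.0, sigma, size=(2000, n)).mean(axis=1)
    # Empirical spread of the mean estimate vs. the theoretical sigma/sqrt(n)
    print(n, means.std(), sigma / np.sqrt(n))
```

Quadrupling n halves the scatter of the mean estimate, i.e. doubling the acquisition time only improves the mean estimate by a factor √2.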

But how quickly does the estimate of the std improve?
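For what it's worth, a simulation along the same lines suggests the std estimate also converges as 1/√n; for Gaussian data its standard error is approximately σ/√(2(n−1)) (a textbook result for the normal case, stated here as an assumption):

```python
import numpy as np

# Sketch: spread of the sample standard deviation for normally
# distributed data, compared with the approximation sigma/sqrt(2*(n-1)).
rng = np.random.default_rng(1)
sigma = 2.0
n = 1000
# 5000 repeated experiments of n samples each; ddof=1 gives the
# unbiased-variance ("sample") standard deviation.
stds = rng.normal(0.0, sigma, size=(5000, n)).std(axis=1, ddof=1)
print(stds.std(), sigma / np.sqrt(2 * (n - 1)))
```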

Also, can one say something about how many samples one needs to estimate the parameters of other common distributions (Poisson etc.)?
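As one concrete case of what I'm asking: for a Poisson distribution the rate λ is estimated by the sample mean, and since the Poisson variance equals λ, the standard error of that estimate should be √(λ/n), so again a 1/√n law. A toy check (illustrative values only):

```python
import numpy as np

# Sketch: estimating a Poisson rate lambda from n counts.
# The estimator is the sample mean; its standard error is sqrt(lambda/n).
rng = np.random.default_rng(2)
lam, n = 5.0, 400
# 3000 repeated experiments of n Poisson draws each
lam_hats = rng.poisson(lam, size=(3000, n)).mean(axis=1)
print(lam_hats.std(), np.sqrt(lam / n))
```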