1. The problem statement, all variables and given/known data

I have a question that seems so simple that I have never dared to ask it. If I want the time average of a given f(t) from t=0 s to t=1 s, the solution is to compute the following integral:

TA = (1/T) ∫₀^T f(t) dt

However, I have some doubts about this calculation.

2. Relevant equations

1) An integral computes the area under a curve. I am not really interested in computing an area; I just want the sum of the f(t) values along the interval. So what does an area have to do with it?

2) The integral sums up a lot of values, and how many depends on the step size h. Assuming we select h=0.0001 (very close to 0), the total number of terms is 1/0.0001 = 10000. However, the integral is only normalized by T (the factor 1/T). Why is that? In a discrete probability distribution, I would have normalized by the number of added values.
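A quick numerical sketch may help resolve the normalization doubt: writing the integral as a Riemann sum, (1/T)·Σ f(tᵢ)·h, and noting that h = T/N, the factor h/T is exactly 1/N, so the integral form is the same as dividing the plain sum by the number of terms. The choice f(t) = sin(t) below is just an illustrative assumption, not from the question.

```python
import math

def f(t):
    # Arbitrary example function (an assumption for illustration).
    return math.sin(t)

T = 1.0    # interval length: t = 0 s to t = 1 s
N = 10000  # number of steps
h = T / N  # step size, 0.0001 as in the question

samples = [f(i * h) for i in range(N)]

# Riemann-sum form of the time average: (1/T) * sum of f(t_i) * h.
riemann_average = (1.0 / T) * sum(y * h for y in samples)

# Plain discrete mean: sum of the values divided by their count.
discrete_mean = sum(samples) / N

# Because h/T = 1/N, the two agree exactly.
print(riemann_average)
print(discrete_mean)
```

So normalizing by T and normalizing by the number of added values are the same operation once each term carries its weight h; the "area" divided by the interval length is precisely the average height of the curve.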