Average Value of a Function

Main Question or Discussion Point

Hey so I am in a second semester calculus class and we are learning about the average value of a function. We are presented with the following formula:

$$\frac{1}{b-a}\int_a^b f(x)\, dx \qquad \text{for } a \le x \le b$$

This is basically doing (x1 + ... + xn)/n, except for infinitely many values over the interval
[a, b].
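That "mean of infinitely many samples" intuition can be checked numerically. Here's a small sketch (my own example, with f(x) = x² on [0, 1], where the exact average value is 1/3): the ordinary mean of n evenly spaced samples approaches the integral formula as n grows.

```python
# Sketch: the integral average as the limit of the ordinary mean (x1+...+xn)/n.
# Sample f at n evenly spaced points on [a, b]; the plain mean of the samples
# approaches (1/(b-a)) * integral of f from a to b as n grows.
# The function and interval here are illustrative choices, not from the post.

def sample_mean(f, a, b, n):
    """Ordinary mean of n evenly spaced (midpoint) samples of f on [a, b]."""
    xs = [a + (b - a) * (i + 0.5) / n for i in range(n)]
    return sum(f(x) for x in xs) / n

f = lambda x: x * x          # example: f(x) = x^2
a, b = 0.0, 1.0              # exact average value of x^2 on [0, 1] is 1/3

for n in (10, 100, 10_000):
    print(n, sample_mean(f, a, b, n))   # converges toward 0.3333...
```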

Naturally we need to learn from some starting point, but in terms of averages of functions, this is as far as the course goes.

Now for the question. Aren't there better formulas for functions whose values are not evenly distributed?

Say you were using this formula to get the average value of some function where most of its values are big numbers, but the remaining minority of values are so tiny that the average computed this way comes out very misleading. Misleading as in you would never have guessed that, say, an average value of 5 was produced by a function whose values are mostly in the thousands, and this happens because of a small collection of very small numbers.
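For what it's worth, here is a toy numerical illustration of how the integral average can hide big values (my own example, not from the post): the formula weights each value by how much of the x-interval it occupies, so a function that reaches the thousands only on a narrow sliver of the domain still gets a tiny average.

```python
# Toy illustration (my own example): the integral average weights each value by
# the length of the x-region where it occurs. A function that hits 5000 on a
# width-0.002 sliver of [0, 1] but is zero elsewhere averages to only about 10,
# even though its maximum is in the thousands.

def avg_value(f, a, b, n=100_000):
    # midpoint-rule approximation of (1/(b-a)) * integral of f over [a, b]
    xs = [a + (b - a) * (i + 0.5) / n for i in range(n)]
    return sum(f(x) for x in xs) / n

def spike(x):
    return 5000.0 if 0.499 <= x <= 0.501 else 0.0

print(avg_value(spike, 0.0, 1.0))   # ≈ 5000 * 0.002 = 10, despite a max of 5000
```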

If I'm not making sense here just let me know haha. I'll try to explain it another way.

So are there other techniques of taking averages? Would these be in a statistics class?

The next quantity you can define to characterize a function or data set is something like "the average squared deviation from the average". The way you define it is as:

$$\frac{1}{b-a}\int_{a}^b \left(f(x) - c\right)^2 \, dx$$

where c is the average value of f(x), defined by your integral.

This expression is called the variance of the function f(x), and it's used very frequently in statistics.
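As a quick sanity check of these two definitions, here's a numerical sketch (my own, using f(x) = x on [0, 1] as the example, where the exact mean is 1/2 and the exact variance is 1/12):

```python
# Sketch: numerical mean and variance of a function over [a, b], following the
# definitions above. For the example f(x) = x on [0, 1], the exact values are
# c = 1/2 and variance = 1/12 ≈ 0.08333.

def integrate(g, a, b, n=100_000):
    # midpoint-rule approximation of the integral of g over [a, b]
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

def mean_and_variance(f, a, b):
    c = integrate(f, a, b) / (b - a)                           # average value of f
    var = integrate(lambda x: (f(x) - c) ** 2, a, b) / (b - a) # variance of f
    return c, var

c, var = mean_and_variance(lambda x: x, 0.0, 1.0)
print(c, var)   # ≈ 0.5 and ≈ 0.0833
```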

You can go further and look at higher-order deviations: the third moment gives the skewness (a measure of asymmetry) and the fourth gives the kurtosis (a measure of how heavy the tails are). These things are called moments. If you know all the moments of a function (mean, variance, etc.), then under fairly mild conditions you can reconstruct the distribution of its values from them.
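To make the higher moments concrete, here's a sketch (my own, using the usual statistics conventions: skewness and kurtosis are the third and fourth central moments divided by σ³ and σ⁴). The example f(x) = x on [0, 1] has values spread uniformly, so its skewness is 0 and its kurtosis comes out to 1.8.

```python
# Sketch (my own example): mean, variance, skewness, and kurtosis of a
# function's values over [a, b]. Skewness and kurtosis are conventionally the
# 3rd and 4th central moments normalized by sigma^3 and sigma^4.

def integrate(g, a, b, n=100_000):
    # midpoint-rule approximation of the integral of g over [a, b]
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

def central_moment(f, a, b, k, c):
    return integrate(lambda x: (f(x) - c) ** k, a, b) / (b - a)

def moments(f, a, b):
    c = integrate(f, a, b) / (b - a)        # mean (the average value)
    var = central_moment(f, a, b, 2, c)     # variance
    sigma = var ** 0.5
    skew = central_moment(f, a, b, 3, c) / sigma ** 3
    kurt = central_moment(f, a, b, 4, c) / sigma ** 4
    return c, var, skew, kurt

print(moments(lambda x: x, 0.0, 1.0))   # symmetric values -> skewness ≈ 0
```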

AlephZero