Hey, so I'm in a second-semester calculus class and we're learning about the average value of a function. We are presented with the following formula:

(1/(b - a)) ∫_a^b f(x) dx, for a ≤ x ≤ b

This is basically doing (x₁ + ... + xₙ)/n, except for infinitely many values over the interval [a, b]. Naturally we need to learn from some starting point, but in terms of averages of functions, this is where the course stops.

Now for the question: aren't there better formulas for functions that are not evenly distributed? Say you were using this formula to get the average value of some function where most of its values are very big numbers, but the remaining minority of values are very tiny, so much so that the average value from the above method comes out very misleading. Misleading as in you would never have guessed that, for example, an average value of 5 was produced by a function where most of its values are in the thousands, and this happens because of a small collection of very small values. If I'm not making sense here, just let me know haha, and I'll try to explain it another way.

So are there other techniques for taking averages? Would these be covered in a statistics class?
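To make the scenario concrete, here is a small sketch of the formula in action, estimating (1/(b - a)) ∫_a^b f(x) dx with a midpoint Riemann sum. The function f below is hypothetical, chosen just to show a mean that hides the "typical" value. One caveat worth noting: if the function is mostly in the thousands and never negative, the integral average can't be dragged down to 5 by a few tiny values, so in this sketch the small region is made strongly negative to produce that effect.

```python
def average_value(f, a, b, n=100_000):
    """Midpoint Riemann-sum estimate of (1/(b-a)) * integral of f over [a, b]."""
    dx = (b - a) / n
    total = sum(f(a + (i + 0.5) * dx) for i in range(n))
    return total * dx / (b - a)

def f(x):
    # Hypothetical example: 1000 on 99% of [0, 1], but a large negative
    # value on the first 1% of the interval. The narrow spike carries
    # enough (negative) area to pull the average way down.
    return 1000.0 if x >= 0.01 else -98500.0

print(average_value(f, 0.0, 1.0))  # ≈ 5.0, even though "most" values are 1000
```

By hand: the integral is 0.99·1000 + 0.01·(−98500) = 990 − 985 = 5, so the average value is 5 even though the function equals 1000 on 99% of the interval. This is the same "mean vs. typical value" issue that motivates medians and weighted averages in statistics.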