Hello!
Let ##I## be an interval of length ##L##, and suppose we divide it into bits of size ##dx##; then ##L = dx + dx + \dots = \alpha\,dx##.
Since ##dx## is by definition infinitesimally small, is it correct to say that for each ##x## there is a corresponding ##dx##, so that ##\alpha## would be, theoretically, the total number of ##x##'s?
I ask because, as I see it, that is how the average value of a function is calculated:
$$\frac{\sum f(x)}{\alpha} = \frac{1}{L}\sum f(x)\,dx = \frac{1}{L}\int f(x)\,dx$$
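For what it's worth, here is a small numerical sketch of what I mean (my own check, with an assumed example ##f(x) = x^2## on ##[0, 2]##, which is not part of the question itself). With a finite number of bits, the "average of the samples" ##\frac{\sum f(x)}{\alpha}## and ##\frac{1}{L}\sum f(x)\,dx## are literally the same number, and both seem to approach the integral average as the bits shrink:

```python
import numpy as np

# Hypothetical example function; any integrable f on [a, b] would do.
def f(x):
    return x**2

a, b = 0.0, 2.0            # interval I = [a, b]
L = b - a                  # its length
alpha = 1_000_000          # number of bits dx (the "alpha" above)
dx = L / alpha

x = a + dx * (np.arange(alpha) + 0.5)   # one sample x per bit dx

avg_of_samples = f(x).sum() / alpha      # sum of f(x) divided by alpha
riemann_avg    = f(x).sum() * dx / L     # (1/L) * sum of f(x) dx
exact_avg      = (b**3 - a**3) / (3 * L) # (1/L) * integral of x^2 over [a, b]

print(avg_of_samples, riemann_avg, exact_avg)   # all close to 4/3
```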