Flexington
OK, so I have a problem understanding two conflicting results for a derivative.
Consider the derivative of x^2, which is 2x.
However, if x^2 is expressed as a sum of x's, so that f(x) = x + x + x + ... + x (x times), then differentiating term by term gives f'(x) = 1 + 1 + 1 + ... + 1 (x times) = x. So this derivation suggests the derivative of x^2 is x.
Clearly this can't be correct. Where is the fallacy?
My idea is that the summation is linear in x whilst x^2 is non-linear, so the summation won't converge to x^2. But this is only an idea; is it right?
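To make the conflict concrete, here is a quick Python sketch I put together (the names sum_of_xs, square and numerical_derivative are just my own labels for this illustration): for whole-number x the sum of x copies of x really does equal x^2, yet a finite-difference estimate of the slope of x^2 comes out near 2x, while counting the 1's term by term only gives x.

```python
# Sketch of the two viewpoints described above (my own illustration).

def sum_of_xs(x: int) -> int:
    """x added to itself x times -- only makes sense for whole-number x."""
    return sum(x for _ in range(x))

def square(x: float) -> float:
    return x * x

def numerical_derivative(f, x: float, h: float = 1e-6) -> float:
    """Central-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

for x in [3, 5, 10]:
    print(f"x = {x}")
    print(f"  sum of x's (x times)   = {sum_of_xs(x)}  (matches x^2 = {square(x)})")
    print(f"  finite-difference slope ~ {numerical_derivative(square, x):.4f}  (close to 2x = {2 * x})")
    print(f"  term-by-term '1 + 1 + ... + 1' (x times) = {x}")
```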