# Find the fallacy in the derivative.

Ok,
So I have a problem understanding conflicting results for a derivative.
Consider the derivative of x^2, which is 2x.
However, if x^2 is expressed as a sum of x's, so that f(x) = x + x + x + x ... (x times), then differentiating term by term gives f'(x) = 1 + 1 + 1 + 1 ... (x times) = x. Hence this derivation shows the derivative of x^2 to be x.
Clearly this can't be correct. Where is the fallacy?

My idea is that the summation is linear in x whilst x^2 is nonlinear, so the sum won't behave like x^2 under differentiation. However, this is only an idea.

Well, f is only defined when x is a natural number, so it's not well defined as a function from R to R. Also, we can't apply the sum rule of differentiation when the number of terms itself depends on x.
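To make the error concrete, here is a sketch of the two computations side by side (assuming x is a positive integer so the sum makes sense):

```latex
% x^2 written as a sum of x copies of x:
f(x) = \underbrace{x + x + \cdots + x}_{x \text{ terms}} = \sum_{k=1}^{x} x

% The fallacious step differentiates each term while treating
% the NUMBER of terms as a fixed constant n:
\frac{d}{dx} \sum_{k=1}^{n} x = \sum_{k=1}^{n} 1 = n
\quad (\text{and only afterwards setting } n = x)

% But the upper limit also depends on x. Writing the sum as a
% product and using the product rule recovers the correct answer:
\frac{d}{dx}\,(x \cdot x) = 1 \cdot x + x \cdot 1 = 2x
```

The missing 2x − x = x comes exactly from the neglected dependence of the term count on x.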

So the derivative of the sum of x's is invalid because the number of terms, x, isn't well defined. It's time I brushed up on the rules of differentiation. Also, you say the function is not well defined from R to R; can you explain what this means, as I don't understand?

HallsofIvy
Homework Helper
Writing x^2 as a "sum of x 'x's" requires that x be a positive integer, not a general real number.
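Since f is only defined for positive integers, the closest thing to a derivative there is the forward difference f(x+1) − f(x). A quick numerical sketch (illustrative, not from the thread) shows this difference is 2x + 1, which tracks 2x rather than x:

```python
def f(n):
    # x^2 written literally as "n added to itself n times";
    # this only makes sense for natural numbers n
    return sum(n for _ in range(n))

def forward_diff(g, n):
    # Discrete analogue of the derivative on the integers
    return g(n + 1) - g(n)

for n in range(1, 6):
    # Delta f(n) is always 2n + 1, never n
    print(n, forward_diff(f, n), 2 * n + 1)
```

Each step from n to n+1 adds one new copy of n+1 *and* increases every existing term by 1, which is where the term-by-term differentiation loses the second contribution.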

gb7nash
Homework Helper