Find the fallacy in the derivative.

  • Thread starter: Flexington
  • Tags: Derivative
Flexington
Ok,
So I have a problem understanding conflicting results of a derivative.
Consider the derivative of x^2, which is 2x.
However, if x^2 is expressed as a sum of x's, so that f(x) = x + x + x + x ... (x times), then differentiating term by term gives f'(x) = 1 + 1 + 1 + 1 ... (x times) = x. Hence this derivation shows the derivative of x^2 to be x.
Clearly this can't be correct. Where is the fallacy?

My idea is that the sum is linear in x whilst x^2 is non-linear, so the sum can't really represent x^2. However, this is only an idea.
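As a quick sanity check (a minimal Python sketch using only a central finite difference, nothing from the thread itself), the slope of x^2 at x = 3 comes out near 6, matching 2x rather than x:

```python
# Finite-difference check: slope of f(x) = x^2 at x0 = 3.
# If the "x + x + ... (x times)" argument were right, the slope would be 3;
# the true derivative 2x gives 6.
def f(x):
    return x * x

x0 = 3.0
h = 1e-5
slope = (f(x0 + h) - f(x0 - h)) / (2 * h)  # central difference approximates f'(x0)
print(slope)
```

Because x^2 is quadratic, the central difference here is exact up to floating-point rounding.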
 
Well, f is only defined if x is a natural number, so it's not well defined as a function from R to R. Also, we can't differentiate term by term when the number of terms itself depends on x.
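One way to see where the missing x goes (a sketch, extending the reply above): write x^2 = x·x and apply the product rule, which differentiates *both* occurrences of x:

```latex
\frac{d}{dx}\,x^2
  = \frac{d}{dx}\,(x \cdot x)
  = \underbrace{1 \cdot x}_{\text{vary the first factor}}
  + \underbrace{x \cdot 1}_{\text{vary the second factor}}
  = 2x .
```

The term-by-term argument differentiates each summand (the x's being added) but never differentiates the number of summands (the "x times"); that neglected dependence accounts exactly for the missing x.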
 
Thank you for the reply.

So the derivative of the sum of x's is invalid because the number of terms, x, is not fixed. It's time I brushed up on the rules of differentiation. Also, you say the function is not well defined from R to R; can you explain what this means, as I don't understand?
 
Writing x^2 as a sum of x copies of x requires that x be a positive integer, not a general real number.
 
Flexington said:
Thank you for the reply.

So the derivative of the sum of x's is invalid because the number of terms, x, is not fixed. It's time I brushed up on the rules of differentiation. Also, you say the function is not well defined from R to R; can you explain what this means, as I don't understand?

If a function f(x) is not well defined from R to R, there exists a real value x for which f(x) is not defined. For instance:

f(x) = 1/x is not defined for x = 0.
 