# Differentiating x^2

1. Oct 25, 2004

### spacetime

If you differentiate $$x^2$$, you get $$2x$$. But now, if you write $$x^2$$ as $$x+x+x+\ldots+x$$ ($$x$$ times) and then differentiate term by term, you get $$1+1+1+\ldots+1 = x$$

What's wrong? Is it the discontinuity arising from the fact that multiplication can be converted to addition only for integers?
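As a quick numerical sanity check (a Python sketch, not part of the original discussion): the difference quotient of $$x^2$$ really does approach $$2x$$, and the check already hints at the problem, because it needs values of the function at non-integer points like $$x + h$$, where the "x copies of x" picture is undefined.

```python
# Central difference quotient as a numerical stand-in for the derivative.
def derivative(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

x = 5.0
print(derivative(lambda t: t * t, x))  # close to 2*x = 10.0

# Note the quotient evaluates f at 5.000001 and 4.999999: the
# "x + x + ... + x (x times)" rewriting never defines f there.
```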

2. Oct 25, 2004

### matt grime

x^2 isn't obtained by writing x out x times, something that doesn't even begin to make sense in R, as you seemingly well know.

3. Oct 25, 2004

### T@P

I think matt grime is right. If you draw a graph of x^2 where the domain is integers you would get a bunch of points, so the derivative would be meaningless over a bunch of points. Is that right...?

4. Oct 25, 2004

### MiGUi

The function has to be continuous to have a derivative, and the derivative gives local information about the function. A bunch of isolated points has no derivative.

5. Oct 25, 2004

### JasonRox

Differentiability implies Continuity

When you differentiate something, you are assuming that the function is continuous.

6. Oct 25, 2004

### matt grime

You do not assume continuity per se. I can show that x^2 has a derivative without assuming it is continuous.

I can also show that if it is differentiable at a point it is continuous at that point, but that doesn't mean I am assuming continuity at that point at all; indeed, assuming continuity doesn't help in proving a derivative exists.

7. Oct 25, 2004

### JasonRox

That is true. That is why the converse is not true.

"Continuity does not always imply Differentiability"

If it is differentiable at a, then it is also continuous at point a. This is what I said.

I would like to see a discontinuous function that is differentiable at the point of discontinuity.

8. Oct 25, 2004

### matt grime

I think this is a semantic preference of mine. I have no need to *assume* f is continuous at a point when I differentiate it there since I know it to be continuous (if it is differentiable).

And indeed, practically, assuming something is continuous is often no help when attempting to show it is differentiable there.
Suppose I take some function f and *assume* continuity; now, how does that help me show differentiability or the lack of it? If I can show it to be discontinuous, then I can deduce it is not differentiable at that point, but that isn't, in my opinion, the same thing.

Example: let f be any continuous function from R to R, and set $$g(x) = \int_0^x f(t)\,dt$$; then I can show g is differentiable without *assuming* it is continuous.
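The differentiability of this $$g$$ can be sketched directly from the definition (using only the continuity of $$f$$, never an assumption about $$g$$):

$$\frac{g(x+h) - g(x)}{h} = \frac{1}{h}\int_x^{x+h} f(t)\,dt \longrightarrow f(x) \quad \text{as } h \to 0,$$

since continuity of $$f$$ at $$x$$ keeps the integrand within any $$\varepsilon$$ of $$f(x)$$ on a small enough interval. So $$g'(x) = f(x)$$.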

Here is a good example of using an *assumption* mathematically.

Suppose I wish to prove something about a category that is true up to equivalence, then I may assume that the category is skeletally small. Or more simply, suppose I wish to prove something about a set of cardinality n that is only dependent on cardinality, then I may *assume* that the set is {1,2,3..,n}.

Assumptions are about sufficiency, not necessity.

9. Oct 25, 2004

### JasonRox

Hmm... you obviously know more math than I do, but I'll stick with what I know right now.

Everyone just finds the derivative without knowing whether or not it is continuous, but if it is differentiable, then it must be continuous.

The left- and right-hand limits wouldn't be equal if it weren't continuous.

It is like the graph below:

-----O
.......-------
0 1 2 3

What is the rate of change when x=3? Clearly that is not possible. For now, anyways.

10. Oct 26, 2004

### Zurtex

Here is a good one that can make you think about the problem a bit more:

$$\frac{d}{dx}(x) = 1$$

Where $$x \in \mathbb{Q}$$

Therefore:

$$x = \frac{p}{q}, \quad p \in \mathbb{N}, \quad q \in \mathbb{Z}, \quad q \neq 0$$

We can say that:

$$x = \frac{1}{q} + \frac{1}{q} + \frac{1}{q} + \ldots = \sum_{r=1}^p \frac{1}{q}$$

$$\frac{d}{dx}(x) = 0 + 0 + 0 + \ldots = 0$$

I remember coming up with a better and clearer explanation of the fallacy, but an obvious one is that you're only looking at a single point, which obviously means your derivative is going to be different from looking at the derivative of the function in general.
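One way to pin down the fallacy in the display above (a sketch, reading $$p$$ as a function of $$x$$ with $$q$$ held fixed, so $$p = qx$$): the upper limit of the sum varies with $$x$$, and differentiating it restores the missing derivative:

$$x = \sum_{r=1}^{qx} \frac{1}{q} \quad\Longrightarrow\quad \frac{d}{dx} \sum_{r=1}^{qx} \frac{1}{q} = \frac{d(qx)}{dx}\cdot\frac{1}{q} = q\cdot\frac{1}{q} = 1$$

Term-by-term differentiation with the limit held fixed gives $$0$$ only because it ignores the $$x$$-dependence of the number of terms.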

11. Oct 26, 2004

### matt grime

reminds me when you accused me (and others) of never having seen i as the square root of minus 1. (I didn't hand back any of my degrees, though, you'll be pleased to hear.)

That isn't assuming anything, is it? It's showing something to be true, or knowing it to be true.

That doesn't make much sense.

The formal derivative of many functions defined on spaces without a topology (i.e. no notion of continuity) is useful, e.g. in Galois theory.

12. Oct 26, 2004

### JasonRox

That was an honest mistake.

If we are to stick with the basic calculus that I know, am I right on that?

Obviously I am not on the same track as you, but I know this is right.

If a function is differentiable in the interval (a,b) then it is continuous in the interval [a,b].

We know this is true. Well, for 1st year students anyways.

Of course we can find the derivatives over the entire interval [a,b], but how useful is it? The rate of change at b might be x, but then the number right after b might be a point of discontinuity.

How useful is it?

Here is my lame graph:

--------*..........
........................
...........o--------
0 a b

Now if you look at the Froeemen Theorem, which has been proven true, we know that this is impossible.

I'M SO TIRED I WANT TO SLEEP ON THE FLOOR, IN THE CLASSROOM!

Note: You are going beyond me on this. The Froeemen Theorem is a joke.

13. Oct 26, 2004

### matt grime

Whoever told you that is lying through their teeth

$$\frac{1}{x(x-1)}$$ is differentiable on the interval (0,1) but certainly not continuous on [0,1]. I picked it for the obvious reason that the function isn't even defined on the closed interval [0,1] as it stands.
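A quick numerical illustration of this counterexample (a Python sketch): the function is perfectly smooth inside $$(0,1)$$ but blows up toward the endpoints, so there is no continuous extension to $$[0,1]$$.

```python
# f is smooth on the open interval (0, 1) but unbounded near 0 and 1.
def f(x):
    return 1.0 / (x * (x - 1.0))

print(f(0.5))       # -4.0 at the midpoint
print(f(1e-6))      # large negative value near x = 0
print(f(1 - 1e-6))  # large negative value near x = 1
```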

Are you cocking up your brackets?

14. Oct 26, 2004

### JasonRox

Did I say I was tired?

I :zzz: for 2 hours this afternoon.

I see what you are saying.

...I should shut up now...

15. Oct 26, 2004

### JasonRox

You know what, I am not going to shut up.

I want to get any possible errors out of my head.

If a function is differentiable at a, then it is also continuous at a.

Isn't that true though?

You're talking about differentiating the function at a with no notion of whether or not it is continuous. This probably needs a different set of numbers to be possible. I said "probably". For now, working over the real numbers, I can't grasp that.

16. Oct 26, 2004

### Manchot

I think that it suffers from the same fallacy that the topic poster's curiosity arises from. Even when you consider rational numbers, you have a discontinuous graph. It is my understanding that the set of real numbers is denser than the set of rational numbers, leading to this conclusion. Interesting...

17. Oct 26, 2004

### robphy

I could be wrong... but I'll speak up anyway.

No one seems to have addressed the issue that the number of terms in this sum depends on x as well. So, it would seem to me that it is not sufficient to take only the derivative of the addends. Now, of course, the "number of terms" is an integer... and will lead to some complications in taking its derivative.
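This point deserves a worked line (a sketch of the standard resolution): write $$x^2 = x \cdot x$$, where the second factor plays the role of the "number of terms", and apply the product rule:

$$\frac{d}{dx}(x \cdot x) = 1 \cdot x + x \cdot 1 = 2x$$

Differentiating only the addends, $$1 + 1 + \cdots + 1 = x$$, recovers just the first product-rule term; the second term, coming from the changing number of terms, is exactly the missing $$x$$.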

18. Oct 27, 2004

### Zurtex

I made it to suffer from the same fallacy, but aren't the rational numbers still continuous? I mean, take any two points A and B on a rational number line; they have a midpoint M. M and A have a midpoint, M and B have a midpoint, and so on and so forth.

19. Oct 27, 2004

### matt grime

Manchot:

Speaking of some set S being dense implies that $$S \subset T$$ and S is dense in T. Then the real numbers as a subset of the reals are exactly as dense as the rationals as a subset of the rationals. And if that weren't true you'd be a little buggered, since the reals are defined (in one way) as the completion of the rationals.

Zurtex, what do you mean by "the rationals are continuous"? A function can be continuous, but a subset of R?

20. Oct 27, 2004

### matt grime

No, I am not talking about differentiating without the notion of continuity. I am saying that if you look at the definition of differentiability at x, nowhere do we *assume* it is continuous there. If it is differentiable, then it is continuous there; however, the assumption of continuity does not help prove it is differentiable. Take |x|: not only can I assume it is continuous, I can prove it is continuous, but that assumption does not help you prove whether it is or isn't differentiable at 0. I am taking issue with your use of the word 'assume', ok, not the analysis, which you get right most of the time.

I realize you don't have a high opinion of my mathematics, but try and accept it please. It's about your use of the word assume.
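The $$|x|$$ example above can be made concrete (a Python sketch of the one-sided difference quotients): continuity at 0 is easy, but the two one-sided slopes disagree, so no derivative exists there.

```python
# One-sided difference quotients of |x| at 0: they disagree,
# so |x| is continuous at 0 but not differentiable there.
def one_sided(f, x, h):
    return (f(x + h) - f(x)) / h

h = 1e-8
print(one_sided(abs, 0.0, h))   # 1.0 from the right
print(one_sided(abs, 0.0, -h))  # -1.0 from the left
```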