# Easy derivative but with a pesky singularity


#### snoopies622

TL;DR Summary
How to remove the singularity from the derivative of s^2=x^2+y^2
Setting: a plane with the standard Cartesian coordinate system. A particle is constrained to the x axis, with position x and moving at speed x dot. Another particle is constrained to the y axis, with position y and moving at speed y dot. The distance between them at any moment is s. It is easy to show that this distance changes at rate

$\dot{s} = \frac{x\dot{x} + y\dot{y}}{s}$

which seems fine except for the singularity when both particles are simultaneously at the origin. In that special case, with geometric reasoning one can arrive at

$\dot{s}^2 = \dot {x} ^2 + \dot {y} ^2$

But how do I remove the singularity from my first s dot equation to arrive at an equivalent form that doesn't contain the singularity (and, I assume, simplifies to the second equation in the special x=0, y=0 case)?
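As a quick numerical sanity check of that special case (a sketch with illustrative constant velocities, chosen so that both particles pass through the origin at t = 0):

```python
import math

# Illustrative constant velocities; both particles reach the origin at t = 0.
xdot, ydot = 3.0, 4.0

def s(t):
    """Separation distance between the particle at (x, 0) and the one at (0, y)."""
    return math.hypot(xdot * t, ydot * t)

# One-sided finite-difference estimate of s-dot just after the particles meet:
h = 1e-6
sdot_est = (s(h) - s(0.0)) / h
print(sdot_est)                  # ≈ 5.0
print(math.hypot(xdot, ydot))    # sqrt(xdot^2 + ydot^2) = 5.0
```

The difference quotient agrees with the geometric value $(\dot{x}^2 + \dot{y}^2)^{1/2}$, even though the formula $\dot{s} = (x\dot{x}+y\dot{y})/s$ is 0/0 at that instant.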

snoopies622 said:
TL;DR Summary: How to remove the singularity from the derivative of s^2=x^2+y^2

Setting: a plane with the standard Cartesian coordinate system. A particle is constrained to the x axis, with position x and moving at speed x dot. Another particle is constrained to the y axis, with position y and moving at speed y dot. The distance between them at any moment is s. It is easy to show that this distance changes at rate

$\dot{s} = \frac{x\dot{x} + y\dot{y}}{s}$
To get to the equation above, you're assuming that ##s \ne 0##, and hence that x and y are not both 0. Except at the origin, the equation above is equivalent to ##s\dot s = x\dot x + y\dot y## -- no singularity.
snoopies622 said:
which seems fine except for the singularity when both particles are simultaneously at the origin. In that special case, with geometric reasoning one can arrive at

$\dot{s}^2 = \dot {x} ^2 + \dot {y} ^2$

But how do I remove the singularity from my first s dot equation to arrive at an equivalent form that doesn't contain the singularity (and, I assume, simplifies to the second equation in the special x=0, y=0 case)?

you're assuming that ##s \ne 0##
I'm only assuming that $s^2 = x^2 + y^2$. I don't know why s=0 should be an exception to this relationship.

I'm only assuming that $s^2 = x^2 + y^2$. I don't know why s=0 should be an exception to this relationship.
It's not an exception. When both particles are at the origin, the distance between them is 0, and otherwise, the square of the distance between them is ##x^2 + y^2##.

Right, but you're the one who said I'm assuming that s does not equal zero.

Right, but you're the one who said I'm assuming that s does not equal zero.
No, I'm not assuming anything. What I'm saying is that with your given information, ##s^2 = x^2 + y^2##. If you differentiate with respect to t, you get ##s\dot s = x\dot x + y \dot y##. That equation is defined for all real values of x and y, so there is no singularity. When you divide both sides by s, you are tacitly assuming that ##s \ne 0##, but ##s = 0## does happen when both particles are at the origin, so the division isn't valid there. In the undivided form, again, there is no singularity.

I understand. My question is, since $s\dot s = x\dot x + y \dot y$ won't tell us $\dot s$ when s=0, how can we arrive at an equivalent formula that does? Can we mathematically manipulate $s\dot s = x\dot x + y \dot y$ in some way to get there?

When we divide both sides by s, then we have that singularity at s=0. But it seems to me that this is not an asymptotic situation, so that singularity should be removable. But I don't know how to do it.

I understand. My question is, since $s\dot s = x\dot x + y \dot y$ won't tell us $\dot s$ when s=0, how can we arrive at an equivalent formula that does?
I don't see how we can. All we know is that the particles are moving along the respective axes. We don't even know which direction they're going.
snoopies622 said:
Can we mathematically manipulate $s\dot s = x\dot x + y \dot y$ in some way to get there?
Not as far as I can tell.
snoopies622 said:
When we divide both sides by s, then we have that singularity at s=0. But it seems to me that this is not an asymptotic situation, so that singularity should be removable. But I don't know how to do it.
It seems to me that there isn't enough information given in the problem.

P.S. -- I find it simpler to use double # characters to surround inline TeX rather than typing itex and /itex inside brackets.

Can't we exclude ##s=0## simply because two different particles can't occupy the same location?

Or we can think of this as a limit problem, with the only difference from the limit problems I've been exposed to being that the limit involves two variables instead of one. That is, if some quantity z is defined by

$$z = \frac {x \dot{x} + y \dot{y} } { (x^2+y^2) ^{1/2}}$$

can we show that the limit of z as both x and y approach zero is $(\dot{x} ^2 + \dot {y}^2)^{1/2}$ ?
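One way to probe this two-variable limit numerically (a sketch, with an assumed straight-line approach x = ẋt, y = ẏt at illustrative constant velocities): along such a path, z collapses to a constant on each side of t = 0.

```python
import math

xdot, ydot = 1.0, 2.0   # illustrative constant velocities

def z(t):
    # z = (x*xdot + y*ydot) / sqrt(x^2 + y^2) along the path x = xdot*t, y = ydot*t
    xv, yv = xdot * t, ydot * t
    return (xv * xdot + yv * ydot) / math.hypot(xv, yv)

target = math.hypot(xdot, ydot)
print(z(1e-3) - target)    # ≈ 0: from the right, z is already  sqrt(xdot^2 + ydot^2)
print(z(-1e-3) + target)   # ≈ 0: from the left,  z is         -sqrt(xdot^2 + ydot^2)
```

So along a straight-line path the one-sided limits exist but differ in sign, which means the two-sided limit does not exist; the limit from t > 0 alone does give $(\dot{x}^2+\dot{y}^2)^{1/2}$.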

two different particles can't occupy the same location
Can't argue with that.

Can't we exclude ##s=0## simply because two different particles can't occupy the same location?
Yes, but that sidesteps the math problem as presented. What if they could, like two laser dots hitting a plane? The physical problem should have a well-defined rate of distance change there. It seems there ought to be an equation for it at that point.

No, I'm not assuming anything. What I'm saying is that with your given information, ##s^2 = x^2 + y^2##. If you differentiate with respect to t, you get ##s\dot s = x\dot x + y \dot y##. That equation is defined for all real values of x and y, so there is no singularity. When you divide both sides by s, you are tacitly assuming that ##s \ne 0##, but ##s = 0## does happen when both particles are at the origin, so the division isn't valid there. In the undivided form, again, there is no singularity.
Yes, but that's just a consequence of the form you chose for the expression. It doesn't address the underlying problem: if I tell you ##(x,y) = (0,0)## and also give you ##\dot{x}(0)## and ##\dot{y}(0)##, how do you solve for ##\dot{s}## in the same manner as you would at any other location? Why isn't the derivative well defined there?

It may no longer be a singular expression, but it's still not solvable.

Regarding entry #10, perhaps converting from Cartesian to polar coordinates might work, since then it would only be a matter of having one variable approach zero (r), while theta is arbitrary. Will investigate.

edit: oops, that assumes x and y here represent the same point instead of two different points. Nonetheless, I think the solution must lie in a coordinate transformation, maybe a more complicated one.

Last edited:
Or, since

$ds = [(x + dx)^2 + (y+dy)^2]^{1/2} - [x^2 + y^2]^{1/2}$

the solution might lie in expanding a square root with six terms (or fewer) under the radical. Is this possible?

Regarding entry #10, perhaps converting from Cartesian to polar coordinates might work, since then it would only be a matter of having one variable approach zero (r), while theta is arbitrary. Will investigate.

edit: oops, that assumes x and y here represent the same point instead of two different points. Nonetheless, I think the solution must lie in a coordinate transformation, maybe a more complicated one.
Yes, polar doesn't help. Since the arguments are constrained to 0 and π/2, all of the trig terms drop out. You end up with this.
$$\dot{s} = \frac{r_x \dot{r}_x + r_y \dot{r}_y}{\sqrt{r_x^2 + r_y^2}}$$
As it should be, in retrospect.

Last edited:
The problem lies with Pythagoras and derivatives. Any distance function of the form ##s=\sqrt{f(x,y)}## will have ##\dot{s} = \frac{1}{2s}\,\frac{d}{dt}f(x,y)##.

If you treat everything as vectors, at the origin you get ##\dot{s} = \sqrt{\dot{x}^2 + \dot{y}^2}##, which is your original geometrical argument. It's not singular, but if you want to know ##\dot{s}## at (0,0) you have to figure out what ##\dot{x}## and ##\dot{y}## are there, which should be a matter of simple ODEs, e.g. parameterizing with time. You would have to do that anyway with the other formula to know what ##\dot{x}## and ##\dot{y}## values to enter at any point.

I'm stuck with the algebraic approach.

edit: Oops, typo: I left out the squared part ##\dot{s} = \sqrt{\dot{x}^2 + \dot{y}^2}##

Last edited:
L'Hopital's rule should work.

If we see it as a particle moving along a cone, it seems we could just work on ##(0, t]##. At ##t=0##, it hasn't begun moving?

L'Hopital's rule should work.
L'Hôpital's Rule works! If we start with $$\dot {s} = \frac {x \dot{x} + y \dot{y}}{s}$$

and take the time derivative of both the top and bottom of the fraction, we get

$$\dot{s} = \frac{\dot{x}^2 + x \ddot{x} + \dot{y}^2 + y \ddot{y}}{\dot{s}}$$

which, letting both x and y go to zero and multiplying both sides by $\dot{s}$, yields

$\dot{s} ^2 = \dot{x} ^2 + \dot{y} ^2$

as desired. Thanks, mathman!

(edit: I realize that second equation isn't strictly correct as written; I just haven't learned the LaTeX for limits yet)
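A finite-difference sketch consistent with the L'Hôpital result above, using assumed accelerating motions (so the ##x\ddot{x}## and ##y\ddot{y}## terms are nonzero away from the origin but drop out in the limit):

```python
import math

# Illustrative motions with nonzero accelerations; both particles at the origin at t = 0.
x = lambda t: 1.0 * t + 0.5 * t**2     # xdot(0) = 1
y = lambda t: 2.0 * t - 0.25 * t**2    # ydot(0) = 2
s = lambda t: math.hypot(x(t), y(t))   # separation distance

h = 1e-6
sdot = (s(h) - s(0.0)) / h   # one-sided difference quotient at t = 0
print(sdot**2)               # ≈ 1**2 + 2**2 = 5
```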

Last edited:
TL;DR Summary: How to remove the singularity from the derivative of s^2=x^2+y^2

Setting: a plane with the standard Cartesian coordinate system. A particle is constrained to the x axis, with position x and moving at speed x dot. Another particle is constrained to the y axis, with position y and moving at speed y dot. The distance between them at any moment is s.

$s$ is also the distance from $(x,y)$ to the origin, so if $x = s \cos \phi$, $y = s \sin \phi$ we have essentially changed to polar coordinates. These have a coordinate singularity at the origin, so things can often look horrible there even if (as here) they look fine in Cartesians. As a case in point, consider $(x,y) = (t,t)$. There are no issues in Cartesians (the derivative at $t = 0$ is $(1,1)$), but in polars we have, for any choice of $c \in [0, 2\pi)$, $$(s,\phi) = \begin{cases} (\sqrt{2}t, \frac{\pi}{4}), & t > 0, \\ (0, c), & t = 0,\\ (-\sqrt{2}t, \frac{5\pi}{4}), & t < 0 \end{cases}$$ and neither $s(t)$ nor $\phi(t)$ is differentiable at $t = 0$.
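The kink in $s$ for that example can be checked directly (a small sketch): with both coordinates equal to $t$, $s(t) = \sqrt{2}\,|t|$, and the one-sided derivatives at $t = 0$ disagree.

```python
import math

# s(t) = sqrt(t^2 + t^2) = sqrt(2)*|t| along the path (x, y) = (t, t)
s = lambda t: math.hypot(t, t)

h = 1e-6
right = (s(h) - s(0.0)) / h    # ->  sqrt(2)
left  = (s(0.0) - s(-h)) / h   # -> -sqrt(2)
print(right, left)             # one-sided derivatives differ: s is not differentiable at 0
```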

$s$ is also the distance from $(x,y)$ to the origin
Thanks pasmith, that puts this problem into a new light.

If we see it as a particle moving along a cone, it seems we could just work on ##(0, t]##. At ##t=0##, it hasn't begun moving?
In this way, we could extend by continuity: ##(x_n \rightarrow 0) \Rightarrow (f(x_n) \rightarrow f(0))##. Same for y.

Last edited:
If we see it as a particle moving along a cone...
I'm not clear on this. Are you suggesting adding a third dimension to the manifold?

I just noticed that $$\frac{x \dot{x} + y \dot{y}}{s} = \frac{\langle x, y \rangle}{s} \cdot \langle \dot{x}, \dot{y} \rangle$$

If we imagine this problem as finding the speed at which a particle located at (x,y) is moving away from (0,0), this is the inner product of its normalized position vector and its velocity vector, which makes sense. Neat.
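A quick check of this identity away from the origin (a sketch with arbitrary illustrative values):

```python
import math

x, y = 3.0, -1.5        # illustrative position (so s != 0)
xdot, ydot = 0.7, 2.0   # illustrative velocity

s = math.hypot(x, y)
lhs = (x * xdot + y * ydot) / s           # the s-dot formula
unit = (x / s, y / s)                     # normalized position vector <x, y>/s
rhs = unit[0] * xdot + unit[1] * ydot     # dot product with velocity <xdot, ydot>
print(abs(lhs - rhs))                     # ≈ 0
```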

Last edited: