# Homework Help: Derivative of Dot Product via Product Rule, commutative?

1. Jan 8, 2016

### Ocata

1. The problem statement, all variables and given/known data

Basically, I'm looking at the property that says if the magnitude of a vector-valued function is constant, then the function dotted with its derivative is zero. But I'm stuck near the end, because the proof I found online seems to skip a step I'm not certain about.

r(t) = <x(t),y(t) >

r(t) ⋅ r(t) = <x(t),y(t) > ⋅ <x(t),y(t) >
= $x(t)^{2} + y(t)^{2} = \left(\sqrt{x(t)^{2} + y(t)^{2}}\right)^{2}$
= $\|<x(t),y(t)>\|^{2} = \|r(t)\|^{2}$ = constant (suppose)

that is,
r(t) ⋅ r(t) = constant

Then, the derivative of r(t) ⋅ r(t):

$\frac{d}{dt}\left[r(t) ⋅ r(t)\right] = \frac{d}{dt} c$

r'(t) ⋅ r(t) + r(t) ⋅ r'(t) = 0

After this is where I'm stuck.

The proofs I've seen online then say:

2r(t) ⋅ r'(t) = 0

r(t) ⋅ r'(t) = 0

But, algebraically, can this only be true if:

r'(t) ⋅ r(t) + r(t) ⋅ r'(t) = r'(t) ⋅ r(t) + r'(t) ⋅ r(t) ?

I know that the dot product is commutative such that:

v ⋅ u = u ⋅ v

But, does the commutative property for the dot product extend to the product rule for dot product of vector valued functions?
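As a quick numeric sanity check of the commutativity question (my own example, not part of the proof), one can pick arbitrary vectors standing in for r(t) and r'(t) at some fixed t and verify that swapping the order of the dot product changes nothing, so the two middle terms in the product rule really are equal:

```python
# Hypothetical check: the dot product of ordinary vectors is commutative,
# and r(t) and r'(t) evaluated at a fixed t are just ordinary vectors.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

u = (3.0, -1.5)   # stands in for r'(t) at some fixed t
v = (0.5, 2.0)    # stands in for r(t) at the same t

assert dot(u, v) == dot(v, u)                  # u . v == v . u
assert dot(u, v) + dot(v, u) == 2 * dot(u, v)  # the step the proofs use
```

This only samples one pair of vectors, of course; the algebraic proof below is what actually settles it.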

2. Relevant equations

v ⋅ u = u ⋅ v

3. The attempt at a solution

since r'(t) = < x'(t),y'(t)>

then, if the commutative property applies here:

r'(t) ⋅ r(t) + r(t) ⋅ r'(t) = r'(t) ⋅ r(t) + r'(t) ⋅ r(t)

The only way I can believe this to be true is if I prove it somehow.

I will try to break it down into components and rearrange the second term on the left-hand side so that it resembles the first term. Not sure if it will work, but I'll give it a try.

<x'(t),y'(t) > ⋅ <x(t),y(t) > + <x(t),y(t) > ⋅ <x'(t),y'(t)> = <x'(t),y'(t) > ⋅ <x(t),y(t) > + <x'(t),y'(t) > ⋅ <x(t),y(t) >

(x'(t)x(t) + y'(t)y(t)) + (x(t)x'(t) + y(t)y'(t)) = (x'(t)x(t) + y'(t)y(t)) + (x'(t)x(t) + y'(t)y(t))

Now, since x(t) and x'(t) are just real numbers at each t, multiplication of them commutes: x(t)x'(t) = x'(t)x(t)

and so, x(t)x'(t) + y(t)y'(t) = x'(t)x(t) + y'(t)y(t)

Then,

(x'(t)x(t) + y'(t)y(t)) + (x(t)x'(t) + y(t)y'(t)) = (x'(t)x(t) + y'(t)y(t)) + (x'(t)x(t) + y'(t)y(t))

can be written as:

(x'(t)x(t) + y'(t)y(t)) + (x(t)x'(t) + y(t)y'(t)) = (x'(t)x(t) + y'(t)y(t)) + (x(t)x'(t) + y(t)y'(t))

And thus:

<x'(t),y'(t) > ⋅ <x(t),y(t) > + <x(t),y(t) > ⋅ <x'(t),y'(t)> = <x'(t),y'(t) > ⋅ <x(t),y(t) > + <x(t),y(t) > ⋅ <x'(t),y'(t)>

r'(t) ⋅ r(t) + r(t) ⋅ r'(t) = r'(t) ⋅ r(t) + r(t) ⋅ r'(t)

In conclusion, since the dot product is commutative on its own, it remains commutative when it appears inside a larger expression such as the product rule, that is:

r'(t) ⋅ r(t) + r(t) ⋅ r'(t) = r'(t) ⋅ r(t) + r'(t) ⋅ r(t)

And therefore,

It can be written that

r'(t) ⋅ r(t) + r(t) ⋅ r'(t) = r'(t) ⋅ r(t) + r'(t) ⋅ r(t) = 2(r'(t) ⋅ r(t))

Is this correct reasoning?
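To see the whole argument play out numerically, here is a sketch with a concrete constant-magnitude curve of my own choosing (not from the thread): r(t) = <cos t, sin t>, which has ||r(t)|| = 1 for all t, with r'(t) = <-sin t, cos t>:

```python
import math

# r(t) = <cos t, sin t> has constant magnitude 1 for all t.
def r(t):
    return (math.cos(t), math.sin(t))

# Its derivative, component by component.
def r_prime(t):
    return (-math.sin(t), math.cos(t))

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

for t in (0.0, 0.7, 1.3, 2.9):
    # Product rule pieces: r'.r + r.r' collapses to 2(r'.r) by commutativity.
    lhs = dot(r_prime(t), r(t)) + dot(r(t), r_prime(t))
    assert math.isclose(lhs, 2 * dot(r_prime(t), r(t)), abs_tol=1e-12)
    # And since ||r(t)|| is constant, r(t) . r'(t) = 0.
    assert abs(dot(r(t), r_prime(t))) < 1e-12
```

The assertions pass for every sampled t, consistent with the algebraic conclusion above, though of course a numeric check of one curve is illustration, not proof.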

Last edited: Jan 8, 2016
2. Jan 8, 2016

### Staff: Mentor

You have two vectors, $\vec{r}'$ and $\vec{r}$, and you know the dot product is commutative, hence you have $2\,\vec{r}'\cdot\vec{r} = 0$, which means that either $\vec{r}$ or $\vec{r}'$ is the zero vector or the angle between them is $\pi/2$, i.e. they are perpendicular.

3. Jan 8, 2016

### Ray Vickson

You say that you believe $\vec{u} \cdot \vec{v} = \vec{v} \cdot \vec{u}$---which is true. What is preventing you from putting $\vec{u} = \vec{r}(t)$ and $\vec{v} = \vec{r}'(t)\:$?

4. Jan 8, 2016

### vela

Staff Emeritus
How do you know that $\vec{v}\cdot\vec{u} = \vec{u}\cdot\vec{v}$? If you understand how to prove that, the answer to your question should be clear.

You don't need to suppose anything. You should know the reason why this is true.

5. Jan 9, 2016

### Ocata

Jedishrfu, Ray Vickson, and Vela,

Thank you.