What is the derivative of a vector with respect to another vector?

In summary, the quantity in question is a scalar, and the book's "derivative with respect to [tex]\vec{x}[/tex]" is just its gradient, 2(\vec{x}-\vec{x_0}), which is best thought of as a covector; the book's notation is simply not clear.
  • #1
yungman
I am confused. I have never seen the derivative of a vector with respect to another vector. When I search the web, the articles just show divergence, curl, gradient, etc., but not the derivative of a vector with respect to another vector.

For example what is

[tex]\frac{d(\vec{x}-\vec{x_0})^2}{d \vec{x}} ?[/tex]

where [tex]\vec{x_0}[/tex] is a constant vector.


The book seems to imply:

[tex]\frac{d[(\vec{x}-\vec{x_0})^2]}{d \vec{x}} = 2(\vec{x}-\vec{x_0}) \frac{d \vec{x}}{d \vec{x}} = 2(\vec{x}-\vec{x_0}) [/tex]

I guess I don't know how to do a derivative like this. Can anyone help? I have looked through my multivariable calculus book and found nothing like this; the differentiation variable is always a scalar. The closest thing I have seen is:

[tex]\int_C \vec{F} \cdot d\vec{r} \;=\; \int_C \vec{F} \cdot \hat{r}dr[/tex]

But this is not exactly what the book described.

The only thing that is remotely close is the directional derivative, but I don't think that is it.
 
  • #2
Wow, I agree that is really confusing notation.
Probably, they mean
[tex](\vec x - \vec x_0)^2 = (\vec x - \vec x_0) \cdot (\vec x - \vec x_0)[/tex]
so the square is actually a scalar.

Then in components, you could write
[tex]
\left( \frac{\mathrm d [(\vec x - \vec x_0)^2] }{ \mathrm d\vec x } \right)_j =
\frac{\mathrm d [(\vec x - \vec x_0)^2] }{ \mathrm d\vec x_j } =
2(\vec x - \vec x_0)_j = 2(\vec x_j - (\vec{x_0})_j)
[/tex]

If you doubt this, you can write out
[tex](\vec x - \vec x_0) \cdot (\vec x - \vec x_0) = \left( \sum_{i = 1}^n (\vec x_i)^2 \right) - 2 \left( \sum_{i = 1}^n (\vec x_i) (\vec x_0)_i \right) + \left( \sum_{i = 1}^n ((\vec{x_0})_i)^2 \right)[/tex]
and use that
[tex]\frac{\mathrm d}{\mathrm d \vec x_j} \left( \sum_{i = 1}^n (\vec x_i)^2 \right) = 2 \vec x_j[/tex]
etc
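If it helps to see it numerically, here is a quick sanity check of that component formula. This is only a minimal sketch assuming numpy; the example vectors and step size are arbitrary choices, not anything from the book.

[code]
import numpy as np

# Arbitrary example vectors (illustration only, not from the book)
x = np.array([1.0, -2.0, 0.5])
x0 = np.array([0.3, 0.7, -1.1])

def f(x):
    """Scalar function f(x) = (x - x0) . (x - x0)."""
    return np.dot(x - x0, x - x0)

# Component formula claimed above: (grad f)_j = 2 (x - x0)_j
grad_analytic = 2.0 * (x - x0)

# Central finite differences, one component at a time
h = 1e-6
grad_numeric = np.zeros_like(x)
for j in range(len(x)):
    e = np.zeros_like(x)
    e[j] = h
    grad_numeric[j] = (f(x + e) - f(x - e)) / (2.0 * h)

print(grad_analytic)  # [ 1.4 -5.4  3.2]
print(grad_numeric)   # agrees to about 1e-9
[/code]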
 
  • #3
yungman said:
The book seems to imply ... But this is not exactly what the book described.

What book?
 
  • #4
The derivative of a vector function of a vector [tex]\vec f(\vec x)[/tex] with respect to a vector [tex]\vec x[/tex] is a 1-1 tensor, with the i,j element being

[tex]\frac{\partial f_i(\vec x)}{\partial x_j}[/tex]

However, [itex](\vec x - \vec x_0)^2[/itex] is not a vector function. It is a scalar. You are just calculating the gradient:

[tex]\nabla f(\vec x) = \sum_j \frac{\partial f(\vec x)}{\partial x_j}\hat x_j[/tex]

Note that the gradient looks a lot like a vector. It is better thought of as being a covector.

So what about the gradient of [itex]f(\vec x) = (\vec x - \vec x_0)^2[/itex]? Expanding this, we get

[tex]f(\vec x) = (\vec x - \vec x_0)\cdot (\vec x - \vec x_0) = \sum_i (x_i - x_{0,i})^2[/tex]

Taking the gradient, the jth component of the gradient is

[tex]\left(\nabla f(\vec x)\right)_j = \sum_i 2 (x_i - x_{0,i}) \frac{\partial x_i}{\partial x_j}
= \sum_i 2 (x_i - x_{0,i})\delta_{ij} = 2(x_j - x_{0,j})[/tex]
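In case a numerical illustration of the "1-1 tensor" (Jacobian matrix) statement helps, here is a minimal sketch assuming numpy; the vector function below is a made-up example, not anything from Strauss.

[code]
import numpy as np

# Made-up example vector function f: R^3 -> R^3 (illustration only)
def f(x):
    return np.array([x[0] * x[1], np.sin(x[2]), x[0] + x[2] ** 2])

def jacobian(f, x, h=1e-6):
    """Finite-difference Jacobian: entry (i, j) approximates d f_i / d x_j."""
    fx = f(x)
    J = np.zeros((len(fx), len(x)))
    for j in range(len(x)):
        e = np.zeros_like(x)
        e[j] = h
        J[:, j] = (f(x + e) - f(x - e)) / (2.0 * h)
    return J

x = np.array([1.0, 2.0, 0.5])
print(jacobian(f, x))
# Rows are components of f, columns are components of x:
# [[2.      1.      0.     ]
#  [0.      0.      0.87758]
#  [1.      0.      1.     ]]
[/code]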
 
  • #5
The book is PDE by Strauss, pp. 194-195. It is part of the derivation of the Green's function for the sphere; the part in question is about the normal derivative of G. It talks about differentiation with respect to [itex]\vec{x}[/itex] and makes some very funky statements I still don't understand. But the later part just goes back to the ordinary definition of the normal derivative:

[tex]\frac{\partial G}{\partial n} = \nabla G \cdot \hat{n}[/tex]

and derives the equation accordingly, as if nothing happened! So it is a non-question at this point. Strauss is not a good book by any stretch. I just cannot find any PDE book that covers Green's functions, and the EM book that I ordered is still in shipment!
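For what it's worth, the normal-derivative formula above is easy to check numerically. The following is only a sketch assuming numpy, with an arbitrary smooth G and a point on the unit sphere; it is not the Green's function from Strauss, just a check that [itex]\partial G/\partial n = \nabla G \cdot \hat{n}[/itex].

[code]
import numpy as np

# Arbitrary smooth scalar field G (illustration only, not the Strauss Green's function)
x0 = np.array([2.0, 0.0, 0.0])           # a fixed "source" point outside the unit sphere
def G(x):
    return 1.0 / np.linalg.norm(x - x0)

# A point on the unit sphere and its outward unit normal (for the unit sphere, n_hat = x)
x = np.array([0.0, 1.0, 0.0])
n_hat = x / np.linalg.norm(x)

h = 1e-6

# Gradient of G by central differences
grad_G = np.zeros(3)
for j in range(3):
    e = np.zeros(3)
    e[j] = h
    grad_G[j] = (G(x + e) - G(x - e)) / (2.0 * h)

# dG/dn two ways: as grad(G) . n_hat, and directly along the normal direction
dGdn_from_gradient = np.dot(grad_G, n_hat)
dGdn_direct = (G(x + h * n_hat) - G(x - h * n_hat)) / (2.0 * h)

print(dGdn_from_gradient, dGdn_direct)   # the two agree to roundoff
[/code]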

Thanks

Alan
 
  • #6
amazon.com's search function lets me look at some, but not all, of the pages in the book. Do you mean the statement
Let's not forget that [itex]\bold{x}_0[/itex] is considered to be fixed, and the derivatives are with respect [itex]\bold{x}[/itex].

on page 185?

Notice that equation (10) on page 185 gives [itex]G[/itex] as a function of both [itex]\bold{x}[/itex] and [itex]\bold{x}_0[/itex], so the quoted statement just means that normal partial derivatives, gradients, divergences, etc., are with respect to the coordinates of [itex]\bold{x}[/itex] and not with respect to the coordinates of [itex]\bold{x}_0[/itex]. The quoted statement does not actually mean "take the derivative with respect to a vector."
 
  • #7
George Jones said:
amazon.com's search function lets me look at some, but not all, of the pages in the book. Do you mean the statement


on page 185?

Notice that equation (10) on page 185 gives [itex]G[/itex] as a function of both [itex]\bold{x}[/itex] and [itex]\bold{x}_0[/itex], so the quoted statement just means that normal partial derivatives, gradients, divergences, etc., are with respect to the coordinates of [itex]\bold{x}[/itex] and not with respect to the coordinates of [itex]\bold{x}_0[/itex]. The quoted statement does not actually mean "take the derivative with respect to a vector."

Yes, that is the sentence I was referring to. I just took it literally as a derivative with respect to [itex]\vec{x}[/itex]. I have absolutely no issue with the normal derivative. That is why I posted above that I have no question about the complete derivation.

I have not gone into the exercises yet. I still have one more question, regarding a zero vector, in my other post where I got stuck. If you can help, that would be really appreciated.

Thanks.
 

What is the definition of the derivative of a vector with respect to another vector?

The derivative of a vector with respect to another vector is a mathematical concept that measures how one vector quantity changes as another vector quantity changes. It gives the rate of change of a vector-valued function of a vector variable.

How is the derivative of a vector with respect to another vector calculated?

The derivative of a vector with respect to another vector is calculated component by component using partial derivatives: each entry is the limit of the change in one component of the first vector divided by the change in one component of the second vector, as that change approaches zero. Collected together, these partial derivatives form the Jacobian matrix; for a scalar function they form the gradient.
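As a concrete illustration of that limit definition, here is a minimal sketch assuming numpy; the example function, point, and step sizes are arbitrary. One entry of the derivative is approximated by a difference quotient with a shrinking step.

[code]
import numpy as np

def f(x):
    # Example vector function of a vector (illustration only)
    return np.array([x[0] ** 2, x[0] ** 2 * x[1]])

x = np.array([3.0, 4.0])
i, j = 1, 0  # entry (i, j): partial of component i of f with respect to component j of x
             # exact value here: 2 * x[0] * x[1] = 24.0

for h in [1e-1, 1e-3, 1e-5]:
    e = np.zeros_like(x)
    e[j] = h
    quotient = (f(x + e)[i] - f(x)[i]) / h
    print(h, quotient)  # 24.4, 24.004, 24.00004 -> approaches 24.0 as h -> 0
[/code]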

What is the relationship between the derivative of a vector with respect to another vector and the dot product?

The derivative of a vector with respect to another vector is closely related to the dot product through the chain rule. For a scalar function evaluated along a curve, the derivative of the composite function is the dot product of the gradient of the function with the derivative of the curve with respect to its parameter.
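For example, the chain rule d/dt f(x(t)) = grad f(x(t)) . x'(t) can be checked numerically. This is a sketch with an arbitrary f and curve, assuming numpy.

[code]
import numpy as np

# Arbitrary scalar function and curve (illustration only)
def f(x):
    return x[0] ** 2 + 3.0 * x[1]

def grad_f(x):
    return np.array([2.0 * x[0], 3.0])

def x_of_t(t):
    return np.array([np.cos(t), t ** 2])

def dx_dt(t):
    return np.array([-np.sin(t), 2.0 * t])

t, h = 0.7, 1e-6

# Chain rule: d/dt f(x(t)) = grad f(x(t)) . dx/dt
chain_rule = np.dot(grad_f(x_of_t(t)), dx_dt(t))

# Direct finite difference of the composite function
direct = (f(x_of_t(t + h)) - f(x_of_t(t - h))) / (2.0 * h)

print(chain_rule, direct)   # the two agree to roundoff
[/code]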

What are the applications of the derivative of a vector with respect to another vector?

The derivative of a vector with respect to another vector has many applications in physics, engineering, and other fields. It is used to describe motion and velocity, forces and acceleration, and to optimize functions of several variables in multivariable calculus.

What are some common misconceptions about the derivative of a vector with respect to another vector?

One common misconception is that the derivative of a vector with respect to another vector is itself a vector. In reality, it is a matrix (the Jacobian), whose entries are the partial derivatives of each component of the first vector with respect to each component of the second. Another misconception is that the derivative is always constant, when in fact it generally varies with the point at which it is evaluated.
