# Vector analysis

1. Jun 4, 2010

### rafaelpol

1. The problem statement, all variables and given/known data

Verification of the product U . del (R) = U

Vectors are written in bold

2. Relevant equations

R = ix + jy + kz

3. The attempt at a solution

I cannot understand what is going on. If del R is the gradient vector of R, then the problem becomes a scalar product, whose answer is Ux + Uy + Uz, not a vector as it is written in the book. And if del R means the scalar product of del and R, then the answer will indeed be a vector, but it will be 3U, not just U.

2. Jun 4, 2010

### Office_Shredder

Staff Emeritus
Since R is a vector, you can't take the gradient of it. Other than that, I agree with your analysis.

3. Jun 4, 2010

### HallsofIvy

"del(R)" does not appear to me to be standard notation. Go back to the textbook or other source of this problem to determine what it means.

4. Jun 4, 2010

### rafaelpol

That is exactly the problem. After solving other exercises in the book, I found that this really is a notation issue. To obtain the correct answer, one has to take the dot product of the vector U with del, and then apply the resulting operator to the vector R. The answer is then the vector U. Anyway, thank you very much for the answers.
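As a quick sketch (not from the book), this reading of the notation can be checked symbolically: build the scalar operator U . del = U1 d/dx + U2 d/dy + U3 d/dz and apply it to each component of R = ix + jy + kz.

```python
# Hypothetical SymPy check: apply the operator (U . del) to R = ix + jy + kz.
import sympy as sp

x, y, z = sp.symbols('x y z')
U1, U2, U3 = sp.symbols('U1 U2 U3')

R = sp.Matrix([x, y, z])    # R = ix + jy + kz
U = sp.Matrix([U1, U2, U3])

# (U . del) is the scalar operator U1 d/dx + U2 d/dy + U3 d/dz;
# apply it to each component of R in turn.
result = sp.Matrix([U1 * sp.diff(Ri, x)
                    + U2 * sp.diff(Ri, y)
                    + U3 * sp.diff(Ri, z) for Ri in R])

print(result)  # Matrix([[U1], [U2], [U3]]), i.e. the vector U
```

Each component of R depends on exactly one coordinate with derivative 1, so the operator just picks out the matching component of U.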

5. Jun 4, 2010

### Rasalhague

Just ignore this if it's too much information; obviously it's not necessary to solve the question...

The notation del(R), where R is a vector field, is used in Davis & Snider, Introduction to Vector Analysis, for a second-order tensor, or dyadic; thus, in Cartesian coordinates:

$$\nabla \textbf{R}=$$

$$\frac{\partial R_1}{\partial x}\hat{\textbf{i}}\hat{\textbf{i}}+\frac{\partial R_1}{\partial y}\hat{\textbf{j}}\hat{\textbf{i}}+\frac{\partial R_1}{\partial z}\hat{\textbf{k}}\hat{\textbf{i}}$$

$$+\frac{\partial R_2}{\partial x}\hat{\textbf{i}}\hat{\textbf{j}}+\frac{\partial R_2}{\partial y}\hat{\textbf{j}}\hat{\textbf{j}}+\frac{\partial R_2}{\partial z}\hat{\textbf{k}}\hat{\textbf{j}}$$

$$+\frac{\partial R_3}{\partial x}\hat{\textbf{i}}\hat{\textbf{k}}+\frac{\partial R_3}{\partial y}\hat{\textbf{j}}\hat{\textbf{k}}+\frac{\partial R_3}{\partial z}\hat{\textbf{k}}\hat{\textbf{k}}$$

which we can write more briefly, using summation signs,

$$\nabla \textbf{R}=\sum_{p=1}^{n}\sum_{q=1}^{n}\frac{\partial R_p}{\partial x_q} \hat{\textbf{i}}_q \hat{\textbf{i}}_p$$

where x1 = x, x2 = y, x3 = z, and i1 = i, i2 = j, i3 = k.

The dot product of such an object with a vector is another vector, and this operation is, in general, not commutative; that is, the order matters. It's defined as follows:

$$\textbf{A}\textbf{B} \cdot \textbf{C} \equiv \textbf{A}(\textbf{B} \cdot \textbf{C})$$

$$\textbf{C} \cdot \textbf{A}\textbf{B} \equiv (\textbf{C} \cdot \textbf{A}) \textbf{B}$$

Another notation, not used by Davis & Snider, is

$$\textbf{A}\textbf{B} \equiv \textbf{A} \otimes \textbf{B}$$

Combining a pair of vectors to form a dyadic is a special case of the tensor product operation; the dyadic AB is the tensor product of A and B. If the vectors have n components, a tensor product of two of them will have n² components. It can be convenient to represent these as an n×n matrix; the dot product and tensor product can then be treated as matrix multiplication of components:

$$\textbf{A}\cdot \textbf{B}=A^TB \enspace\enspace\enspace \textbf{A}\textbf{B}=AB^T$$

$$\textbf{C} \cdot \textbf{A}\textbf{B}=C^TAB^T \enspace\enspace\enspace \textbf{A}\textbf{B} \cdot \textbf{C}=AB^TC$$
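A small numeric sketch (arbitrary example vectors, not from the book) of this matrix picture: the dyadic AB is the outer product A Bᵀ, and the two one-sided dot products reduce to ordinary matrix multiplication.

```python
# Numeric check of the dyadic identities AB.C = A(B.C) and C.AB = (C.A)B,
# representing the dyadic AB as the outer product matrix A B^T.
import numpy as np

A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])
C = np.array([7.0, 8.0, 9.0])

AB = np.outer(A, B)   # the dyadic AB as an n x n matrix

# AB . C = A (B . C): matrix-times-vector on the right
assert np.allclose(AB @ C, A * (B @ C))

# C . AB = (C . A) B: vector-times-matrix on the left
assert np.allclose(C @ AB, (C @ A) * B)
```

Note that the order matters, as stated above: `AB @ C` is a multiple of A, while `C @ AB` is a multiple of B.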

From the above definitions, considering your example from this point of view gives the same answer:

$$(\textbf{U} \cdot \nabla) \textbf{R} = \textbf{U} \cdot (\nabla \textbf{R})$$

The ith component of this vector is

$$\sum_{k=1}^{n}U_k \frac{\partial R_i}{\partial x_k}$$

and in your example, the partial derivative of R_i with respect to x_k is 1 when i = k and 0 otherwise, so the resulting vector is equal to U.
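The component formula above can also be sketched in matrix form (again with SymPy, assuming R = (x, y, z)): the matrix J with entries J[i, k] = ∂R_i/∂x_k is the identity here, so multiplying it by U returns U.

```python
# Matrix-form check of sum_k U_k dR_i/dx_k using SymPy's jacobian.
import sympy as sp

x, y, z = sp.symbols('x y z')
U1, U2, U3 = sp.symbols('U1 U2 U3')

R = sp.Matrix([x, y, z])
U = sp.Matrix([U1, U2, U3])

J = R.jacobian([x, y, z])   # J[i, k] = dR_i/dx_k; the identity matrix here
print(J * U)                # row i is sum_k U_k dR_i/dx_k, which equals U
```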

More generally, I've seen del(T) used to mean the gradient of a tensor T of any order: the object which, when dotted with a unit vector, gives the directional derivative along the direction indicated by that vector.

6. Jun 5, 2010

### rafaelpol

Thank you very much for the extra information (tensors are the next topic of my course).