Just ignore this if it's too much information; obviously it's not necessary to solve the question...
The notation del(R), where R is a vector field, is used in Davis & Snider, Introduction to Vector Analysis, for a second-order tensor, or dyadic; in Cartesian coordinates:
\nabla \textbf{R}=
\frac{\partial R_1}{\partial x}\hat{\textbf{i}}\hat{\textbf{i}}+\frac{\partial R_1}{\partial y}\hat{\textbf{j}}\hat{\textbf{i}}+\frac{\partial R_1}{\partial z}\hat{\textbf{k}}\hat{\textbf{i}}
+\frac{\partial R_2}{\partial x}\hat{\textbf{i}}\hat{\textbf{j}}+\frac{\partial R_2}{\partial y}\hat{\textbf{j}}\hat{\textbf{j}}+\frac{\partial R_2}{\partial z}\hat{\textbf{k}}\hat{\textbf{j}}
+\frac{\partial R_3}{\partial x}\hat{\textbf{i}}\hat{\textbf{k}}+\frac{\partial R_3}{\partial y}\hat{\textbf{j}}\hat{\textbf{k}}+\frac{\partial R_3}{\partial z}\hat{\textbf{k}}\hat{\textbf{k}}
which we can write more briefly, using summation signs,
\nabla \textbf{R}=\sum_{p=1}^{n}\sum_{q=1}^{n}\frac{\partial R_p}{\partial x_q} \hat{\textbf{i}}_q \hat{\textbf{i}}_p
where x_1 = x, x_2 = y, x_3 = z, and i_1 = i, i_2 = j, i_3 = k.
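If it helps to see those components laid out, here is a small numerical sketch (Python/NumPy, with a made-up field R) that assembles ∇R as an n×n array in the index ordering above; note that with this ordering the array is the transpose of the usual Jacobian matrix.

```python
import numpy as np

def grad_dyadic(R, x, h=1e-6):
    """Approximate the dyadic grad(R) at the point x by central differences.
    Component [q, p] is dR_p/dx_q, matching the i_q i_p ordering above
    (i.e. the transpose of the usual Jacobian matrix)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    G = np.zeros((n, n))
    for q in range(n):
        step = np.zeros(n)
        step[q] = h
        G[q, :] = (np.asarray(R(x + step)) - np.asarray(R(x - step))) / (2 * h)
    return G

# A made-up example field R(x, y, z) = (x*y, y*z, z*x)
R = lambda v: np.array([v[0] * v[1], v[1] * v[2], v[2] * v[0]])
print(grad_dyadic(R, [1.0, 2.0, 3.0]))
```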
The dot product of such an object with a vector is another vector, and this operation is, in general, not commutative; that is, the order matters. It's defined as follows:
\textbf{A}\textbf{B} \cdot \textbf{C} \equiv \textbf{A}(\textbf{B} \cdot \textbf{C})
\textbf{C} \cdot \textbf{A}\textbf{B} \equiv (\textbf{C} \cdot \textbf{A}) \textbf{B}
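A quick numerical check of these two definitions (NumPy, with arbitrary small vectors) makes the non-commutativity visible:

```python
import numpy as np

A = np.array([1.0, 0.0, 0.0])
B = np.array([0.0, 1.0, 0.0])
C = np.array([2.0, 3.0, 4.0])

AB = np.outer(A, B)   # the dyadic AB as a 3 x 3 array

print(AB @ C)         # AB . C = A (B . C)  ->  [3. 0. 0.]
print(C @ AB)         # C . AB = (C . A) B  ->  [0. 2. 0.]
```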
Another notation, not used by Davis & Snider, is
\textbf{A}\textbf{B} \equiv \textbf{A} \otimes \textbf{B}
Combining a pair of vectors to form a dyadic is a special case of the tensor product operation; the dyadic AB is the tensor product of A and B. If the vectors have n components, the tensor product of two of them has n^2 components. It can be convenient to arrange these as an n×n matrix; the dot product and tensor product can then be treated as matrix multiplications of the components, with A, B, and C written as column vectors:
\textbf{A}\cdot \textbf{B}=A^TB \enspace\enspace\enspace \textbf{A}\textbf{B}=AB^T
\textbf{C} \cdot \textbf{A}\textbf{B}=C^TAB^T \enspace\enspace\enspace \textbf{A}\textbf{B} \cdot \textbf{C}=AB^TC
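As a sketch of that matrix point of view (NumPy, treating A, B, C as n×1 column matrices):

```python
import numpy as np

A = np.array([[1.0], [2.0], [3.0]])
B = np.array([[4.0], [5.0], [6.0]])
C = np.array([[7.0], [8.0], [9.0]])

AdotB = A.T @ B       # A . B, a 1 x 1 matrix
AB = A @ B.T          # the dyadic AB as a 3 x 3 matrix

print(C.T @ A @ B.T)  # C . AB = (C . A) B, comes out as a row vector
print(A @ B.T @ C)    # AB . C = A (B . C), comes out as a column vector
```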
From the above definitions, considering your example from this point of view gives the same answer:
(\textbf{U} \cdot \nabla) \textbf{R} = \textbf{U} \cdot (\nabla \textbf{R})
The ith component of this vector is
\sum_{k=1}^{n}U_k \frac{\partial R_i}{\partial x_k}
and in your example the partial derivative of R_i with respect to x_k is 1 when i = k and 0 otherwise, so the resulting vector is equal to U.
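Assuming the R in your example is the position field R(x, y, z) = (x, y, z), the check is one line of NumPy:

```python
import numpy as np

U = np.array([2.0, -1.0, 5.0])   # an arbitrary constant vector
gradR = np.eye(3)                # dR_p/dx_q = 1 when p = q, 0 otherwise

print(U @ gradR)                 # U . (grad R) = (U . grad) R  ->  equals U
```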
More generally, I've seen del(T) used to mean the gradient of a tensor T of any order: the object which, when dotted with a unit vector, gives the directional derivative along the direction indicated by that vector.
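For instance, for a vector field, dotting a unit vector u into ∇R from the left reproduces the directional derivative of R along u; here is a quick finite-difference check with the same made-up field as above:

```python
import numpy as np

# Made-up field R(x, y, z) = (x*y, y*z, z*x), evaluated near a sample point
R = lambda v: np.array([v[0] * v[1], v[1] * v[2], v[2] * v[0]])
x = np.array([1.0, 2.0, 3.0])
u = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)   # unit direction vector

# Directional derivative straight from the limit definition (finite difference)
h = 1e-6
fd = (R(x + h * u) - R(x - h * u)) / (2 * h)

# grad R at x, with component [q, p] = dR_p/dx_q
G = np.array([[x[1], 0.0,  x[2]],
              [x[0], x[2], 0.0 ],
              [0.0,  x[1], x[0]]])

print(fd, u @ G)   # the two results should agree
```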