# Divergence of vector/tensor

Hi guys, I'm trying to solve a problem in MHD, and I realised I need to be able to take the divergence of the following integral, but I don't know how to do it.
M is a symmetric rank-2 tensor and r is the position vector.

The integral is as follows
$$\int_{\partial V} \left(\textbf{r}\, d\textbf{S} \cdot \textbf{M} + d\textbf{S} \cdot \textbf{M}\,\textbf{r}\right)$$
I need to somehow manipulate this to get $$\int_V \left\{(\nabla \cdot \textbf{M})\,\textbf{r} + \textbf{r}\,(\nabla \cdot \textbf{M}) + 2\textbf{M}\right\}\,dV$$

Thanks

## Answers and Replies

The divergence theorem states that the volume integral of the divergence of a vector or tensor field over $V$ equals the surface integral over $\partial V$ of the field contracted with the outward surface element: $\int_V \nabla \cdot \textbf{T}\, dV = \oint_{\partial V} d\textbf{S} \cdot \textbf{T}$. So split your integral into the sum of two integrals, apply the divergence theorem to each, and expand the resulting derivatives with the product rule. Since $\textbf{r}$ is the position vector, $\partial_k r_i = \delta_{ki}$, and the symmetry of $\textbf{M}$ is what combines the two leftover terms into $2\textbf{M}$.
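To make the suggested steps concrete, here is a sketch of the pointwise identity you get after applying the divergence theorem componentwise, checked symbolically with sympy. It verifies $\partial_k (r_i M_{kj} + M_{ik} r_j) = r_i (\nabla\cdot\textbf{M})_j + (\nabla\cdot\textbf{M})_i r_j + 2 M_{ij}$ for a generic symmetric $\textbf{M}$, assuming $\textbf{r}$ is the position vector (the component names like `M01` are just illustrative labels).

```python
# Symbolic check of the integrand identity behind the divergence-theorem
# manipulation, for a symmetric rank-2 tensor field M and position vector r.
import sympy as sp

x, y, z = sp.symbols('x y z')
coords = (x, y, z)
r = sp.Matrix([x, y, z])  # position vector, so d_k r_i = delta_ki

# Symmetric tensor field: M[i][j] and M[j][i] are the same function.
Mf = [[sp.Function(f'M{min(i, j)}{max(i, j)}')(x, y, z) for j in range(3)]
      for i in range(3)]

# (div M)_j = sum_k d_k M_kj  (contraction on the first index)
divM = [sum(sp.diff(Mf[k][j], coords[k]) for k in range(3)) for j in range(3)]

for i in range(3):
    for j in range(3):
        # LHS: d_k ( r_i M_kj + M_ik r_j ), the divergence of the two
        # surface-integrand pieces after Gauss's theorem.
        lhs = sum(sp.diff(r[i] * Mf[k][j] + Mf[i][k] * r[j], coords[k])
                  for k in range(3))
        # RHS: claimed volume integrand r_i (divM)_j + (divM)_i r_j + 2 M_ij
        rhs = r[i] * divM[j] + divM[i] * r[j] + 2 * Mf[i][j]
        assert sp.simplify(lhs - rhs) == 0
print("identity verified componentwise")
```

Integrating this pointwise identity over $V$ and applying the divergence theorem in reverse to the left-hand side reproduces exactly the surface integral in the question.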