Divergence of a tensor?

  1. Feb 28, 2006 #1
    So here's my problem. It may be very simple, but I don't know how to do it. Please help.
    Suppose [tex]\tau[/tex] is a 3x3 matrix with elements listed as (a b c; d e f; g h i). What would the answer to [tex]\nabla\cdot\tau[/tex] be?


  3. Mar 4, 2006 #2
    Depends what you're contracting over. You can't really use the dot product on a matrix, because it's not a rank 1 tensor.

    [tex]\tau[/tex] has indices [tex]\tau_{ab}[/tex] or [tex]\tau^{a}_{\phantom{a}b}[/tex] or [tex]\tau^{ab}[/tex]. Similarly [tex]\nabla[/tex] is either [tex]\nabla_{c}[/tex] or [tex]\nabla^{c}[/tex].

    (Give or take a raising or lowering of an index using a metric) you'd define the divergence as something like [tex]\nabla^{a}\tau_{ab}[/tex]. That isn't the same as [tex]\nabla^{b}\tau_{ab}[/tex] unless [tex]\tau[/tex] is symmetric.

    You don't get this ambiguity when [tex]\tau[/tex] is a rank 1 tensor, because [tex]\nabla \cdot \tau = \nabla^{a}\tau_{a} = \nabla_{a}\tau^{a}[/tex].
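    Here's a quick way to see the two contractions disagree, as a symbolic sketch in Python with sympy. The entries of [tex]\tau[/tex] below are made up purely for illustration; any non-symmetric choice will do.

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
coords = (x, y, z)

# A made-up NON-symmetric tensor field tau_{ab} (entries chosen for illustration)
tau = sp.Matrix([[0,   x*y, 0],
                 [z,   0,   0],
                 [0,   y,   0]])

# Contract over the first index:  (div1)_b = sum_a  d(tau[a,b]) / d(x_a)
div_first = [sum(sp.diff(tau[a, b], coords[a]) for a in range(3))
             for b in range(3)]

# Contract over the second index: (div2)_a = sum_b  d(tau[a,b]) / d(x_b)
div_second = [sum(sp.diff(tau[a, b], coords[b]) for b in range(3))
              for a in range(3)]

print(div_first)   # [0, y, 0]
print(div_second)  # [x, 0, 1]  -- different, since tau is not symmetric
```

    For a symmetric [tex]\tau[/tex] the two lists would come out identical, which is why the ambiguity goes unnoticed in many physics applications (e.g. symmetric stress tensors).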
  4. Mar 5, 2006 #3
    A nice way to think about this is to treat it as a matrix multiplying a vector. If you imagine multiplying a vector [tex]v[/tex] by [tex]\tau[/tex], you could write it as

    [tex]\tau \cdot v[/tex]

    So if we think of [tex]\nabla[/tex] as the 'vector' [tex]( d/dx, d/dy, d/dz )[/tex], then we just multiply that on the left of [tex]\tau[/tex] in the same way. I put the word 'vector' in inverted commas because what you've really got is a covector, or an element of the dual space - this is related to the fact that it is written lying on its side, as a row rather than a column.

    After you'd done the multiplication, you'd end up with another covector (we can just pretend it's the same thing as a vector) which looks like

    [tex]\nabla \cdot \tau = \left( \frac{\partial a}{\partial x} + \frac{\partial d}{\partial y} + \frac{\partial g}{\partial z},\ \frac{\partial b}{\partial x} + \frac{\partial e}{\partial y} + \frac{\partial h}{\partial z},\ \frac{\partial c}{\partial x} + \frac{\partial f}{\partial y} + \frac{\partial i}{\partial z} \right)[/tex]
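    The row-vector-times-matrix picture is easy to check symbolically. Here's a sketch in Python with sympy; the entries of [tex]\tau[/tex] are arbitrary made-up functions standing in for (a b c; d e f; g h i).

```python
import sympy as sp

x, y, z = sp.symbols('x y z')

# Stand-ins for the entries (a b c; d e f; g h i), chosen for illustration
tau = sp.Matrix([[x**2, x*y,  0],
                 [y*z,  y**2, x],
                 [0,    z,    z**2]])

# Treat nabla as the row 'vector' (d/dx, d/dy, d/dz) and multiply on the left:
# the b-th component of the result is  sum_a  d(tau[a,b]) / d(x_a)
nabla_dot_tau = sp.Matrix([[sum(sp.diff(tau[a, b], c)
                                for a, c in enumerate((x, y, z)))
                            for b in range(3)]])

print(nabla_dot_tau)  # a 1x3 row, i.e. a covector
```

    Each output component is the ordinary divergence of one column of [tex]\tau[/tex], which matches the index expression [tex]\nabla^{a}\tau_{ab}[/tex] from the earlier reply.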

    Last edited: Mar 5, 2006