What is the meaning of \textbf{nn} in matrix multiplication?


Discussion Overview

The discussion revolves around the interpretation of the expression \textbf{nn} in the context of matrix multiplication, particularly in relation to a mathematical paper. Participants are trying to clarify the meaning of \textbf{nn} and its implications for matrix operations involving vectors and matrices.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant expresses confusion about the meaning of \textbf{nn} in the expressions involving vectors \textbf{u}, \textbf{U}, and the identity matrix \textbf{I}, questioning whether it refers to matrix multiplication.
  • Another participant suggests that the dot in the equations indicates multiplication of matrices, but is uncertain about the distinction between row and column vectors.
  • A different participant proposes that \textbf{nn} could represent the outer product of the vector \textbf{n} with itself, resulting in a 3x3 matrix.
  • One participant mentions that the notation \textbf{e} represents the rate of strain tensor, which is a 3x3 matrix derived from the gradient of the vector \textbf{u}.
  • Another participant points out that the notation \textbf{nn} could be interpreted as a summation over indices, suggesting a specific form for the matrix multiplication involving \textbf{n}.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the meaning of \textbf{nn} or its implications for the matrix operations discussed. Multiple interpretations and uncertainties remain regarding the notation and its application.

Contextual Notes

Participants note that the context of the original paper may provide additional insights, but there are unresolved questions about the definitions and implications of the notation used.

rsq_a
I can't figure out what an author means by this expression:

\textbf{n} \cdot \textbf{e} \cdot (\textbf{I} - \textbf{nn})

and

\left(\textbf{u} - \textbf{U}\right) \cdot (\textbf{I} - \textbf{nn})

Here, all I know is that \textbf{u} and \textbf{U} are vectors of length 3. \textbf{n} is a unit normal, so also a vector of length 3. \textbf{I} I'm assuming is a 3x3 identity matrix. The author has also written that \textbf{e} = 1/2 (\nabla \textbf{u} + (\nabla \textbf{u})^T), so I guess that's a 3x3 matrix.

But what does \textbf{nn} even mean? \textbf{n}\textbf{n}^T makes sense to me (giving a 3x3 matrix).

But then what does it mean to take the dot product of a 3x3 matrix with a 3x3 matrix? Is the author simply referring to matrix multiplication in

\left(\textbf{u} - \textbf{U}\right) \cdot (\textbf{I} - \textbf{nn}) = \left(\textbf{u} - \textbf{U}\right)(\textbf{I} - \textbf{n}\textbf{n}^T)?
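Assuming \textbf{nn} is the outer product \textbf{n}\textbf{n}^T (the interpretation proposed later in the thread), a minimal NumPy sketch shows why \textbf{I} - \textbf{nn} is useful: it projects a vector onto the plane perpendicular to the unit normal \textbf{n}. The vectors here are made-up example values.

```python
import numpy as np

# A unit normal n; nn is then the outer product n n^T, a 3x3 matrix.
n = np.array([0.0, 0.0, 1.0])
nn = np.outer(n, n)

# I - nn projects onto the plane perpendicular to n.
P = np.eye(3) - nn

# Applying it to an example vector u removes the component along n.
u = np.array([1.0, 2.0, 3.0])
u_tangential = u @ P
print(u_tangential)  # [1. 2. 0.]
```

Under this reading, (\textbf{u} - \textbf{U}) \cdot (\textbf{I} - \textbf{nn}) is just the tangential part of the relative velocity.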
 
Why are you so cryptic about "an author"? Telling us what is the exact context of all this may save our time.
 
arkajad said:
Why are you so cryptic about "an author"? Telling us what is the exact context of all this may save our time.

I didn't see it as relevant. The equation(s) can be found on p.5 http://www.maths.nottingham.ac.uk/personal/pmzjb1/ejam_new.pdf.
 
From other formulas in this paper a lot can be guessed. The dot simply means multiplication of one matrix by another, for instance vector·matrix = vector. I am not sure whether there is a difference between row and column vectors, but I guess there is.

I could not decode

\textbf{e} = 1/2 (\nabla \textbf{u} + (\nabla \textbf{u})^T)

but it can be decoded by looking elsewhere for "rate of strain tensor".
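For concreteness, here is a sketch of the rate-of-strain tensor \textbf{e} = \frac{1}{2}(\nabla \textbf{u} + (\nabla \textbf{u})^T) for a simple shear flow, using an analytically known velocity gradient as an assumed example (the flow field is illustrative, not from the paper):

```python
import numpy as np

# Simple shear flow u(x, y, z) = (y, 0, 0):
# the velocity gradient grad_u[i, j] = du_i/dx_j is constant.
grad_u = np.array([[0.0, 1.0, 0.0],
                   [0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0]])

# Rate-of-strain tensor: the symmetric part of the velocity gradient.
e = 0.5 * (grad_u + grad_u.T)
print(e)
```

The result is a symmetric 3x3 matrix, as expected from the definition.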
 
arkajad said:
From other formulas in this paper a lot can be guessed.

Where? In (2.4) and (2.6), for example, dot is used consistently to mean the inner product (i.e. vector and vector).

\textbf{e} = 1/2 (\nabla \textbf{u} + (\nabla \textbf{u})^T)

This is easy. It's a 3x3 matrix. The gradient of a vector is the transpose of the Jacobian.

What about \textbf{nn}? I asked before how it makes sense to put two 3x1 (or 1x3) vectors next to each other.
 
I guess \mathbf{nn} is the 3x3 matrix with entries n_i n_j. So, for instance, \mathbf{u}\cdot\mathbf{nn} would be the vector with components

\sum_i u_i n_i n_j
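This index interpretation is easy to check numerically: contracting \mathbf{u} with the first index of n_i n_j gives (\mathbf{u} \cdot \mathbf{n})\,\mathbf{n}. A small NumPy check with arbitrary example vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(3)
n = rng.standard_normal(3)
n /= np.linalg.norm(n)  # make n a unit vector

# (nn)_{ij} = n_i n_j, the outer product.
nn = np.outer(n, n)

# u . nn contracts the first index: sum_i u_i n_i n_j = (u . n) n_j
lhs = u @ nn
rhs = (u @ n) * n
print(np.allclose(lhs, rhs))  # True
```

So under this reading, \mathbf{u}\cdot\mathbf{nn} is just the component of \mathbf{u} along \mathbf{n}, times \mathbf{n}.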
 
