Prove that if A, B are 3x3 tensors, then the matrix C = AB is also a tensor

Summary
The discussion revolves around proving that the product of two 3x3 tensors, represented as matrices, results in another tensor of the same dimensions. Participants express confusion about the interpretation of tensors and their matrix representations, with some suggesting that the proof should demonstrate the tensor transformation properties. There is a debate on whether the question requires showing that the product of square matrices remains a square matrix or if it involves more complex tensor properties. Clarification from the original poster is needed to better understand the specific requirements of the proof. Overall, the conversation highlights the complexities of tensor mathematics and the need for precise definitions.
ReuvenD10
Homework Statement
Prove that if A, B are 3x3 tensors, then the matrix C = AB is also a tensor
Relevant Equations
The equations are given in my solution below.
I tried to solve it, but there is one step in the solution that I don't understand how to prove.

In the attached files you can see my solution; the step that I was not able to prove is marked with a question mark.

Thanks for your help (:
 

Attachments

It would help a lot if you would type your question in here (see how you can type formulas on PF: https://www.physicsforums.com/help/latexhelp/) instead of forcing people to download your pdf. An explanation of what a ##3\times 3## tensor should be, if not a ##3 \times 3## matrix, would be helpful, too.

To me it reads as:
Show that the product of two square matrices of equal size is again a square matrix of the same size.
 
fresh_42 said:
To me it reads as:
Show that the product of two square matrices of equal size is again a square matrix of the same size.

It's funny, I interpreted it slightly differently. That ##\mathcal{A}, \mathcal{B}, \mathcal{C}## are some rank-2 and 3-dimensional tensors and we are asked to prove the tensor transformation properties, i.e. to show that if an equation in matrix representation ##[\mathcal{C}]_{\beta_1} = [\mathcal{A}]_{\beta_1} [\mathcal{B}]_{\beta_1} ## holds with respect to basis ##\beta_1## then ##[\mathcal{C}]_{\beta_2} = [\mathcal{A}]_{\beta_2} [\mathcal{B}]_{\beta_2} ## holds with respect to basis ##\beta_2##. So for instance you have$$
\begin{align*}
\bar{c}_{\mu \nu} &= {\bar{a}_{\mu}}^{\gamma} \bar{b}_{\gamma \nu} = ({T^{\rho}}_{\mu} {T_{\sigma}}^{\gamma} {a_{\rho}}^{\sigma})({T^{\alpha}}_{\gamma}{T^{\beta}}_{\nu} b_{\alpha \beta}) \\
&= {T^{\rho}}_{\mu} {T^{\beta}}_{\nu} {a_{\rho}}^{\alpha} b_{\alpha \beta} \\
&= {T^{\rho}}_{\mu} {T^{\beta}}_{\nu} c_{\rho \beta}
\end{align*}
$$where the ##{T^i}_j## are the transformation coefficients from ##\beta_1## to ##\beta_2##, and the second line follows from the contraction ##{T_{\sigma}}^{\gamma} {T^{\alpha}}_{\gamma} = \delta^{\alpha}_{\sigma}##.
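
To make the cancellation concrete, here is a minimal numerical sketch (my own addition, not from the original posts). It assumes Cartesian tensors and an orthogonal transformation matrix, so that the inverse transformation is just the transpose and the component transformation reads ##\bar{A} = T A T^{\top}##:

```python
# Minimal sketch (assumed setup: Cartesian tensors, orthogonal change of basis T,
# so the inverse transformation is the transpose).
import numpy as np

rng = np.random.default_rng(0)

# Random 3x3 component arrays in the original basis.
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
C = A @ B

# Orthogonal transformation matrix (Q factor of a random matrix).
T, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# Transform each set of components to the new basis.
A_bar = T @ A @ T.T
B_bar = T @ B @ T.T
C_bar = T @ C @ T.T

# The transformed product equals the product of the transformed components,
# because T.T @ T = I cancels between A_bar and B_bar.
print("C transforms as a rank-2 tensor:", np.allclose(C_bar, A_bar @ B_bar))
```

This is just the index contraction above written as matrix algebra: ##\bar{A}\bar{B} = T A T^{\top} T B T^{\top} = T (AB) T^{\top} = \bar{C}##.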
 
What is a matrix representation of a tensor? That doesn't make sense. A matrix is already a tensor. And any tensor other than ##\sum u_k\otimes v_k## isn't a matrix. ##3## by ##3## makes only sense for matrices. The rank should be completely irrelevant here.
 
Sure, yes I'm still doing some mental gymnastics to try and understand what is required. Generally a tensor doesn't have a 'matrix representation', but you can naturally map rank-2 tensors in ##n##-dimensional space to ##n \times n## matrices, for ease of computation.
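
A sketch of that identification, in my own notation rather than anything from the problem statement: in a basis ##\beta = \{e_1, \dots, e_n\}## the components of a rank-2 tensor ##\mathcal{A}## can be arranged as$$[\mathcal{A}]_{\beta} = (a_{ij}), \qquad a_{ij} = \mathcal{A}(e_i, e_j),$$and matrix multiplication of these component arrays appears to be what the original question means by ##C = AB##.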

Anyway that's just how I interpreted it, I guess we need to wait for OP to explain what the question actually is asking.
 
