Prove that if A, B are 3x3 tensors, then the matrix C = AB is also a tensor

ReuvenD10
Homework Statement
Prove that if A, B are 3x3 tensors, then the matrix C = AB is also a tensor.
Relevant Equations
The equations used in my solution below.
I tried to solve it, but there is one step in the solution that I don't understand how to do.

In the attached files below you can see my solution; the step that I could not prove is marked with a question mark.

Thanks for your help (:
 


It would help a lot if you typed your question in here (see how you can type formulas on PF: https://www.physicsforums.com/help/latexhelp/) instead of forcing people to download your pdf. An explanation of what a ##3\times 3## tensor should be, if not a ##3 \times 3## matrix, would be helpful, too.

To me it reads as:
Show that the product of two square matrices of equal size is again a square matrix of the same size.
 
fresh_42 said:
To me it reads as:
Show that the product of two square matrices of equal size is again a square matrix of the same size.

It's funny, I interpreted it slightly differently. That ##\mathcal{A}, \mathcal{B}, \mathcal{C}## are some rank-2 and 3-dimensional tensors and we are asked to prove the tensor transformation properties, i.e. to show that if an equation in matrix representation ##[\mathcal{C}]_{\beta_1} = [\mathcal{A}]_{\beta_1} [\mathcal{B}]_{\beta_1} ## holds with respect to basis ##\beta_1## then ##[\mathcal{C}]_{\beta_2} = [\mathcal{A}]_{\beta_2} [\mathcal{B}]_{\beta_2} ## holds with respect to basis ##\beta_2##. So for instance you have$$
\begin{align*}
\bar{c}_{\mu \nu} &= {\bar{a}_{\mu}}^{\gamma} \bar{b}_{\gamma \nu} = ({T^{\rho}}_{\mu} {T_{\sigma}}^{\gamma} {a_{\rho}}^{\sigma})({T^{\alpha}}_{\gamma}{T^{\beta}}_{\nu} b_{\alpha \beta}) \\
&= {T^{\rho}}_{\mu} {T^{\beta}}_{\nu} {a_{\rho}}^{\alpha} b_{\alpha \beta} \\
&= {T^{\rho}}_{\mu} {T^{\beta}}_{\nu} c_{\rho \beta}
\end{align*}
$$where the ##{T^i}_j## are the transformation coefficients from ##\beta_1## to ##\beta_2##; going from the first line to the second uses the contraction ##{T_{\sigma}}^{\gamma} {T^{\alpha}}_{\gamma} = \delta^{\alpha}_{\sigma}##.
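For what it's worth, here is a minimal numerical sketch of that argument, under the assumption that we are dealing with Cartesian tensors and an orthogonal change of basis ##R##, so index placement can be ignored and a rank-2 tensor transforms as ##\bar{A} = R A R^{\mathsf T}##. The matrices are just random arrays for illustration:

```python
import numpy as np

# Sanity check: if C = A B in one basis, then the transformed components
# satisfy C_bar = A_bar B_bar, because
#   R C R^T = R A (R^T R) B R^T = (R A R^T)(R B R^T).
rng = np.random.default_rng(0)

# A random 3x3 orthogonal matrix (change-of-basis) from a QR decomposition.
R, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# Two arbitrary 3x3 "tensors" (random component arrays for illustration).
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
C = A @ B

# Transform A and B separately, and transform the product C directly.
A_bar = R @ A @ R.T
B_bar = R @ B @ R.T
C_bar = R @ C @ R.T

# Both routes give the same components of C in the new basis.
print(np.allclose(C_bar, A_bar @ B_bar))  # True
```

The cancellation ##R^{\mathsf T} R = I## in the middle plays exactly the same role as the contraction ##{T_{\sigma}}^{\gamma} {T^{\alpha}}_{\gamma} = \delta^{\alpha}_{\sigma}## in the index computation above.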
 
What is a matrix representation of a tensor? That doesn't make sense. A matrix is already a tensor, and any tensor that is not of the form ##\sum u_k\otimes v_k## isn't a matrix. ##3## by ##3## only makes sense for matrices. The rank should be completely irrelevant here.
 
Sure, yes I'm still doing some mental gymnastics to try and understand what is required. Generally a tensor doesn't have a 'matrix representation', but you can naturally map rank-2 tensors in ##n##-dimensional space to ##n \times n## matrices, for ease of computation.

Anyway, that's just how I interpreted it; I guess we need to wait for the OP to explain what the question is actually asking.
 