Index notation: multiplying scalars, vectors, and tensors

hellomrrobot
I am confused about why the result of ##V_{i,j}V_{j,k}A_{km,i}## ends up being a vector (V is a vector and A is a tensor).

What are some general rules when you are multiplying a scalar, vector and tensor?
 
Do you know of the Einstein summation convention?

Let's expand the product (which is really a contraction) you have given.

##V_{i,j}V_{j,k}A_{km,i} = (\partial_jV_i)\cdot (\partial_k V_j) \cdot (\partial_i A_{km})##

The summation convention I mentioned before states that repeated indices should be summed over.
For example, ##V_i V_i = \sum_j V_j V_j## for a vector. (This is an alternative notation for the norm squared.)
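NumPy's `einsum` uses subscript strings that mirror index notation almost exactly, so it makes a handy numerical sandbox for these rules. A minimal sketch of the convention above (the vector values here are just illustrative):

```python
import numpy as np

# V_i V_i: the repeated index i is summed over, giving the squared norm.
V = np.array([1.0, 2.0, 3.0])

# In einsum notation, 'i,i->' means: contract over i, leave no free index,
# so the result is a scalar.
contracted = np.einsum('i,i->', V, V)
norm_sq = np.dot(V, V)  # same quantity computed the familiar way
```

Both give 1 + 4 + 9 = 14, confirming that ##V_i V_i## is the squared norm.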

How do you know what the resultant object is?
You look at the free indices. A free index is an index that isn't repeated.
A scalar has no free indices, a vector has one, and a tensor has two or more.
The power of the summation convention is that you can easily check calculations:
in any equation, the free indices on both sides must be the same.
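This free-index bookkeeping is mechanical enough to automate. Here is a small hypothetical helper (the function name and the comma-separated index format are my own choices, not anything standard) that counts index occurrences and reports the free ones:

```python
from collections import Counter

def free_indices(index_spec):
    """Return the free (unrepeated) indices in a comma-separated
    index specification such as 'ij,jk,kmi'.

    Hypothetical helper for illustration; each letter is one index
    and each comma-separated group is one factor in the product.
    """
    counts = Counter(index_spec.replace(',', ''))
    return {idx for idx, n in counts.items() if n == 1}

# V_{i,j} V_{j,k} A_{km,i} has index groups ij, jk, kmi:
# i, j, k each appear twice (summed), only m appears once.
result = free_indices('ij,jk,kmi')
```

For the expression in question this returns `{'m'}`: one free index, hence a vector. For ##V_i V_i## it returns the empty set, a scalar.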

Another big advantage is the use of symmetry.
If you have two tensors ##A_{ij}## and ##S_{ij}##, where ##A## is antisymmetric in its indices and ##S## is symmetric, you can show that ##A_{ij}S_{ij} = 0##.
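This cancellation is easy to verify numerically: each term ##A_{ij}S_{ij}## cancels against its index-swapped partner ##A_{ji}S_{ji} = -A_{ij}S_{ij}##. A quick sketch using random matrices (the sizes and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
N = rng.standard_normal((4, 4))

A = M - M.T   # antisymmetric: A_ij = -A_ji
S = N + N.T   # symmetric:     S_ij =  S_ji

# Full contraction A_ij S_ij over both indices; the terms cancel
# pairwise, so the sum vanishes up to floating-point rounding.
contraction = np.einsum('ij,ij->', A, S)
```

Whatever random matrices you start from, `contraction` comes out zero to machine precision.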
 
Such an expression would usually be written as something like ## V_{,j}^{i}V_{,k}^{j}A_{km,i} ##. The summation convention then says to sum over all indices that appear exactly once raised and exactly once lowered. But if the metric tensor is the flat-space, Euclidean one, then, e.g., ## V_i = V^i ##, and we can use a "generalized summation convention", where repeated indices are summed over regardless of whether they are raised or lowered. Notice then that all the indices in your expression are summed over except for ## m ##. The result is a quantity with one index, in this case a vector. If there were no "free" indices left, you would have had a scalar. If there were two free indices left, you would have had a second-rank tensor.
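The counting argument above can be checked directly with `einsum`. Here I stand in for the gradients with random arrays of the right shapes, since only the index structure matters for determining the rank of the result (the arrays and their values are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
# Stand-ins for the gradients: dV[i, j] plays V_{i,j},
# dA[k, m, i] plays A_{km,i}.
dV = rng.standard_normal((n, n))
dA = rng.standard_normal((n, n, n))

# Contract over i, j, k; only m survives as a free index,
# so the output is a one-index object: a vector of length n.
result = np.einsum('ij,jk,kmi->m', dV, dV, dA)
```

`result.shape` is `(3,)`, confirming that ##V_{i,j}V_{j,k}A_{km,i}## is a vector indexed by ##m##.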
 
Geofleur said:
But if the metric tensor is the flat space, Euclidean one...
Not only do you need a flat space, you also have to be using Cartesian coordinates to be able to safely ignore the distinction between raised and lowered indices, right?
 
That's right!
 
Your objections are correct; it sometimes happens that people get lazy.
I've seen texts that use this sloppy version of the convention. (An example I encountered is the book Lie Algebras in Particle Physics by Howard Georgi.)

Can you give some more context for that formula? Maybe we can shed some light on this (and on how you can check it whenever you encounter such a problem).
 