Manipulating vector-matrix combinations

mysearch
Hi,
I am trying to understand some of the basics of differential geometry with respect to general relativity. However, I am not sure that I understand what seems to be a fairly fundamental bit of maths connected with the generic metric given in [1]. For simplicity, the notation has been reduced to 2-dimensional (x,y) coordinates:

[1] ds^2=g_{xy} dx^x dx^y

While I understand the geometry by which the separation (ds) can be expanded to the form in [2], I am unsure how matrix multiplication explains this process.

[2] ds^2 = g_{11} xx + g_{12} xy + g_{21} yx +g_{22} yy

In the simplest case, [2] collapses to Pythagoras' theorem via the values assigned to (g). For example, in 2D space, I was assuming (g=1,0,0,1) to be a 2x2 matrix, while (dx, dy) might be described as 2x1 matrices that represent the component vectors of (ds) along the orthogonal axes (x,y), e.g. (x=3,0) and (y=0,4).

[3] ds^2 = \left(\begin{array}{cc}1&0\\0&1\end{array}\right) \left(\begin{array}{c}3\\0\end{array}\right) \left(\begin{array}{c}0\\4\end{array}\right)

While this logic, which is likely to be flawed, leads to the form in [3], I am not sure how multiplying the three matrices with the dimensions above leads to the form suggested by [2]. Therefore, I would much appreciate any clarification on offer. Thanks
 
mysearch said:
Hi,
For simplicity, the notation has been reduced to 2-dimensional (x,y) coordinates:

[1] ds^2=g_{xy} dx^x dx^y
In that case, don't you mean

ds^2=g_{xx}(dx^x)^2+g_{yy}(dx^y)^2

or

ds^2=g_{xx}dx^2+g_{yy}dy^2

?

What you wrote looks like only one of the n^2 terms of g_{ij}dx^i dx^j.

I don't quite understand what you're saying about matrix multiplication, so I'll just write a few comments about the expression ds^2=g_{ij}dx^i dx^j.

Do you understand this equality?

g=g_{ij} dx^i\otimes dx^j

If we define

dx^i dx^j=\frac{1}{2}(dx^i\otimes dx^j+dx^j\otimes dx^i)

and use this definition to rewrite the right-hand side of that equality, we get

g=g_{ij} dx^i\otimes dx^j=\frac{1}{2}(g_{ij}dx^i\otimes dx^j+g_{ji}dx^j\otimes dx^i)=g_{ij}dx^i dx^j

So if we define ds^2=g, we have made the equality ds^2=g_{ij}dx^idx^j come true.

Edit: Also, consider this: If C is a curve and \dot C its velocity vector field, then

g(\dot C,\dot C)=g_{ij}\dot C^i\dot C^j

If we write the components of the velocity vector field as dx^i/dt and think of dx^i as an approximation of the change of the i-th coordinate of C(t) when t changes by dt, we can write

g(\dot C,\dot C)dt^2=g_{ij}\frac{dx^i}{dt}\frac{dx^j}{dt}dt^2=g_{ij}dx^i dx^j

This suggests that the proper way to define the arc length L(C) of a curve C:[a,b]→M in a Riemannian manifold M is

L(C)=\int_a^b \sqrt{g_{C(t)}(\dot C(t),\dot C(t))}\,dt

and this is how arc length is defined.
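For anyone who wants to check this numerically, here is a small NumPy sketch of the arc-length integral. The metric (flat, g = identity) and the curve (the unit circle) are just illustrative choices, not anything from the discussion above:

```python
# Numerical sketch of L(C) = integral of sqrt(g_ij Cdot^i Cdot^j) dt,
# assuming a flat 2D metric g = I and the unit circle C(t) = (cos t, sin t).
import numpy as np

g = np.eye(2)                                  # metric components g_ij
t = np.linspace(0.0, 2.0 * np.pi, 100001)      # parameter values on [a, b]
C_dot = np.stack([-np.sin(t), np.cos(t)])      # velocity components, shape (2, N)

# Integrand sqrt(g_ij Cdot^i Cdot^j) evaluated at each t
speed = np.sqrt(np.einsum('i...,ij,j...->...', C_dot, g, C_dot))

# Trapezoidal rule for the integral over [0, 2*pi]
L = np.sum(0.5 * (speed[1:] + speed[:-1]) * np.diff(t))
print(L)   # approximately 2*pi, the circumference of the unit circle
```

Without the square root, the integral would give 2*pi only because the speed here happens to equal 1; for a general curve the two expressions differ.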
 
Last edited:
Fredrik,
Many thanks for your comments. I appreciate that I am probably not interpreting or using some of the notation correctly as the subject of differential geometry and tensors is quite new to me. So, at this stage, I am simply trying to establish some basic understanding of metrics and the metric tensors that underpin GR by plodding through the use of superscripts and subscripts in connection with contravariant and covariant coordinates. This led me into a discussion of orthogonal and oblique coordinates systems. Using just standard geometry and limiting this discussion to 2D, these systems seem to lead to the equation of the form:

[1] ds^2 = g_{11}x^1 x^1 + g_{12} x^1 x^2 + g_{21} x^2 x^1 +g_{22} x^2 x^2

While some of these terms collapse to zero in Pythagoras' theorem, they are present in other metrics. By and large this made sense to me. However, many texts then seem to describe [1] in terms of the following equation, which I have copied from your post:

[2] ds^2=g_{ij}dx^i dx^j

Given that both equate to (ds) I assume that the RHS of [1] is an expanded form of the RHS of [2] in the case of a 2D solution:

[3] g_{ij}dx^i dx^j \equiv g_{11}x^1 x^1 + g_{12} x^1 x^2 + g_{21} x^2 x^1 +g_{22} x^2 x^2

However, I don’t understand what rules drive this expansion or whether it means something completely different.
Fredrik said:
In that case, don't you mean
ds^2=g_{xx}(dx^x)^2+g_{yy}(dx^y)^2
or
ds^2=g_{xx}dx^2+g_{yy}dy^2?
What you wrote looks like only one of the n^2 terms of g_{ij}dx^i dx^j.
Quite possibly, but where does the power of 2 come from in [2] above? Does it fall out of the summation notation suggested below in [4]?

[4] ds^2=g_{ij}dx^i dx^j = g_{ij} \sum dx^i \sum dx^j
Fredrik said:
Do you understand this equality?
g=g_{ij} dx^i\otimes dx^j
No, I have never used this notation, but it seems to be used in connection with tensor multiplication (?). However, if possible, I would really like to try to resolve my initial misunderstandings first, before proceeding to yet more unknowns:redface:
 
mysearch,

You're mostly on track. Starting with your [2]

ds^2 = g_{11}x^1 x^1 + g_{12} x^1 x^2 + g_{21} x^2 x^1 +g_{22} x^2 x^2,

we can write this more compactly as the double sum

ds^2 = \sum_i \sum_j g_{ij}x^i x^j,

and then even more compactly by just inferring the sums from the repeated indices

ds^2=g_{ij}dx^i dx^j.

The connection with matrix notation is

\left[ \begin{array}{cc} x^1 & x^2 \end{array} \right] \left[ \begin{array}{cc} g_{11} & g_{12} \\ g_{21} & g_{22} \end{array} \right] \left[ \begin{array}{c} x^1 \\ x^2 \end{array} \right] = g_{11}x^1 x^1 + g_{12} x^1 x^2 + g_{21} x^2 x^1 + g_{22} x^2 x^2

but I wouldn't say it explains things. It is just another way to write it.
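If it helps to see the equivalence of the two notations numerically, here is a quick NumPy check. The particular values of g and x are arbitrary illustrative numbers, nothing more:

```python
# Check that the matrix form [x^1 x^2] g [x^1 x^2]^T equals the expanded
# sum g_11 x1 x1 + g_12 x1 x2 + g_21 x2 x1 + g_22 x2 x2.
import numpy as np

g = np.array([[1.0, 0.5],
              [0.5, 2.0]])          # an arbitrary symmetric 2x2 metric
x = np.array([3.0, 4.0])            # components x^1, x^2

matrix_form = x @ g @ x             # row vector * matrix * column vector
expanded = sum(g[i, j] * x[i] * x[j] for i in range(2) for j in range(2))
print(matrix_form, expanded)        # both are 53.0
```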
 
Uncorrelated said:
[1] ds^2 = g_{11}x^1 x^1 + g_{12} x^1 x^2 + g_{21} x^2 x^1 +g_{22} x^2 x^2,

[2] ds^2 = \sum_i \sum_j g_{ij}x^i x^j,

[3] ds^2=g_{ij}dx^i dx^j.
Many thanks for the excellent summary in post #4. I had managed to work through most of the ever-increasing shorthand styles of notation that finishes up in the form of [3] above. However, I had lost my way when trying to re-interpret [3] back towards [1] via matrix multiplication.
Uncorrelated said:
[4] <br /> \left[ \begin{array}{cc} x^1 &amp; x^2 \end{array} \right]<br /> \left[ \begin{array}{cc} g_{11} &amp; g_{12} \\ g_{21} &amp; g_{22} \end{array} \right]<br /> \left[ \begin{array}{c} x^1 \\ x^2 \end{array} \right]<br /> = g_{11}x^1 x^1 + g_{12} x^1 x^2 + g_{21} x^2 x^1 +g_{22} x^2 x^2<br />
This is the step I had misunderstood. I had seen a vector described as a vertical [2x1] matrix and had assumed that this orientation was a requirement for both of the vectors implied in [2]. I had also come across the rule that two matrices can only be multiplied together if the number of columns in the first is the same as the number of rows in the second. If this is the case, would the ordering of the vector matrices be important?

[2 \times 1][1 \times 2] = [2 \times 2]

\left[ \begin{array}{c} x^1 \\ x^2 \end{array} \right] \left[ \begin{array}{cc} x^1 & x^2 \end{array} \right] = \left[ \begin{array}{cc} x^1 x^1 & x^1 x^2 \\ x^2 x^1 & x^2 x^2 \end{array} \right]

Presumably, if this is correct, the resulting [2x2] matrix would then be multiplied by (g), another [2x2] matrix.
Uncorrelated said:
but I wouldn't say it explains things. It is just another way to write it.

I couldn’t agree more. In many respects, the form of equation [2] seems to be the important aspect, which depends on the coordinate system, i.e. the number of dimensions and the nature of the space/spacetime assumed. While I appreciate that differential geometry may be a very useful mathematical tool, it appears to lead to so much abstract notation that it is quite easy to lose sight of any physical reality. Well, that’s my excuse anyway:smile:
 
Last edited:
Yes, when you write things in matrix form, the order matters, as does whether the vectors are rows or columns, and you get different kinds of multiplication. The basic one is the inner product, which results in a number

\left[ \begin{array}{cc} x^1 & x^2 \end{array} \right] \left[ \begin{array}{c} y^1 \\ y^2 \end{array} \right] = x^1 y^1 + x^2 y^2.

Then there is the outer product (I'm not sure everyone uses the same convention for this) which results in a matrix

\left[ \begin{array}{c} x^1 \\ x^2 \end{array} \right] \left[ \begin{array}{cc} y^1 & y^2 \end{array} \right] = \left[ \begin{array}{cc} x^1 y^1 & x^1 y^2 \\ x^2 y^1 & x^2 y^2 \end{array}\right].

If you think of the matrix as a stack of row vectors or a bunch of column vectors lined up

\left[ \begin{array}{cc} x^{11} & x^{12} \\ x^{21} & x^{22} \end{array}\right] = \begin{array}{c} \left[ \begin{array}{cc} x^{11} & x^{12} \end{array} \right] \\ \left[ \begin{array}{cc} x^{21} & x^{22} \end{array} \right] \end{array} = \begin{array}{cc} \left[ \begin{array}{c} x^{11} \\ x^{21} \end{array} \right] & \left[ \begin{array}{c} x^{12} \\ x^{22} \end{array} \right] \end{array}

then everything is just repeated application of these two multiplications.
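A tiny numeric illustration of the two products, using arbitrary example vectors: a row times a column gives a single number, a column times a row gives a matrix.

```python
# Inner product (row * column -> scalar) vs. outer product
# (column * row -> matrix), with arbitrary example vectors.
import numpy as np

x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])

inner = x @ y               # x^1 y^1 + x^2 y^2, a single number
outer = np.outer(x, y)      # a 2x2 matrix with entries x^i y^j

print(inner)                # 11.0
print(outer)                # [[3. 4.], [6. 8.]]
```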

--
Uncorrelated

mysearch said:
However, I had lost my way when trying to re-interpret [3] back towards [1] via matrix multiplication.
Is this the definition of matrix multiplication you're using: (AB)_{ij}=A_{ik}B_{kj}? (It should be.)

mysearch said:
I had also come across the rule that 2 matrices can only be multiplied together if the number of columns in the first is the same as the number of rows in the second.
Note that this is implied by the definition (AB)_{ij}=A_{ik}B_{kj}.
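To make the summation over the repeated index concrete, here is the component definition spelled out with explicit loops and checked against NumPy's built-in product. The matrix entries are arbitrary example values:

```python
# (AB)_ij = sum over k of A_ik * B_kj, written as explicit loops.
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

AB = np.zeros((2, 2))
for i in range(2):
    for j in range(2):
        for k in range(2):          # the repeated index k is summed over
            AB[i, j] += A[i, k] * B[k, j]

print(np.allclose(AB, A @ B))       # True
```

The "columns of the first must match rows of the second" rule is visible here: the index k has to run over the same range in both factors.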

mysearch said:
Using just standard geometry and limiting this discussion to 2D, these systems seem to lead to the equation of the form:

[1] ds^2 = g_{11}x^1 x^1 + g_{12} x^1 x^2 + g_{21} x^2 x^1 +g_{22} x^2 x^2
You shouldn't use the d(something) notation on the left if you're not going to do it on the right. But you're right about how to apply the summation convention. The right-hand side is g_{ij}x^i x^j.

mysearch said:
Quite possibly, but where does the power of 2 come from in [2] above? Does it fall out of the summation notation suggested below in [4]?

[4] ds^2=g_{ij}dx^i dx^j = g_{ij} \sum dx^i \sum dx^j
ds^2 is defined by ds^2=g_{ij}dx^idx^j.

mysearch said:
No, I have never used this notation, but it seems to be used in connection with tensor multiplication (?). However, if possible, I would really like to try to resolve my initial misunderstandings first, before proceeding to yet more unknowns:redface:
In that case, it's probably impossible to discuss the metric tensor. The best we can do is to discuss norms defined from inner products defined from matrix multiplication. Let g be an arbitrary non-singular symmetric 2×2 matrix, and define the inner product on the vector space of 2×1 matrices by

\langle x,y\rangle=x^Tgy

and the norm by

\|x\|=\sqrt{\langle x,x\rangle}

(You should verify that the above defines an inner product and a norm). These definitions imply

\|x\|^2=\langle x,x\rangle=x^Tgx=x_ig_{ij}x_j=g_{ij}x_i x_j

If g=I, we get

\|x\|^2=\delta_{ij}x_ix_j=(x_1)^2+(x_2)^2
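As a sanity check, the inner product and norm defined above can be written in a few lines of NumPy. With g = I and the (3, 4) example vector, the norm reduces to Pythagoras:

```python
# <x, y> = x^T g y and the norm it induces, for a symmetric matrix g.
import numpy as np

def inner(x, y, g):
    """Inner product x^T g y of 2x1 column vectors x and y."""
    return x @ g @ y

def norm(x, g):
    """Norm induced by the inner product: sqrt(<x, x>)."""
    return np.sqrt(inner(x, x, g))

g = np.eye(2)                       # g = I: the Euclidean case
x = np.array([3.0, 4.0])
print(norm(x, g))                   # 5.0, i.e. sqrt(3^2 + 4^2)
```

For this to be a genuine inner product, g must be positive definite as well as symmetric; the Minkowski metric of relativity is symmetric but not positive definite, which is one reason ds^2 can be negative there.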
 
Fredrik & Uncorrelated,
Really appreciate all the help, as some of the pieces now make more sense. However, I will try to work through all the information provided and read a little more into the maths and notation behind the metric tensor. Clearly, all the ideas contained in differential geometry are going to take a bit more work on my part.:cry:

Anyway, many thanks and a Merry Xmas.:smile:
 
