Basis vectors, again

1. Oct 24, 2008

snoopies622

For an ordinary vector V, the square of its length is $$V \cdot V = V^a V_a$$.

For basis vectors, $$e^a \cdot e_b = \delta ^a _b$$ so $$e^a \cdot e_a = 1$$.

Since $$1^2 = 1$$, this implies that every basis vector is of unit length.

What is my mistake?

2. Oct 24, 2008

waht

it's fine

3. Oct 25, 2008

atyy

Using the formula on top, the squared length of a basis vector is $$e_a \cdot e_a$$ (no sum over a)

The equation for unit vectors perpendicular to each other is $$e_a \cdot e_b = \delta_{ab}$$

The equation for non-unit, non-perpendicular basis vectors is $$e_a \cdot e_b = g_{ab}$$

Given a set of basis vectors, the dual basis covectors are defined by $$e^a \cdot e_b = \delta ^a _b$$

Or something like that, I can never quite remember which indices go up or down.

Last edited: Oct 25, 2008
4. Oct 25, 2008

HallsofIvy

Staff Emeritus
In order to say $$e^a \cdot e_b = \delta ^a _b$$ or $$e^a \cdot e_a = 1$$ you have to assume an orthonormal basis. Essentially what you are saying is "Assuming basis vectors have unit length, then every basis vector is of unit length"!

5. Oct 25, 2008

snoopies622

I did not know this. If my basis is not orthonormal and I have chosen my (say) contravariant basis vectors, how do I find their covariant counterparts? Does one raise or lower indices in the same way as with ordinary vectors/tensors, or is it different with basis vectors?

For example, suppose I have a two-dimensional manifold and a coordinate chart (u,v) with metric $$g_{uu}=2$$ $$g_{uv}=g_{vu}=-2$$ $$g_{vv}=4$$ and I choose $$e^u =<1,0>$$ $$e^v =<0,1>$$. How do I find $$e_u$$ and $$e_v$$?

6. Oct 25, 2008

HallsofIvy

Staff Emeritus
I misspoke before because I did not realize that you were talking about a product of "vectors" and "co-vectors" or the dual space.

The dual space of a vector space, V, is the set of linear functions from V to its underlying field.

In fact, given any basis for a vector space, the corresponding basis for the dual space is defined by $$e^i(e_j)= \delta^i_j$$ and then that is used to define the "dot product". However, that dot product depends upon the choice of basis! In that sense, yes, every basis vector has unit length: the choice of dot product based on that basis guarantees that.

If you have a given dot product (perhaps based on some basis) and use it with a different basis, it does not follow that $e^i\cdot e_j= \delta^i_j$

$e_i= g_{ij}e^j$. In the case you give, it is just a matrix multiplication:
$$e_u= \left[\begin{array}{cc}2 & -2 \\ -2 & 4\end{array}\right]\left[\begin{array}{c}1 \\ 0\end{array}\right]= <2, -2>$$
Similarly, $e_v$ is <-2, 4>.
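A quick numerical check of the lowering step (a sketch in Python with NumPy, using the metric components from post #5):

```python
import numpy as np

# Metric from post #5: g_uu = 2, g_uv = g_vu = -2, g_vv = 4
g = np.array([[2, -2],
              [-2, 4]])

# Contravariant components of the chosen basis vectors
e_u_upper = np.array([1, 0])
e_v_upper = np.array([0, 1])

# Lowering the index, e_i = g_ij e^j, is just matrix multiplication by g
e_u_lower = g @ e_u_upper   # -> [ 2, -2]
e_v_lower = g @ e_v_upper   # -> [-2,  4]
print(e_u_lower, e_v_lower)
```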

7. Oct 25, 2008

snoopies622

8. Oct 25, 2008

atyy

Yes, the "reciprocal basis" is the same as a "dual basis". The reciprocal basis only exists when you have already defined one basis.

9. Oct 25, 2008

snoopies622

A clarification, please: According to the description here

http://home.pacbell.net/bbowen/covariant.htm

And the drawing here

http://en.wikipedia.org/wiki/Image:Basis.gif

when one raises or lowers the index on a vector/covector, one is still describing the same arrow, only using different ‘building block’ arrows to do so. But when one raises or lowers the index on a basis-vector/basis-covector, one is actually turning it into a different arrow – one that goes from being parallel to a coordinate line to one that is perpendicular to the other coordinate line(s) or vice versa.

Granted that vectors/co-vectors are not really arrows, is this otherwise correct?

10. Oct 26, 2008

atyy

Using Bowen's notation and formulas, if $$V=A_1$$, its contravariant components are $$[V^1=1, V^2=0]$$. To figure out its covariant components we lower its index:

$$A_1 = V^1 A_1 + V^2 A_2 = V_1 A^1 + V_2 A^2 = V^i G_{i1} A^1 + V^i G_{i2} A^2$$
$$= (V^1 G_{11} + V^2 G_{21}) A^1 + (V^1 G_{12} + V^2 G_{22}) A^2 = G_{11} A^1 + G_{12} A^2$$

So if $$\{A_1,A_2\}$$ are the basis vectors, their contravariant components are $$\{[1,0], [0,1]\}$$, and their covariant components are $$\{[G_{11},G_{12}], [G_{21},G_{22}]\}$$, so that $$A_i \cdot A_j = G_{ij}$$

The sets of basis vectors $$\{A_1,A_2\}$$ and $$\{A^1,A^2\}$$ are reciprocal, but lowering the index on $$A_1$$ doesn't change it into $$A^1$$; it still remains itself, described in terms of the reciprocal basis vectors, just as with any other vector.

The process of getting a set of reciprocal basis vectors given a set of basis vectors is a different process from raising or lowering an index.

(Actually, I usually think of vectors and covectors as being in different spaces. The vectors are column vectors and the covectors are row vectors. You can multiply a column and row vector to get a number without any metric. Without a metric, the column vectors and row vectors are not related, unless we define a basis for the column vectors and a reciprocal basis for the row vectors, which means the relationship between column vectors and row vectors changes with the basis. Without a metric you cannot multiply two column vectors to get a number. Once you have a metric, you can use that to multiply two column vectors to get a number, by using it to change one column vector into a row vector. You can also use the metric to enforce a fixed relationship between column vectors and row vectors, so you can identify a vector with its covector.)
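The reciprocal-basis construction above can be checked numerically: raising with the inverse metric produces vectors whose metric dot product with the original basis is the Kronecker delta. A sketch in Python with NumPy, assuming the example metric from post #5:

```python
import numpy as np

g = np.array([[2.0, -2.0],
              [-2.0, 4.0]])   # example metric from post #5
g_inv = np.linalg.inv(g)      # inverse metric, used to raise indices

basis = np.eye(2)             # columns: contravariant components of A_1, A_2
reciprocal = g_inv @ basis    # columns: components of the reciprocal basis A^1, A^2

# Metric dot product u.v = u^T g v; the reciprocal basis pairs with the
# original basis to give the identity matrix (Kronecker delta)
pairing = reciprocal.T @ g @ basis
print(pairing)
```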

11. Oct 26, 2008

snoopies622

So an expression like $$e^u = <1,0>$$ is ambiguous since the raised u could mean either that the components of the u basis vector (or covector) are contravariant, or that the basis vector itself is contravariant and the components are either contravariant or covariant?

(Edit: I guess "expression" should be "equation".)

Last edited: Oct 26, 2008
12. Oct 26, 2008

HallsofIvy

Staff Emeritus
Yes, that's true.

13. Oct 26, 2008

snoopies622

So does $$\frac {\partial}{\partial t}$$ represent the either contravariant or covariant components of a covariant basis vector, or the covariant components of a basis vector that could be either contravariant or covariant?

Last edited: Oct 26, 2008
14. Oct 27, 2008

atyy

Given coordinates $$\{t,x\}$$ on a manifold, the basis vectors can be chosen to be $$\{e_0={\frac {\partial}{\partial t}},e_1={\frac {\partial}{\partial x}}\}$$, and they can be used to represent any vector using contravariant components: $$v=v^ie_i=v^i\frac {\partial}{\partial x^i}=v^i$$, where in the final step I omitted the basis vectors only for notational convenience, as is often done.
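The idea that a coordinate basis vector like $$\frac{\partial}{\partial t}$$ *is* a directional derivative operator can be illustrated with a finite-difference sketch in Python (the function f and the sample point are made up for illustration):

```python
# Coordinate basis vectors act on functions as directional derivatives.
# Approximate e_0 = d/dt with a central difference:
def d_dt(f, t, x, h=1e-6):
    return (f(t + h, x) - f(t - h, x)) / (2 * h)

f = lambda t, x: t**2 * x     # hypothetical scalar field f(t, x)
print(d_dt(f, 3.0, 2.0))      # approximately 12.0, since df/dt = 2*t*x
```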

15. Oct 28, 2008

snoopies622

Wait, it's not ambiguous. If $$e^u = <1,0>$$ then it is being expressed in terms of the $$e^u$$ and $$e^v$$ basis vectors: $$e^u =(1)e^u+(0)e^v$$. I may actually understand this now.

Last edited: Oct 28, 2008