Regarding Basis and Dual Basis


by simpleton
Tags: basis, dual
simpleton
#1
Jan21-13, 01:12 PM
Hi, I'm learning about vector spaces and I would like to ask some questions about them.

Suppose I have a vector space [itex]V[/itex] and a basis [itex]\{v_1, \dots, v_n\}[/itex] for [itex]V[/itex]. Then there is a dual space [itex]V^*[/itex], consisting of all linear functions from [itex]V[/itex] to ℝ. The space [itex]V^*[/itex] has a dual basis [itex]\{x_1, \dots, x_n\}[/itex], and one way of constructing it is to let [itex]x_i[/itex] be the function that returns the [itex]i^{th}[/itex] coordinate of a vector when that vector is expressed in the given basis for [itex]V[/itex]. The claim is that the span of these functions equals [itex]V^*[/itex], and therefore the space and its dual have the same dimension.
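For example, if I take [itex]V = \mathbb{R}^2[/itex] with the basis [itex]v_1 = (1,0)[/itex], [itex]v_2 = (1,1)[/itex], then [itex](a,b) = (a-b)v_1 + bv_2[/itex], so (if I have understood the construction correctly) the coordinate functions are [itex]x_1(a,b) = a-b[/itex] and [itex]x_2(a,b) = b[/itex].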

I have a few questions. My textbook states that any linear function on [itex]V[/itex] can be expressed as a linear combination of the coordinate functions, but it does not explain why. May I know why this is the case? I am assuming it is because a linear function of a vector must be a linear combination of the vector's coordinates, but I'm not sure this always has to be the case, since a linear function is merely defined to be one that respects addition and scalar multiplication.
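In the special case of the standard basis [itex]\{e_1, \dots, e_n\}[/itex] of [itex]\mathbb{R}^n[/itex] I can see why it holds, since linearity alone gives $$f(a_1, \dots, a_n) = f\left(\sum_i a_i e_i\right) = \sum_i a_i f(e_i),$$ which expresses [itex]f[/itex] as a combination of the coordinate functions with coefficients [itex]f(e_i)[/itex]. What I don't see is why the same thing works for an arbitrary basis.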

The textbook also says that the coordinate functions are linearly independent. By this, I think it means that if some linear combination of the coordinate functions evaluates to 0 on every input vector, then the coefficients of all the coordinate functions must be 0. I think this makes sense, but may I know if I have interpreted it correctly?
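In symbols, I read the claim as: if $$(a_1 x_1 + \cdots + a_n x_n)(v) = 0 \quad \text{for every } v \in V,$$ then [itex]a_1 = \cdots = a_n = 0[/itex].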

Finally, it is mentioned that given any basis [itex]\{x_1, \dots, x_n\}[/itex] of [itex]V^*[/itex], you can construct a basis for [itex]V[/itex] such that the given basis acts as the coordinate functions for the basis of [itex]V[/itex]. By this, I think they mean that I can find a set [itex]\{v_1, \dots, v_n\}[/itex] such that [itex]x_i(v_j) = 1[/itex] if [itex]i = j[/itex] and [itex]x_i(v_j) = 0[/itex] otherwise. But I'm not sure why this is true. I think that for this to hold, the following questions have to be answered:

1) Such a set [itex]\{v_1...v_n\}[/itex] exists.
2) Any vector in [itex]V[/itex] can be expressed as a linear combination of the vectors.
3) The constructed vectors are linearly independent.

I am not sure how 1) can be answered. I think 2) and 3) are really the same question: since [itex]V[/itex] and [itex]V^*[/itex] have the same dimension, 2) and 3) must hold or fail together, or else the Linear Dependence Lemma would be violated. I am not sure how to prove 2), but I think I can prove 3). Suppose some linear combination vanishes, say [itex]\sum a_iv_i = 0[/itex] for some [itex]a_1, \dots, a_n[/itex]. If I apply [itex]x_i[/itex] to this combination, I get [itex]a_i = 0[/itex], so all the coefficients are 0. May I know if there is a way to prove 2) directly?
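Written out, the computation I have in mind for 3) is $$0 = x_i\left(\sum_j a_j v_j\right) = \sum_j a_j\, x_i(v_j) = a_i.$$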

Thank you very much!
Tenshou
#2
Jan21-13, 08:06 PM
http://www.youtube.com/watch?v=8M0LS...2ED373&index=2

Hope this video helps.
HallsofIvy
#3
Jan23-13, 07:40 AM
Let V be a vector space of dimension n. Then it has a basis [itex]\{x_1, x_2, \cdots, x_n\}[/itex]. For i from 1 to n, we define [itex]f_i[/itex] by requiring that [itex]f_i(x_j)= \delta_{ij}[/itex]. That is, [itex]f_i(x_j)[/itex] is equal to 1 if i= j, 0 if not. We define [itex]f_i[/itex] for any vector in V "by linearity": if [itex]x= a_1x_1+ a_2x_2+ \cdots+ a_nx_n[/itex] then [itex]f_i(x)= a_1f_i(x_1)+ a_2f_i(x_2)+ \cdots+ a_nf_i(x_n)= a_i[/itex].
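For example, take V to be [itex]\mathbb{R}^2[/itex] with the basis [itex]x_1= (1, 1)[/itex], [itex]x_2= (0, 1)[/itex]. Writing [itex](a, b)= a(1, 1)+ (b- a)(0, 1)[/itex] shows that [itex]f_1(a, b)= a[/itex] and [itex]f_2(a, b)= b- a[/itex].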

Now, we show that this set of linear functionals is a basis for V*. Suppose f is any linear function from V to its underlying field. For each i from 1 to n, let [itex]a_i= f(x_i)[/itex]. Then it is easy to show that [itex]f(x)= a_1f_1(x)+ a_2f_2(x)+ \cdots+ a_nf_n(x)[/itex] for every x, so that [itex]f= a_1f_1+ a_2f_2+ \cdots+ a_nf_n[/itex]. That is, this set of n linear functionals spans the entire space of all linear functionals on V.
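For example, with the basis of [itex]\mathbb{R}^2[/itex] above, the functional [itex]f(a, b)= 3a+ 5b[/itex] has [itex]a_1= f(x_1)= f(1, 1)= 8[/itex] and [itex]a_2= f(x_2)= f(0, 1)= 5[/itex], and indeed [itex]8f_1(a, b)+ 5f_2(a, b)= 8a+ 5(b- a)= 3a+ 5b[/itex].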

Conversely, suppose [itex]a_1f_1+ a_2f_2+ \cdots+ a_nf_n[/itex] is the zero functional, the one that takes all vectors in V to 0. Applying it to the basis vector [itex]x_j[/itex] gives [itex]a_j= 0[/itex], for every j. So the zero functional can be written only as [itex]0f_1+ 0f_2+ \cdots+ 0f_n[/itex], showing that these functionals are independent.


(By the way, all of this requires that V be finite dimensional. An infinite-dimensional vector space is never isomorphic to its full algebraic dual.)

simpleton
#4
Jan25-13, 01:14 AM

Hi HallsofIvy,

Thank you very much for your reply! I understand now how the linear functionals work. Could you help me with the second part as well? How do you prove that a corresponding basis [itex]\{v_1, \dots, v_n\}[/itex] exists, given a basis of the dual space?
Fredrik
#5
Jan25-13, 01:21 AM
This is how I like to do these things: First a few comments about notation.

I use the convention that basis vectors of V are written with indices downstairs, and components of vectors of V are written with indices upstairs. So we can write a basis for V as ##\{e_i\}##, and if x is in V, we have ##x=\sum_i x^i e_i##. But I'll write this as ##x=x^i e_i##. The idea is that since there's always a sum over the indices that occur twice, and never a sum over other indices, it's a waste of time and space to type summation sigmas.

For members of V*, the convention for indices is the opposite. We'll write a basis as ##\{e^i\}##, and if x is in V*, we'll write ##x=x_i e^i##.

Since you're new at this, you may find it difficult to keep track of which symbols denote members of V and which symbols denote members of V*, so for your benefit, I will write members of V as ##\vec x## and members of V* as ##\tilde x##. I found this notation helpful when I was learning this stuff, but I don't use it anymore. By the way, I'm going to omit some "for all" statements, in particular "for all i" and "for all j". I hope it will be obvious when I'm doing so.

Let ##\{\vec e_i\}## be an arbitrary basis for V. I like to define the dual basis ##\{\tilde e^i\}## by saying that for each i, ##\tilde e^i## is the member of V* such that ##\tilde e^i(\vec e_j)=\delta^i_j##. For all ##\vec x\in V##, we have
$$\tilde e^i(\vec x)=\tilde e^i(x^j\vec e_j)=x^j\tilde e^i(\vec e_j)=x^j\delta^i_j=x^i.$$ To prove that ##\{\tilde e^i\}## is a basis for V*, we must prove that it's a linearly independent set that spans V*. To see that it spans V*, let ##\tilde y\in V^*## be arbitrary. For all ##\vec x\in V##,
$$\tilde y(\vec x)=\tilde y(x^i\vec e_i)=x^i\tilde y(\vec e_i)=\tilde e^i(\vec x)\tilde y(\vec e_i) =\tilde y(\vec e_i)\tilde e^i(\vec x) =\big(\tilde y(\vec e_i)\tilde e^i\big)(\vec x).$$ So ##\tilde y=\tilde y(\vec e_i)\tilde e^i##. To see that ##\{\tilde e^i\}## is linearly independent, suppose that ##a_i\tilde e^i=0##. Then ##(a_i\tilde e^i)(\vec x)=0## for all ##\vec x\in V##. This implies that ##(a_i\tilde e^i)(\vec e_j)=0## for all j. So for all j,
$$0=(a_i\tilde e^i)(\vec e_j) =a_i \tilde e^i(\vec e_j)=a_i\delta^i_j=a_j.$$ So all the ##a_i## are 0.

Now let ##\{\tilde e^i\}## be an arbitrary basis for V*. We want to use this to construct a basis ##\{\vec e_i\}## for V such that ##\tilde e^i(\vec e_j)=\delta^i_j##. Maybe there's a more direct approach, but this is the one I see immediately:

Let's write members of V** (the dual of V*) as ##\overleftarrow x##. We will write the dual basis of ##\{\tilde e^i\}## as ##\{\overleftarrow e_i\}##. Define a function ##f:V\to V^{**}## by ##f(\vec x)(\tilde y)=\tilde y(\vec x)##. (Since ##f(\vec x)## is in V**, it takes a member of V* as input). It's not hard to show that this f is an isomorphism. Then we can define the basis for V by ##\vec e_i=f^{-1}(\overleftarrow e_i)##. Then we just verify that everything works out as intended.
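With these definitions, the duality relation we wanted does hold: since ##f(\vec e_j)=\overleftarrow e_j##, we have
$$\tilde e^i(\vec e_j)=f(\vec e_j)(\tilde e^i)=\overleftarrow e_j(\tilde e^i)=\delta^i_j.$$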
lavinia
#6
Jan25-13, 04:57 PM
A finite-dimensional vector space is isomorphic to the dual space of its dual space. Any vector v defines a linear map on dual vectors, l -> l(v).
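Explicitly, the evaluation map is $$v \mapsto \big(l \mapsto l(v)\big),$$ and it is linear and injective, so in finite dimensions it is an isomorphism of V onto V**.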

Under this identification, the dual basis of the dual basis is the original basis back again.

