1. The problem statement, all variables and given/known data

Let V be a vector space, and suppose {v_1,...,v_n} is a basis for V. Let V* denote the set of all linear transformations from V to R. (I know from previous work that V* is a vector space.) Define f_i as an element of V* by:

f_i(a_1*v_1 + a_2*v_2 + ... + a_n*v_n) = a_i

Prove that {f_1,...,f_n} gives a basis for V*.

2. Relevant equations

A basis of a subspace is a set of vectors that are linearly independent and span the subspace.

3. The attempt at a solution

By a theorem from my book, I know that for two subspaces V and W, dim V = dim W implies V = W. Since the basis for V has n vectors, dim(V) = n. Also, dim(R^n) = n. Thus, V can be equated with R^n.

Since all vectors in V are in R^n, any linear transformation from V to R must have a corresponding 1xn matrix, so we know that f_1 through f_n have corresponding 1xn matrices. Since we can take the transpose of these matrices to form vectors in R^n, we can easily equate V* with R^n, so dim V* = n.

This is another theorem from the book: Let V be any k-dimensional subspace of R^n. Then any k vectors that span V must be linearly independent, and any k linearly independent vectors in V must span V.

Now to finish up, I need to show that {f_1,...,f_n} either span V* or are linearly independent. I'm having trouble with both, though, since I don't have an explicit matrix form for any of the functions. All I know is that f_i gives the ith coordinate of a vector in V.

I can prove linear independence by showing that

c_1*f_1 + c_2*f_2 + ... + c_n*f_n = 0

implies c_1 = c_2 = ... = c_n = 0. All I can think of from here is to apply both sides to (a_1*v_1 + a_2*v_2 + ... + a_n*v_n) to get:

c_1*a_1 + c_2*a_2 + ... + c_n*a_n = 0.

But I think I must have made an error somewhere, because that equation alone clearly does not imply that c_1 = ... = c_n = 0. Can somebody please point me in the right direction?
Also, have I made any errors in my proof so far? Thanks.
You can think about this a lot more directly, without dragging matrices into it. You want to show that if f is any element of V*, then you can express it as a linear combination of the f_i, right? Consider the constants f(v_i) = a_i. Can you think of a way to use those to express f as a combination of the f_i? Remember, if two elements of V* have the same value on all of the basis vectors, then they have the same value on any vector. That shows they span. Now suppose c_1*f_1 + ... + c_n*f_n = f = 0. Can you show that this implies all of the c_i are zero? What, for example, is f(v_1)? That would show they are linearly independent.
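If it helps to see the hint concretely, here is a quick numerical sanity check (a sketch, not part of the proof: it assumes V = R^3 and an arbitrarily chosen invertible basis matrix B). With the v_i as the columns of B, the dual functionals f_i are the rows of B^{-1}, and both facts from the hint, f_i(v_j) = 1 if i = j and 0 otherwise, and f = f(v_1)*f_1 + ... + f(v_n)*f_n, can be checked directly:

```python
import numpy as np

# Hypothetical basis v_1, v_2, v_3 of R^3, stored as the columns of B.
# Any invertible matrix works here.
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

# The dual functionals f_i are the rows of B^{-1}: row i picks out
# the coefficient a_i when a vector is written in the basis v_1..v_3.
F = np.linalg.inv(B)

# Check f_i(v_j) = 1 if i = j, else 0 (the product F @ B is the identity).
assert np.allclose(F @ B, np.eye(3))

# Any functional f (a 1x3 row vector) satisfies f = sum_i f(v_i) * f_i.
f = np.array([2.0, -1.0, 3.0])
coeffs = f @ B                 # coeffs[i] = f(v_i)
f_reconstructed = coeffs @ F   # sum_i f(v_i) * (row i of F)
assert np.allclose(f, f_reconstructed)
```

Of course, the whole point of the hint is that the actual proof never needs these matrices; they only confirm the pattern of the argument in one concrete case.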
OK, I think I got it. If we take the equation c_1*f_1 + c_2*f_2 + ... + c_n*f_n = 0 and apply both sides to v_i, we get c_i = 0 for each i, thus implying that {f_1,...,f_n} is linearly independent. I'm having trouble with your proof that they span, though. When you said "f(v_i)=a_i", did you mean to write "f_i(a_i*v_i) = a_i"?
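For anyone reading along, the span argument from the hint can be written out in one computation. Take any f in V*, set a_i := f(v_i), and evaluate the candidate combination on an arbitrary vector v = b_1*v_1 + ... + b_n*v_n:

```latex
\Bigl(\sum_i a_i f_i\Bigr)(v)
  = \sum_i a_i\, f_i\Bigl(\sum_j b_j v_j\Bigr)
  = \sum_i a_i b_i
  = \sum_i b_i\, f(v_i)
  = f\Bigl(\sum_i b_i v_i\Bigr)
  = f(v).
```

Since this holds for every v, we get f = a_1*f_1 + ... + a_n*f_n, so the f_i span V*.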
Ah, never mind. I figured it out. Thank you very much for your help. I have one more related problem: Assume that V is a finite-dimensional vector space, and V* is defined as above. Prove that dim V = dim V*. I know that I can prove this if V is defined as in the previous problem, but I'm getting tripped up by the fact that a vector space can contain things like functions and matrices.
You've shown that if {v_1,...,v_n} is a basis of V, then {f_1,...,f_n} is a basis of V*, right? And there's the same number of v's as f's, isn't there?