simpleton
Hi, I'm learning about vector spaces and I would like to ask some questions about it.
Suppose I have a vector space V and a basis \{v_1, ..., v_n\} for V. Then there is a dual space V^* consisting of all linear functions whose domain is V and whose codomain is ℝ. The space V^* has a dual basis \{x_1, ..., x_n\}, and one way of constructing this dual basis is to let x_i be the function that returns the i^{th} coordinate of a vector when that vector is expressed in the given basis for V. The claim is that the span of these functions is all of V^*, and therefore the space and its dual space have the same dimension.
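To make this concrete for myself, I wrote a small numerical sketch in ℝ² (the basis and the helper name `coord` are just my own choices): each coordinate functional x_i is computed by solving for the coordinates of the input in the chosen basis, and then x_i(v_j) should be 1 when i = j and 0 otherwise.

```python
import numpy as np

# A hypothetical (non-standard) basis for V = R^2.
v1 = np.array([1.0, 1.0])
v2 = np.array([1.0, -1.0])
B = np.column_stack([v1, v2])  # columns are the basis vectors

def coord(i, w):
    """x_i(w): the i-th coordinate of w when expressed in the basis {v1, v2}."""
    return np.linalg.solve(B, w)[i]

# Check x_i(v_j) = 1 if i == j and 0 otherwise (up to rounding).
print(coord(0, v1), coord(0, v2))
print(coord(1, v1), coord(1, v2))
```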
I have a few questions. My textbook states that any linear function on V can be expressed as a linear combination of the coordinate functions, but it does not explain why. May I know why this is the case? I am guessing it is because a linear function must act as a linear combination of the coordinates of its input, but I'm not sure this must always hold, since a linear function is merely defined as a function that respects addition and scalar multiplication.
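To convince myself at least numerically, I checked the candidate expansion f = f(v_1) x_1 + f(v_2) x_2 on ℝ² with an arbitrary linear functional f (the basis and names here are the same hypothetical ones from my sketch above):

```python
import numpy as np

v1, v2 = np.array([1.0, 1.0]), np.array([1.0, -1.0])
B = np.column_stack([v1, v2])

def coord(i, w):
    """x_i(w): i-th coordinate of w in the basis {v1, v2}."""
    return np.linalg.solve(B, w)[i]

# An arbitrary linear functional f(w) = a . w on R^2.
a = np.array([3.0, -2.0])
f = lambda w: a @ w

# Candidate expansion: f = f(v1)*x_1 + f(v2)*x_2.
g = lambda w: f(v1) * coord(0, w) + f(v2) * coord(1, w)

w = np.array([0.7, 2.5])
print(f(w), g(w))  # the two values agree (up to rounding)
```

Of course this only checks one functional on one vector, so it's a sanity check rather than a proof, but it does match the textbook's claim.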
The textbook also says that the coordinate functions are linearly independent. By this, I think they mean that if some linear combination of the coordinate functions evaluates to 0 on every input vector, then the coefficients of all the coordinate functions must be 0. I think this makes sense, but may I know if I interpreted it correctly?
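In symbols, my interpretation (using my own notation) is:

$$\sum_{i=1}^{n} c_i x_i = 0 \text{ in } V^* \iff \sum_{i=1}^{n} c_i x_i(w) = 0 \text{ for all } w \in V,$$

and then taking w = v_j gives c_j = 0 for each j, since x_i(v_j) = 0 whenever i ≠ j.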
Finally, it is mentioned that given any basis \{x_1, ..., x_n\} of V^*, you can construct a basis for V such that the given basis acts as the coordinate functions for that basis of V. By this, I think they mean that I can find a set \{v_1, ..., v_n\} such that x_i(v_j) = 1 if i = j and x_i(v_j) = 0 otherwise. But I'm not sure why this is true. I think that in order for this to be true, the following questions have to be answered.
1) Such a set \{v_1...v_n\} exists.
2) Any vector in V can be expressed as a linear combination of the vectors.
3) The constructed vectors are linearly independent.
I am not sure how 1) can be answered. I think 2) and 3) are actually the same question, because we know that V and V^* have the same dimension, so 2) and 3) must hold together or the Linear Dependence Lemma would be violated. I am not sure how to prove 2), but I think I can prove 3). Suppose some linear combination vanishes, say \Sigma a_i v_i = 0 for some a_1, ..., a_n. Applying x_i to both sides gives a_i = 0, so all the coefficients are 0. May I know if there is a way to prove 2) directly?
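For 1), I tried a numerical sketch in ℝ² (matrix entries and names are my own hypothetical choices): if each x_i is represented by a row vector r_i, so that x_i(w) = r_i · w, then stacking the rows into a matrix X, the condition x_i(v_j) = 1 if i = j and 0 otherwise says exactly X V = I, where the columns of V are the v_j. So the v_j are the columns of X^{-1}, and X is invertible precisely because the x_i form a basis of V^*.

```python
import numpy as np

# Hypothetical basis of (R^2)^*: x_i(w) = X[i] @ w.
X = np.array([[2.0, 1.0],
              [1.0, 1.0]])  # rows are the functionals; X is invertible

# x_i(v_j) = delta_ij  <=>  X @ V = I, so the v_j are the columns of X^{-1}.
V = np.linalg.inv(X)

print(np.round(X @ V, 10))  # the identity matrix
```

This at least shows existence concretely in the finite-dimensional, matrix picture, though I'd still like to see the coordinate-free argument.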
Thank you very much!