Same vector space for arbitrary independent vectors?

  • #1
kelvin490
Gold Member
If we use n linearly independent vectors x1,x2...xn to form a vector space V and use another set of n linearly independent vectors y1,y2...yn to form a vector space S, is it necessary that V and S are the same? Why?

If we have a vector space Q whose dimension is n, can we say that any set of n linearly independent vectors k1,k2...kn forms a basis of Q? Why?

Suppose only real numbers are involved.
 
  • #2
kelvin490 said:
If we use n linearly independent vectors x1,x2...xn to form a vector space V and use another set of n linearly independent vectors y1,y2...yn to form a vector space S, is it necessary that V and S are the same? Why?

I see no reason to conclude ##V## and ##S## are the same.

If we have a vector space Q whose dimension is n, can we say that any set of n linearly independent vectors k1,k2...kn forms a basis of Q? Why?

Yes. If ##Q## has dimension ##n##, then any ##n## linearly independent vectors in ##Q## form a basis of ##Q##. This theorem is proved in any standard linear algebra book.
 
  • #3
kelvin490 said:
If we use n linearly independent vectors x1,x2...xn to form a vector space V and use another set of n linearly independent vectors y1,y2...yn to form a vector space S, is it necessary that V and S are the same? Why?
Consider the simplest case, where ##n=1##. Then ##x_1## and ##y_1## each span a one-dimensional subspace. If ##x_1## and ##y_1## are linearly independent, then these subspaces are distinct; indeed, they intersect trivially.
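For instance, take ##x_1 = (1,0)## and ##y_1 = (0,1)## in ##\mathbb{R}^2##: ##x_1## spans the horizontal axis, ##y_1## spans the vertical axis, and these two one-dimensional subspaces share only the zero vector, so they are certainly not the same space.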
 
  • #4
jbunniii said:
Consider the simplest case, where ##n=1##. Then ##x_1## and ##y_1## each span a one-dimensional subspace. If ##x_1## and ##y_1## are linearly independent, then these subspaces are distinct; indeed, they intersect trivially.

Thanks. It seems there is a theorem that in a vector space Q of dimension n, any set of n linearly independent vectors k1,k2...kn forms a basis of Q. How can we ensure that k1,k2...kn span Q itself and not some other vector space?

I ask because two different sets of n linearly independent vectors can span two different vector spaces of dimension n. It seems hard to ensure that k1,k2...kn must span Q.
 
  • #5
kelvin490 said:
Thanks. It seems there is a theorem that in a vector space Q of dimension n, any set of n linearly independent vectors k1,k2...kn forms a basis of Q. How can we ensure that k1,k2...kn span Q itself and not some other vector space?

I ask because two different sets of n linearly independent vectors can span two different vector spaces of dimension n. It seems hard to ensure that k1,k2...kn must span Q.
Let ##R## denote the vector space spanned by ##k_1,k_2,\ldots,k_n##. Since each ##k_j## is in ##Q## and ##Q## is a vector space, every linear combination of the ##k_j##'s is in ##Q##, hence ##R \subseteq Q##.

Now suppose that ##q## is an arbitrary element of ##Q##. Since ##k_1,k_2,\ldots,k_n## is a basis for ##Q## (by the theorem mentioned in post #2), we can write ##q = a_1 k_1 + a_2 k_2 + \cdots + a_n k_n## for some scalars ##a_1,a_2,\ldots,a_n##. Therefore ##q## is contained in the subspace spanned by ##k_1,k_2,\ldots,k_n##, which is ##R##. This shows that ##Q \subseteq R##.

Since we have shown both containments ##R \subseteq Q## and ##Q \subseteq R##, we conclude that ##Q = R##.
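A minimal numerical sketch of the two containments, assuming ##Q = \mathbb{R}^3## and three hypothetical linearly independent vectors chosen only for illustration:

import numpy as np

# Hypothetical example: Q = R^3 and three linearly independent vectors in Q.
k1 = np.array([1.0, 0.0, 1.0])
k2 = np.array([0.0, 2.0, 1.0])
k3 = np.array([1.0, 1.0, 0.0])
K = np.column_stack([k1, k2, k3])

# A nonzero determinant confirms the columns of K are linearly independent.
assert abs(np.linalg.det(K)) > 1e-12

# R ⊆ Q: any linear combination of k1, k2, k3 is again a vector of Q = R^3.
r = 2.0 * k1 - 1.0 * k2 + 0.5 * k3
print(r.shape)  # (3,), i.e. r is an element of R^3

# Q ⊆ R: an arbitrary q in Q has coordinates a with K @ a = q,
# i.e. q = a[0]*k1 + a[1]*k2 + a[2]*k3, so q lies in the span R.
q = np.array([3.0, -1.0, 2.0])
a = np.linalg.solve(K, q)
print(np.allclose(K @ a, q))  # True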
 

1. What is a vector space?

A vector space is a mathematical structure that consists of a set of vectors that can be added and multiplied by scalars. It is a fundamental concept in linear algebra and is used to model many physical and mathematical systems.

2. What does it mean for vectors to be independent?

Vectors are considered independent if no vector in the set can be expressed as a linear combination of the other vectors. Equivalently, the only way to write the zero vector as ##c_1 v_1 + c_2 v_2 + \cdots + c_n v_n## is with every coefficient ##c_i = 0##.
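For example, in ##\mathbb{R}^2## the vectors ##(1,0)##, ##(0,1)## and ##(1,1)## are pairwise non-proportional (no one is a scalar multiple of another), yet they are linearly dependent, since ##(1,1) = (1,0) + (0,1)##. So checking pairs is not enough; the definition involves all the vectors at once.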

3. Why is it important for vectors to be independent in the same vector space?

If vectors are not independent, it means that at least one of them can be expressed as a linear combination of the others. Such a vector is redundant: it contributes nothing new to the span, and coordinates with respect to the set are no longer unique, which complicates mathematical models and calculations.

4. Can any set of vectors be considered independent in a vector space?

No. A set of vectors is linearly independent only if none of them can be written as a linear combination of the others. For example, any set that contains the zero vector, or that contains more than n vectors in an n-dimensional space, is automatically dependent. To form a basis, an independent set must in addition span the entire space.

5. How do you determine if vectors are independent in the same vector space?

To determine whether vectors are independent, form the matrix whose columns are the vectors and row-reduce it (Gaussian elimination): the vectors are independent exactly when the rank equals the number of vectors. When the matrix is square (n vectors in an n-dimensional space), this is equivalent to the determinant being non-zero.
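A short numerical sketch of this check, using NumPy and hypothetical example vectors:

import numpy as np

def are_independent(vectors, tol=1e-10):
    # Stack the vectors as columns; they are linearly independent
    # exactly when the rank of the matrix equals the number of vectors.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A, tol=tol) == A.shape[1]

# Hypothetical examples in R^3:
print(are_independent([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))  # True
print(are_independent([[1, 0, 0], [0, 1, 0], [1, 1, 0]]))  # False: third = first + second

# For a square matrix (n vectors in n-dimensional space),
# a non-zero determinant gives the same answer.
A = np.column_stack([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
print(abs(np.linalg.det(A)) > 1e-10)  # True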
