Same vector space for arbitrary independent vectors?

SUMMARY

The discussion centers on the properties of vector spaces spanned by linearly independent vectors. It establishes that two vector spaces V and S, each spanned by a set of n linearly independent vectors, need not be the same. It also confirms that any set of n linearly independent vectors in an n-dimensional vector space Q forms a basis of Q, as supported by a standard linear algebra theorem. The discussion emphasizes that vector spaces can be distinct even when they have the same dimension.

PREREQUISITES
  • Understanding of vector spaces and linear independence
  • Familiarity with the concept of basis in linear algebra
  • Knowledge of the dimension of vector spaces
  • Basic principles of linear combinations
NEXT STEPS
  • Study the properties of vector spaces in linear algebra textbooks
  • Learn about the concept of basis and dimension in vector spaces
  • Explore the implications of linear independence on vector space structure
  • Investigate theorems related to spanning sets and bases in linear algebra
USEFUL FOR

Students of linear algebra, mathematicians, and educators seeking to deepen their understanding of vector spaces and their properties.

kelvin490
If we use n linearly independent vectors x1, x2, ..., xn to span a vector space V and another set of n linearly independent vectors y1, y2, ..., yn to span a vector space S, must V and S be the same? Why?

If we have a vector space Q whose dimension is n, can we say that any set of n linearly independent vectors k1, k2, ..., kn in Q forms a basis of Q? Why?

Suppose only real numbers are involved.
 
kelvin490 said:
If we use n linearly independent vectors x1, x2, ..., xn to span a vector space V and another set of n linearly independent vectors y1, y2, ..., yn to span a vector space S, must V and S be the same? Why?

I see no reason to conclude ##V## and ##S## are the same.

If we have a vector space Q whose dimension is n, can we say that any set of n linearly independent vectors k1, k2, ..., kn in Q forms a basis of Q? Why?

Yes. This theorem should be proven in any standard linear algebra book.
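For reference, one common formulation of that theorem (the exact wording varies from textbook to textbook) is:
$$\text{If } \dim Q = n \text{ and } k_1, k_2, \ldots, k_n \in Q \text{ are linearly independent, then } \{k_1, k_2, \ldots, k_n\} \text{ is a basis of } Q.$$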
 
kelvin490 said:
If we use n linearly independent vectors x1,x2...xn to form a vector space V and use another set of n linearly independent vectors y1,y2...yn to form a vector space S, is it necessary that V and S are the same? Why?
Consider the simplest case, where ##n=1##. Then ##x_1## and ##y_1## each span a one-dimensional subspace. If ##x_1## and ##y_1## are linearly independent, then these subspaces are distinct; indeed, they intersect trivially.
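As a concrete instance (the vectors below are just one possible choice for illustration): in ##\mathbb{R}^2##, take ##x_1 = (1,0)## and ##y_1 = (0,1)##. Then
$$V = \operatorname{span}\{x_1\} = \{(t,0) : t \in \mathbb{R}\}, \qquad S = \operatorname{span}\{y_1\} = \{(0,t) : t \in \mathbb{R}\},$$
both one-dimensional, and ##V \cap S = \{0\}##.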
 
jbunniii said:
Consider the simplest case, where ##n=1##. Then ##x_1## and ##y_1## each span a one-dimensional subspace. If ##x_1## and ##y_1## are linearly independent, then these subspaces are distinct; indeed, they intersect trivially.

Thanks. It seems there is a theorem that in a vector space Q whose dimension is n, any set of n linearly independent vectors k1, k2, ..., kn in Q forms a basis of Q. How can we ensure that k1, k2, ..., kn doesn't span some vector space other than Q?

I ask this because two different sets of n linearly independent vectors can span two different vector spaces of dimension n, so it seems hard to ensure that k1, k2, ..., kn must span Q.
 
kelvin490 said:
Thanks. It seems there is a theorem that in a vector space Q whose dimension is n, any set of n linearly independent vectors k1, k2, ..., kn in Q forms a basis of Q. How can we ensure that k1, k2, ..., kn doesn't span some vector space other than Q?

I ask this because two different sets of n linearly independent vectors can span two different vector spaces of dimension n, so it seems hard to ensure that k1, k2, ..., kn must span Q.
Let ##R## denote the vector space spanned by ##k_1,k_2,\ldots,k_n##. Since each ##k_j## is in ##Q## and ##Q## is a vector space, every linear combination of the ##k_j##'s is in ##Q##, hence ##R \subseteq Q##.

Now suppose that ##q## is an arbitrary element of ##Q##. The set ##k_1,k_2,\ldots,k_n,q## consists of ##n+1## vectors in the ##n##-dimensional space ##Q##, so it must be linearly dependent; and since ##k_1,k_2,\ldots,k_n## are themselves linearly independent, this forces ##q = a_1 k_1 + a_2 k_2 + \cdots + a_n k_n## for some scalars ##a_1,a_2,\ldots,a_n##. Therefore ##q## is contained in the subspace spanned by ##k_1,k_2,\ldots,k_n##, which is ##R##. This shows that ##Q \subseteq R##.

Since we have shown both containments ##R \subseteq Q## and ##Q \subseteq R##, we conclude that ##Q = R##.
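If it helps to see this numerically, here is a minimal sanity check of the argument using NumPy. The specific vectors b1, b2, k1, k2 are arbitrary choices for illustration (not taken from the thread): Q is the 2-dimensional subspace of ##\mathbb{R}^3## spanned by b1 and b2, and the rank computations confirm that two independent vectors lying in Q span exactly Q.

```python
import numpy as np

# Q is the 2-dimensional subspace of R^3 spanned by b1 and b2
# (illustrative choice of vectors).
b1 = np.array([1.0, 0.0, 1.0])
b2 = np.array([0.0, 1.0, 1.0])
B = np.column_stack([b1, b2])   # 3x2 matrix whose columns span Q

# k1 and k2 lie in Q, since each is a linear combination of b1 and b2.
k1 = 2.0 * b1 + 1.0 * b2
k2 = 1.0 * b1 - 3.0 * b2
K = np.column_stack([k1, k2])   # 3x2 matrix whose columns span R

# k1, k2 are linearly independent: the matrix [k1 k2] has rank 2.
assert np.linalg.matrix_rank(K) == 2

# R = Q: adjoining k1, k2 to b1, b2 does not raise the rank above dim Q = 2,
# so span{k1, k2} is contained in Q, and being 2-dimensional it equals Q.
assert np.linalg.matrix_rank(np.column_stack([B, K])) == 2

print("k1, k2 form a basis of Q")
```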
 
