Same vector space for arbitrary independent vectors?


Discussion Overview

The discussion revolves around the relationship between different sets of linearly independent vectors and the vector spaces they generate. Participants explore whether two distinct sets of linearly independent vectors can form the same vector space and the implications for bases of vector spaces, particularly in the context of real numbers.

Discussion Character

  • Debate/contested
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • Some participants question whether two vector spaces formed by different sets of linearly independent vectors must be the same, suggesting that they can be distinct.
  • Others assert that any set of n linearly independent vectors in an n-dimensional vector space forms a basis for that space, citing a theorem found in standard linear algebra texts.
  • A participant discusses the simplest case of n=1, arguing that distinct linearly independent vectors span different one-dimensional subspaces that intersect trivially.
  • Concerns are raised about how to ensure that a set of n linearly independent vectors spans the given n-dimensional space Q rather than some other n-dimensional space.
  • One participant shows that the span of the vectors is contained in Q (since each vector lies in Q) and also contains Q (since every vector of Q is a linear combination of them), so the two spaces must be equal.

Areas of Agreement / Disagreement

Participants initially express differing views on whether two sets of linearly independent vectors must span the same space; a one-dimensional counterexample settles that they need not. There is agreement on the theorem regarding bases of vector spaces, and the containment argument at the end resolves the remaining doubt about which space a given set of n linearly independent vectors spans.

Contextual Notes

Participants acknowledge the complexity of ensuring that different sets of linearly independent vectors do not form distinct vector spaces, raising questions about the assumptions involved in defining vector spaces and bases.

kelvin490
Gold Member
If we use n linearly independent vectors x1, x2, ..., xn to form a vector space V and use another set of n linearly independent vectors y1, y2, ..., yn to form a vector space S, is it necessary that V and S are the same? Why?

If we have a vector space Q whose dimension is n, can we say that any set of n linearly independent vectors k1, k2, ..., kn can form a basis of Q? Why?

Suppose only real numbers are involved.
 
kelvin490 said:
If we use n linearly independent vectors x1, x2, ..., xn to form a vector space V and use another set of n linearly independent vectors y1, y2, ..., yn to form a vector space S, is it necessary that V and S are the same? Why?

I see no reason to conclude ##V## and ##S## are the same.
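
For example, in ##\mathbb{R}^3## one can take ##x_1 = (1,0,0)##, ##x_2 = (0,1,0)## and ##y_1 = (1,0,0)##, ##y_2 = (0,0,1)##. Each pair is linearly independent, yet ##V = \operatorname{span}\{x_1, x_2\}## is the ##xy##-plane and ##S = \operatorname{span}\{y_1, y_2\}## is the ##xz##-plane, so ##V \neq S##.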

If we have a vector space Q whose dimension is n, can we say that any set of n linearly independent vectors k1, k2, ..., kn can form a basis of Q? Why?

Yes. This theorem should be proven in any standard linear algebra book.
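
As a quick illustration of that theorem: in ##Q = \mathbb{R}^2## (so ##n = 2##), the vectors ##k_1 = (1,1)## and ##k_2 = (1,-1)## are linearly independent, and any ##(a,b) \in \mathbb{R}^2## can be written as ##(a,b) = \tfrac{a+b}{2}(1,1) + \tfrac{a-b}{2}(1,-1)##, so they form a basis of ##Q##. Note that the theorem assumes the vectors ##k_1, k_2, \ldots, k_n## lie in ##Q##.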
 
kelvin490 said:
If we use n linearly independent vectors x1, x2, ..., xn to form a vector space V and use another set of n linearly independent vectors y1, y2, ..., yn to form a vector space S, is it necessary that V and S are the same? Why?
Consider the simplest case, where ##n=1##. Then ##x_1## and ##y_1## each span a one-dimensional subspace. If ##x_1## and ##y_1## are linearly independent, then these subspaces are distinct; indeed, they intersect trivially.
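
Concretely, in ##\mathbb{R}^2## take ##x_1 = (1,0)## and ##y_1 = (0,1)##: these are linearly independent, ##\operatorname{span}\{x_1\}## is the ##x##-axis, ##\operatorname{span}\{y_1\}## is the ##y##-axis, and the two lines meet only at the origin. (If instead ##y_1## were a nonzero scalar multiple of ##x_1##, the two spans would coincide.)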
 
jbunniii said:
Consider the simplest case, where ##n=1##. Then ##x_1## and ##y_1## each span a one-dimensional subspace. If ##x_1## and ##y_1## are linearly independent, then these subspaces are distinct; indeed, they intersect trivially.

Thanks. It seems there is a theorem that a vector space Q whose dimension is n can have any set of n linearly independent vectors k1, k2, ..., kn as its basis. How can we ensure that k1, k2, ..., kn don't form a vector space other than Q?

I have this question because two different sets of n linearly independent vectors can form two different vector spaces of dimension n. It seems hard to ensure that k1, k2, ..., kn must form Q.
 
kelvin490 said:
Thanks. It seems there is a theorem that a vector space Q whose dimension is n can have any set of n linearly independent vectors k1, k2, ..., kn as its basis. How can we ensure that k1, k2, ..., kn don't form a vector space other than Q?

I have this question because two different sets of n linearly independent vectors can form two different vector spaces of dimension n. It seems hard to ensure that k1, k2, ..., kn must form Q.
Let ##R## denote the vector space spanned by ##k_1,k_2,\ldots,k_n##. Since each ##k_j## is in ##Q## and ##Q## is a vector space, every linear combination of the ##k_j##'s is in ##Q##, hence ##R \subseteq Q##.

Now suppose that ##q## is an arbitrary element of ##Q##. Since ##k_1,k_2,\ldots,k_n## is a basis for ##Q##, we can write ##q = a_1 k_1 + a_2 k_2 + \cdots + a_n k_n## for some scalars ##a_1,a_2,\ldots a_n##. Therefore ##q## is contained in the subspace spanned by ##k_1,k_2,\ldots,k_n##, which is ##R##. This shows that ##Q \subseteq R##.

Since we have shown both containments ##R \subseteq Q## and ##Q \subseteq R##, we conclude that ##Q = R##.
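
For a concrete instance of why the hypothesis that each ##k_j## lies in ##Q## matters: take ##Q = \{(a, b, 0) : a, b \in \mathbb{R}\} \subset \mathbb{R}^3##, which has dimension 2. The vectors ##k_1 = (1,0,0)## and ##k_2 = (1,1,0)## are linearly independent and lie in ##Q##, so they span exactly ##Q##. By contrast, ##(1,0,0)## and ##(0,0,1)## are also linearly independent, but the second does not lie in ##Q##, and they span a different two-dimensional subspace.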
 
