Undergrad Basis of a Subspace of a Vector Space

SUMMARY

A linear vector space over a field such as ##\mathbb{R}## contains infinitely many vectors that adhere to specific rules, and has a dimension ##N##. For any such vector space, one can find a set of ##n=N## linearly independent vectors that form a basis, and there are infinitely many bases to choose from. Once an inner product is fixed, some of these bases are orthonormal: composed of unit vectors that are pairwise orthogonal. There are always multiple orthonormal bases, since an orthonormal basis can be rotated, mirrored, or renumbered and remain orthonormal. A subspace ##B## of a vector space ##A## contains fewer vectors but is still a vector space; it too has infinitely many bases and, for a fixed inner product, multiple orthonormal bases.

PREREQUISITES
  • Understanding of linear vector spaces and their properties
  • Knowledge of basis vectors and linear independence
  • Familiarity with orthonormal bases and their significance
  • Concept of subspaces and their relationship to vector spaces
NEXT STEPS
  • Explore the concept of finite fields in vector spaces
  • Learn about the implications of inner products on orthonormal bases
  • Study the application of vector spaces in coding theory, specifically Hamming codes
  • Investigate the geometric interpretations of vector spaces and subspaces
USEFUL FOR

Mathematicians, physics students, computer scientists, and anyone interested in linear algebra and its applications in various fields such as coding theory and geometric interpretations.

fog37
Hello Forum and happy new year,

Setting aside rigorous definitions, a linear vector space contains an infinity of elements called vectors that must obey certain rules. Based on the dimension ##N## (finite or infinite) of the vector space, we can always find a set of ##n=N## linearly independent vectors that form a basis. For each vector space, there is an infinity of possible bases to choose from. The basis vectors in a particular basis don't need to be orthogonal or of unit length. Among the many bases there is one specific basis that is orthonormal: it is composed of unit vectors that are pairwise orthogonal to each other. Among all the possible bases, is only that one orthonormal, or are there multiple orthonormal bases?

A subspace of a vector space is also a set that contains an infinite number of vectors but it contains "less" vectors than the original host vector space. What can we say about the bases of a subspace ##B## of a vector space ##A##? For example, if the host vector space ##A## has ##N=4##, it means that:
  • Each vector in ##A## has four components: ##a = (a_{1}, a_{2}, a_{3}, a_{4})##
  • Each possible basis contains 4 linearly independent vectors
For example, if the subspace ##B## has ##N=2##, it means that each vector in the subspace has only two components while the other two components are either constant or zero since the vectors of ##B## are just the projections of the vectors of ##A## on a certain plane, correct?

Subspace ##B##, being a vector space on its own, also has an infinity of possible bases but only one specific orthonormal basis, right?

Thanks!
 
fog37 said:
Hello Forum and happy new year,

Setting aside rigorous definitions, a linear vector space contains an infinity of elements called vectors that must obey certain rules. Based on the dimension ##N## (finite or infinite) of the vector space, we can always find a set of ##n=N## linearly independent vectors that form a basis. For each vector space, there is an infinity of possible bases to choose from. The basis vectors in a particular basis don't need to be orthogonal or of unit length. Among the many bases there is one specific basis that is orthonormal: it is composed of unit vectors that are pairwise orthogonal to each other. Among all the possible bases, is only that one orthonormal, or are there multiple orthonormal bases?
Multiple. You can always rotate an orthonormal basis by an arbitrary angle and still have an orthonormal basis. Or change the numbering. Or mirror single basis vectors ##\vec{b} \mapsto -\vec{b}##. Also, your entire argument doesn't consider the case of finite fields.
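As a quick numerical sanity check of the rotation point (a sketch using NumPy; the angle 0.7 is arbitrary and not from the post):

```python
import numpy as np

# Start from the standard orthonormal basis of R^2.
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Rotate both basis vectors by an arbitrary angle theta.
theta = 0.7  # any angle works
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
b1, b2 = R @ e1, R @ e2

# The rotated pair is still orthonormal: unit length, pairwise orthogonal.
print(np.isclose(b1 @ b1, 1.0), np.isclose(b2 @ b2, 1.0), np.isclose(b1 @ b2, 0.0))
# True True True
```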
A subspace of a vector space is also a set that contains an infinite number of vectors but it contains "less" vectors than the original host vector space. What can we say about the bases of a subspace ##B## of a vector space ##A##? For example, if the host vector space ##A## has ##N=4##, it means that:
  • Each vector in ##A## has four components: ##a = (a_{1}, a_{2}, a_{3}, a_{4})##
  • Each possible basis contains 4 linearly independent vectors
For example, if the subspace ##B## has ##N=2##, it means that each vector in the subspace has only two components while the other two components are either constant or zero...
Zero in this context, for otherwise ##\vec{0}## wouldn't be part of your subspace.
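This point can be checked directly: a sketch (assuming, purely for illustration, vectors in ##\mathbb{R}^4## whose last two components are pinned to a nonzero constant ##c##):

```python
import numpy as np

# Vectors of R^4 whose last two components are held at a nonzero constant c
# do NOT form a subspace: the set is not closed under addition.
c = 1.0
u = np.array([1.0, 2.0, c, c])
v = np.array([3.0, 4.0, c, c])
s = u + v
# s has last components 2c, not c, so s leaves the set; and the zero
# vector (last components 0, not c) was never in the set to begin with.
print(s[2] == c)  # False
```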
... since the vectors of ##B## are just the projections of the vectors of ##A## on a certain plane, correct?
You may look at it this way, but the inclusion is easier than the projection: the vectors of ##B## are already vectors of ##A##.
Subspace ##B##, being a vector space on its own, also has an infinity of possible bases but only one specific orthonormal basis, right?

Thanks!
No. See above: finite fields, rotations etc.
 
fog37 said:
Hello Forum and happy new year,

Setting aside rigorous definitions, a linear vector space contains an infinity of elements called vectors that must obey certain rules. Based on the dimension ##N## (finite or infinite) of the vector space, we can always find a set of ##n=N## linearly independent vectors that form a basis. For each vector space, there is an infinity of possible bases to choose from. The basis vectors in a particular basis don't need to be orthogonal or of unit length. Among the many bases there is one specific basis that is orthonormal: it is composed of unit vectors that are pairwise orthogonal to each other. Among all the possible bases, is only that one orthonormal, or are there multiple orthonormal bases?

A subspace of a vector space is also a set that contains an infinite number of vectors but it contains "less" vectors than the original host vector space. What can we say about the bases of a subspace ##B## of a vector space ##A##? For example, if the host vector space ##A## has ##N=4##, it means that:
  • Each vector in ##A## has four components: ##a = (a_{1}, a_{2}, a_{3}, a_{4})##
  • Each possible basis contains 4 linearly independent vectors
For example, if the subspace ##B## has ##N=2##, it means that each vector in the subspace has only two components while the other two components are either constant or zero since the vectors of ##B## are just the projections of the vectors of ##A## on a certain plane, correct?

Subspace ##B##, being a vector space on its own, also has an infinity of possible bases but only one specific orthonormal basis, right?

Thanks!

I will give a few remarks here:

1) It doesn't make sense to consider a vector space without specifying a field. For example, the vector space of the complex numbers with real scalars is a very different vector space from the complex numbers with complex scalars. They don't even have the same dimension! (The former has dimension 2, the latter dimension 1.)
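The dimension count in remark 1 can be made concrete (a minimal sketch in Python, using the real and imaginary parts as the coordinates over ##\mathbb{R}##):

```python
# The complex numbers as a vector space: over the reals, {1, i} is a basis
# (dimension 2); over the complex scalars, {1} already spans (dimension 1).
z = 3.0 - 2.0j

# Over R: the coordinates of z in the basis {1, i} are its real and imaginary parts.
a, b = z.real, z.imag
assert a * 1 + b * 1j == z   # z = a*1 + b*i with real a, b

# Over C: a single complex scalar suffices, so {1} alone spans everything.
assert z * 1 == z
```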

2) You say a subspace has 'less' elements. This is not necessarily true: every vector space is a subspace of itself.
 
Here is one consequence of @fresh_42's correction (that the orthonormal basis is not unique) that may help you. Suppose you have the vector space ##\mathbb{R}^2## with orthonormal basis ##(1,0)## and ##(0,1)##. The set of vectors of the form ##(x,x)## is a subspace with orthonormal basis ##(1/\sqrt{2}, 1/\sqrt{2})##. So the basis you are using for the subspace does not have to be part of the basis you are using for the larger space. But you can extend the subspace basis to get another basis for the whole space. In this example, ##(1/\sqrt{2}, 1/\sqrt{2})## and ##(-1/\sqrt{2}, 1/\sqrt{2})## would be an orthonormal basis for the space.
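A quick numerical check of this example (a sketch using NumPy):

```python
import numpy as np

# The diagonal subspace {(x, x)} of R^2 with basis vector b1 = (1/sqrt(2), 1/sqrt(2)),
# extended by b2 = (-1/sqrt(2), 1/sqrt(2)) to an orthonormal basis of the whole plane.
b1 = np.array([1.0, 1.0]) / np.sqrt(2)
b2 = np.array([-1.0, 1.0]) / np.sqrt(2)

print(np.isclose(b1 @ b1, 1.0))   # unit length
print(np.isclose(b2 @ b2, 1.0))
print(np.isclose(b1 @ b2, 0.0))   # orthogonal

# Any (x, x) is a scalar multiple of b1, e.g. (3, 3) = 3*sqrt(2) * b1.
print(np.allclose(3 * np.sqrt(2) * b1, [3.0, 3.0]))
```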
 
First of all, a vector space may not have a notion of length and angle, in which case it makes no sense to talk of orthonormal bases. The idea of a vector space is independent of the idea of length and angle measurement. Angle and length come from an inner product. A vector space can have many inner products, and for each, the idea of orthonormal is different. For instance, in the plane ##\mathbb{R}^2## the usual inner product tells you that ##(0,1)## and ##(1,0)## form an orthonormal basis. But with the inner product defined by ##(x,y)\cdot(z,w) = 4xz+4yw## they do not, since they each then have length 2.
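This can be verified numerically; the sketch below encodes the modified inner product with a Gram matrix (the name `inner` is just illustrative):

```python
import numpy as np

# Inner product <(x,y),(z,w)> = 4xz + 4yw, written with a Gram matrix G.
G = np.array([[4.0, 0.0],
              [0.0, 4.0]])

def inner(u, v):
    return u @ G @ v

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Under this inner product the standard basis vectors have length 2, not 1,
# so they are orthogonal but no longer orthoNORMAL.
print(np.sqrt(inner(e1, e1)))   # 2.0
print(np.sqrt(inner(e2, e2)))   # 2.0
print(inner(e1, e2))            # 0.0
```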

For a fixed inner product there is more than one orthonormal basis. For instance, for the real line with the usual dot product, ##\{1\}## and ##\{-1\}## are two orthonormal bases. In the plane ##\mathbb{R}^2##, for each unit vector ##(\cos\theta, \sin\theta)## there are two vectors, ##(\cos(\theta+\pi/2), \sin(\theta+\pi/2))## and ##(\cos(\theta-\pi/2), \sin(\theta-\pi/2))##, that extend it to an orthonormal basis. Therefore the set of orthonormal bases is infinite and forms a continuum.
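The continuum of orthonormal bases can be sampled numerically (a sketch; the nine sample angles are arbitrary):

```python
import numpy as np

# For each angle theta, (cos theta, sin theta) together with
# (cos(theta + pi/2), sin(theta + pi/2)) is an orthonormal basis of R^2
# under the usual dot product.
for theta in np.linspace(0.0, 2 * np.pi, 9):
    u = np.array([np.cos(theta), np.sin(theta)])
    v = np.array([np.cos(theta + np.pi / 2), np.sin(theta + np.pi / 2)])
    assert np.isclose(u @ u, 1.0)
    assert np.isclose(v @ v, 1.0)
    assert np.isclose(u @ v, 0.0)
print("every sampled theta gives an orthonormal basis")
```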
 
Last edited:
lavinia said:
For a fixed inner product there is more than one orthonormal basis. For instance for the real line with the usual dot product, 1 and -1 are two orthonormal bases. In the plane, ##R^{2}## for each unit vector ##(cos θ, sin θ)## there are two vectors ##(cos θ+π/2,sin θ + π/2)## and ##(cos -θ-π/2,sin -θ -π/2)## that extend it to an orthonormal basis. Therefore the set of orthonormal bases is infinite and forms a continuum.
The last two pairs of vectors would be clearer with parentheses. I think this is what you meant, @lavinia:
##(\cos(\theta + \pi/2), \sin(\theta + \pi/2))## and ##(\cos(\theta - \pi/2), \sin(\theta - \pi/2))##
 
fog37 said:
a linear vector space contains an infinity of elements called vectors that must obey certain rules.
Just to point out, the most important application of abstract vector spaces is Hamming codes, which live in the spaces ##\mathbb{Z}_2^{2^n-1}## (or ##\mathbb{Z}_2^{2^n}## for SECDED). These are finite vector spaces.
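For concreteness, here is a sketch of the ##n=3## case, the classic ##[7,4]## Hamming code in ##\mathbb{Z}_2^7## (the particular generator and parity-check matrices below are one common systematic choice, not taken from the post):

```python
import numpy as np

# The classic [7,4] Hamming code: a 4-dimensional subspace of the finite
# vector space Z_2^7 (length 7 = 2^3 - 1). All arithmetic is mod 2.
G = np.array([[1, 0, 0, 0, 1, 1, 0],   # generator matrix, systematic form [I | P]
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],   # parity-check matrix [P^T | I], H G^T = 0 mod 2
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

msg = np.array([1, 0, 1, 1])
codeword = (msg @ G) % 2

# Every codeword lies in the kernel of H over Z_2 ...
print(((H @ codeword) % 2).tolist())    # [0, 0, 0]

# ... and a single-bit error produces a nonzero syndrome.
corrupted = codeword.copy()
corrupted[2] ^= 1                       # flip the third bit
print(((H @ corrupted) % 2).tolist())   # [0, 1, 1] — nonzero, flagging the error
```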
 
pwsnafu said:
Just to point out, the most important application of abstract vector spaces is Hamming codes, which live in the spaces ##\mathbb{Z}_2^{2^n-1}## (or ##\mathbb{Z}_2^{2^n}## for SECDED). These are finite vector spaces.
I would say that the application of vector spaces is too universal to single out one as "most important".
 
FactChecker said:
I would say that the application of vector spaces is too universal to single out one as "most important".

I couldn't agree more.
 
