How Do You Construct the Dual Basis in a Vector Space?

SUMMARY

This discussion focuses on constructing the dual basis of a vector space, specifically the relationship between linear transformations and the Kronecker delta. The dual space, denoted V* = Hom(V, K), is defined for a vector space V over a field K. Participants explore defining functionals vi* such that vi*(vj) = δ_ij, discuss whether Gram-Schmidt orthonormalization is needed, and define transformations that are then shown to be linear. The conversation emphasizes understanding linear transformations within the context of Hom(V, W).

PREREQUISITES
  • Understanding of vector spaces and linear transformations
  • Familiarity with the dual space concept (V* = Hom(V, K))
  • Knowledge of the Kronecker delta function
  • Basic principles of the Gram-Schmidt orthonormalization process
NEXT STEPS
  • Study the properties of linear transformations in Hom(V, W)
  • Learn about the Gram-Schmidt orthonormalization process in detail
  • Explore the concept of linear functionals and their applications
  • Investigate the role of standard bases in vector space representations
USEFUL FOR

Students of linear algebra, mathematicians interested in functional analysis, and educators teaching vector space theory will benefit from this discussion.

PsychonautQQ

Homework Statement


Let Hom(V,W) be the set of linear transformations from V to W. Define addition on Hom(V,W) by (f + g)(v) = f(v) + g(v) and scalar multiplication by (af)(v) = af(v).

If V is a vector space over a field K, define V* = Hom(V,K). This is called the dual space of V. If <v1,...,vn> is a basis of V, show that for each i there is a vi* that is an element of V* satisfying vi*(vj) = δ_ij (the Kronecker delta).

*Hopefully I typed this question out clear enough, I'm hoping that a real maverick of linear algebra happens upon this thread and understands the question*

Homework Equations




The Attempt at a Solution


I've put a lot of thought into this problem and wanted to discuss my thoughts with someone who knows what's going on. So, given that vi* is an element of V*, we need to find what mapping vi* must be such that vi*(vj) = δ_ij (the Kronecker delta).

What if vi*(vj) = ( vj / |vj| )e_i, where e_i is a unit vector in the i direction? The (vj / |vj|) will equal one, since this is the equation for normalizing a vector, and then multiplying this by e_i will be zero if j does not equal i.

But then I realized that multiplying by e_i will be zero only if I apply the Gram-Schmidt orthonormalization process first, right?

Anyway, I'm new to the subject and hope somebody can somewhat understand my thoughts and provide some insight! Thanks PF
 
PsychonautQQ said: (post quoted in full above)

If ##v = \sum_{i=1}^n c_i v_i##, think about defining ##v_j^*## as picking out the ##j##th coefficient:$$
v_j^*(v) = c_j$$See if you can do something with that.
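The coefficient-picking idea can be made concrete numerically. Here is a minimal NumPy sketch (my own illustration, not from the thread), assuming V = R^3 with an arbitrarily chosen non-orthogonal basis given by the columns of a matrix `B`: since v = B c means c = B⁻¹v, the j-th dual functional is just row j of B⁻¹.

```python
import numpy as np

# Take V = R^3 with a (non-orthogonal) basis given by the columns of B.
# This particular B is an arbitrary invertible example.
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# If v = sum_j c_j v_j, i.e. v = B c, then c = B^{-1} v, so the j-th
# dual functional v_j*(v) is row j of B^{-1} applied to v.
B_inv = np.linalg.inv(B)

def dual(j):
    """Return the j-th dual-basis functional v_j*: R^3 -> R."""
    return lambda v: B_inv[j] @ v

# Check v_i*(v_j) = delta_ij for all i, j.
delta = np.array([[dual(i)(B[:, j]) for j in range(3)] for i in range(3)])
assert np.allclose(delta, np.eye(3))
```

Note that no orthonormalization is needed: inverting the basis matrix is exactly the "express v in the basis and read off the coefficient" operation, and it works for any basis, orthogonal or not.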
 
Your thinking is reasonable, but it depends on having a "standard basis" {e1, e2, ..., eN} available. Some proofs can effectively be done by thinking of the abstract vector space as Euclidean space, in a diagram where the illustrator has set up {e1, e2, ..., eN} for us. However, abstractly phrased problems usually expect students to use abstract methods rather than do the exercise "the easy way".

My thought on this problem is that you should define a transformation P that has the desired properties, without initially claiming it is a linear transformation, and then prove that P is a linear transformation. According to the conventions of mathematical writing, the definition:

Let Hom(V,W) be the set of linear transformations from V to W.

implies that any linear transformation you create from V into W is an element of Hom(V,W). So if you can construct a linear transformation P with the desired properties, it will be an element of Hom(V,W).

Given the vector vj in V, you could say "Let Pj be the transformation Pj(vi) = d_ij". However, this does not "well define" Pj, since it doesn't tell what Pj does to an arbitrary vector R. To "well define" Pj(R) you could say: "Express R as its unique linear combination of the vectors in the basis {v1, v2, ..., vN} and let Pj(R) be the coefficient of vj in that linear combination". (I'm sure you can find a more dignified way of putting it.) You then have to prove Pj is a linear transformation. (Pj is "projection on the j-direction" if you are thinking of Euclidean space, but I wouldn't use the word "direction" when you write the proof.)
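The linearity proof suggested above amounts to one line once you invoke uniqueness of coordinates; a sketch in LaTeX (using the Pj notation from this post):

```latex
% Linearity of P_j: write R = \sum_i a_i v_i and S = \sum_i b_i v_i.
% Then R + S = \sum_i (a_i + b_i) v_i and cR = \sum_i (c\,a_i) v_i,
% and uniqueness of coordinates in a basis gives
P_j(R + S) = a_j + b_j = P_j(R) + P_j(S), \qquad
P_j(cR) = c\,a_j = c\,P_j(R).
```

The only non-trivial ingredient is that the coordinates of a vector in a fixed basis are unique, which is exactly what makes Pj well defined in the first place.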

It is interesting that your course materials, within a single problem, define Hom(V,W) as a set of linear transformations between vector spaces and then specialize to the case Hom(V,K), where K is the scalar field. The special case of Hom(V,K) is very important: its elements are called "linear functionals". You can think of certain sets of functions as vector spaces, and in advanced calculus a linear functional is a linear map that takes functions to numbers. (For example, ##L(f) = \int_0^\infty x f(x)\, dx##.)

There are also examples of Hom(V,W) in advanced calculus: many of those mysterious "transforms" that take a function f(x) to a function F(s) in a different variable are linear transformations from one vector space of functions to another.
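Once an integral transform is discretized on a grid, it literally becomes a matrix acting on a vector of samples, which makes the linearity obvious. A toy NumPy sketch (my own example: a truncated Laplace-type kernel on an arbitrary grid, not anything from the thread):

```python
import numpy as np

# Toy discrete analogue of an integral transform F(s) = ∫ K(s, x) f(x) dx,
# approximated on a grid by a matrix-vector product.
x = np.linspace(0.0, 1.0, 200)   # input variable grid
s = np.linspace(0.0, 5.0, 50)    # output variable grid
dx = x[1] - x[0]

# Kernel K(s, x) = exp(-s x) gives a (truncated) Laplace-type transform.
K = np.exp(-np.outer(s, x)) * dx   # shape (50, 200): one row per output point

def transform(f_vals):
    """Apply the discretized transform: a matrix acting on sampled f."""
    return K @ f_vals

f = np.sin(2 * np.pi * x)
g = x ** 2
a, b = 2.0, -3.0

# Linearity: T(a f + b g) = a T(f) + b T(g), up to floating-point error.
assert np.allclose(transform(a * f + b * g),
                   a * transform(f) + b * transform(g))
```

The linearity check passes for any kernel matrix, which mirrors the abstract point: the transform is an element of Hom(V,W) regardless of which kernel you choose.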
 
