Simple question about the dual space

  • Thread starter etotheipi
  • #1
etotheipi
Gold Member
2019 Award
2,703
1,615
Rookie question: for a vector space ##V## with basis ##v_1, v_2, \dots, v_n##, the dual space ##V^*## is the set of linear functionals ##\varphi: V \rightarrow \mathbb{R}##. The dual basis satisfies ##\varphi^i(v_j) = \delta^i_j##. Is the action of any dual vector on any vector always an inner product, ##\varphi(v) = \langle \varphi, v \rangle##?
 
Last edited:

Answers and Replies

  • #2
Infrared
Science Advisor
Gold Member
788
408
I'm a little confused by your question: the equation ##\langle\varphi,v\rangle=\varphi(v)## cannot represent an inner product since the two inputs to ##\langle\cdot,\cdot\rangle## come from different vector spaces.

For finite-dimensional ##V##, picking a basis for ##V## determines an isomorphism ##V\cong V^*## (take the basis to the corresponding dual basis). If ##x\mapsto\varphi_x## is this isomorphism taking a vector to its corresponding dual vector, then ##\langle x,y\rangle:=\varphi_x(y)## is an inner product on ##V##. You can check that it is symmetric and positive-definite.

In fact, it is the same inner product that you get by declaring your basis to be orthonormal, and extending to all of ##V## by linearity.
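To spell that out with a quick coordinate computation: writing ##x=\sum_i x^i v_i## and ##y=\sum_j y^j v_j## in the chosen basis, we have ##\varphi_x=\sum_i x^i\varphi^i##, so
$$\langle x,y\rangle=\varphi_x(y)=\sum_{i,j}x^i y^j\,\varphi^i(v_j)=\sum_{i,j}x^i y^j\,\delta^i_j=\sum_i x^i y^i,$$
which is just the dot product of the coordinate vectors, i.e. the inner product in which ##v_1,\dots,v_n## are orthonormal.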

Does this answer your question?
 
  • Like
Likes Abhishek11235 and etotheipi
  • #3
13,560
10,662
An inner product would require both arguments to come from the same vector space.

In any field, e.g. ##\mathbb{R}##, we have ##0\neq 1##, hence we can always define the ##\varphi^i##, and if the vector space is finite-dimensional, those linear forms span ##V^*##. It gets interesting when the dimension is no longer finite. I assume that we will need the axiom of choice a lot, and I don't see how definiteness will work.
 
  • Like
Likes etotheipi
  • #4
mathwonk
Science Advisor
Homework Helper
11,041
1,231
If V, W are finite-dimensional k-vector spaces of the same dimension, a function < , >: V x W --> k is called a non-singular pairing if it is symmetric, bilinear, and "positive definite". (By that I mean that the only w that pairs to zero with every v is w = 0, and vice versa.) There is always a natural such pairing if W = V*, as you know. In fact, this is essentially the only case where there is such a pairing, in the following sense: if V x W --> k is a non-singular pairing, then the map taking a vector w in W to the linear function < , w>: V --> k on V is an isomorphism from W to V*.

If W = V, a non-singular pairing V x V --> k is called an inner product on V, and thus defines an isomorphism from V to V*. Conversely, any isomorphism from V to V* defines a non-singular pairing, i.e. an inner product, V x V --> k.
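To sketch why non-singularity gives that isomorphism: the map ##W\to V^*## sending ##w## to the functional ##\langle\cdot,w\rangle## is linear, and its kernel consists of exactly those ##w## that pair to zero with every ##v##, which by non-singularity is only ##w=0##. An injective linear map between spaces of equal finite dimension is an isomorphism, since ##\dim W=\dim V=\dim V^*##.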

Edit: oops, this doesn't make sense. What do I mean by "symmetric" for a pairing between different spaces? I need to think about this and repost tomorrow after I get some sleep on it. This shows what happens when I claim "plausible" things without proving them.

Maybe I just want bilinear and non-singular ("positive definite"). But then why would it be symmetric when V = W? Maybe it won't have to be. But the point is supposed to be that an inner product is a special case of a bilinear pairing, and defines an isomorphism with the dual space. But where does symmetry come in?
 
Last edited:
  • Like
Likes etotheipi
  • #5
etotheipi
Gold Member
2019 Award
2,703
1,615
For finite-dimensional ##V##, picking a basis for ##V## determines an isomorphism ##V\cong V^*## (take the basis to the corresponding dual basis). If ##x\mapsto\varphi_x## is this isomorphism taking a vector to its corresponding dual vector, then ##\langle x,y\rangle:=\varphi_x(y)## is an inner product on ##V##. You can check that it is symmetric and positive-definite.
Ah, okay, I think I understand. For something like a Hilbert space ##V## and its dual ##V^*##, we have an isomorphism ##u \mapsto \varphi_u##. Then, ##\varphi_u(v)## is the same as taking the inner product of ##u## with the vector ##v##, both of which lie in the space ##V##; i.e. ##\varphi_u(v) = \langle u | v \rangle##. I do suppose this is limited to finite dimensional spaces, and we of course need ##V## to be an inner product space.
 
  • #6
Infrared
Science Advisor
Gold Member
788
408
Well, first of all, I wouldn't use the term "Hilbert space" when you only mean to refer to finite-dimensional vector spaces.

Anyway, I think you're still mixing some concepts.

If you have a basis ##v_1,\ldots,v_n## for ##V##, then you have a corresponding dual basis ##\varphi^1,\ldots,\varphi^n## for ##V^*,## defined by the rule ##\varphi^i(v_j)=\delta^i_j,## and by linearity you get an isomorphism ##V\to V^*, v\mapsto \varphi_v.## You do not need an inner product to start with to define this. If you already have an inner product ##\langle\cdot,\cdot\rangle## on ##V##, there is no guarantee that ##\langle u,v\rangle=\varphi_u(v).## The RHS is an inner product, but might have no relation to the inner product you start with (indeed, it is the inner product you get from demanding that your basis ##v_1,\ldots,v_n## is orthonormal).

On the other hand if you do start with a finite-dimensional inner product space ##(V,\langle\cdot,\cdot\rangle)##, then we can define an isomorphism ##V\to V^*, v\mapsto \varphi_v## by the rule ##\varphi_v(u)=\langle u,v\rangle.## This map is more natural than the map in the previous paragraph (also called ##\varphi##) since it doesn't depend on a choice of basis. And then the equation you want is true by definition.
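A concrete illustration of the first point: take ##V=\mathbb{R}^2## with the standard dot product and the (non-orthonormal) basis ##v_1=(1,0)##, ##v_2=(1,1)##. The basis-induced map gives ##\varphi_{v_1}(v_2)=\varphi^1(v_2)=0## by definition of the dual basis, while the dot product gives ##\langle v_1,v_2\rangle=1##. In general (over ##\mathbb{R}##) the two pairings agree exactly when the chosen basis is orthonormal for the given inner product.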


Maybe I just want bilinear and non-singular ("positive definite"). But then why would it be symmetric when V = W? Maybe it won't have to be. But the point is supposed to be that an inner product is a special case of a bilinear pairing, and defines an isomorphism with the dual space. But where does symmetry come in?
Forgive me if I'm only saying things that you already know.
Positive-definiteness of a bilinear form (symmetry is usually assumed) usually means that ##(x,x)\geq 0## with equality only for ##x=0##, which is stronger than non-degeneracy. You're also right that symmetry doesn't follow from bilinearity and non-degeneracy. Thinking concretely, the form ##(x,y):=x\cdot Ay## on ##\mathbb{R}^n## is non-degenerate when ##A## is nonsingular, (anti-)symmetric when ##A## is, and positive-definite when ##A## is symmetric with positive eigenvalues.
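For instance, on ##\mathbb{R}^2## take
$$A=\begin{pmatrix}0&1\\-1&0\end{pmatrix}.$$
Since ##A## is nonsingular, ##(x,y)=x\cdot Ay=x_1y_2-x_2y_1## is non-degenerate, yet it is antisymmetric and ##(x,x)=0## for every ##x##. So non-degeneracy alone still gives an isomorphism ##V\to V^*##, but provides neither symmetry nor positive-definiteness.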
 
  • Informative
Likes etotheipi
  • #7
etotheipi
Gold Member
2019 Award
2,703
1,615
If you have a basis ##v_1,\ldots,v_n## for ##V##, then you have a corresponding dual basis ##\varphi^1,\ldots,\varphi^n## for ##V^*,## defined by the rule ##\varphi^i(v_j)=\delta^i_j,## and by linearity you get an isomorphism ##V\to V^*, v\mapsto \varphi_v.## You do not need an inner product to start with to define this. If you already have an inner product ##\langle\cdot,\cdot\rangle## on ##V##, there is no guarantee that ##\langle u,v\rangle=\varphi_u(v).## The RHS is an inner product, but might have no relation to the inner product you start with (indeed, it is the inner product you get from demanding that your basis ##v_1,\ldots,v_n## is orthonormal).

On the other hand if you do start with a finite-dimensional inner product space ##(V,\langle\cdot,\cdot\rangle)##, then we can define an isomorphism ##V\to V^*, v\mapsto \varphi_v## by the rule ##\varphi_v(u)=\langle u,v\rangle.## This map is more natural than the map in the previous paragraph (also called ##\varphi##) since it doesn't depend on a choice of basis. And then the equation you want is true by definition.
Thanks for taking the time to explain! What you wrote makes sense. I think I'll need to study a bit more before asking any more questions :smile:
 
  • #8
Math_QED
Science Advisor
Homework Helper
2019 Award
1,701
720
Ah, okay, I think I understand. For something like a Hilbert space ##V## and its dual ##V^*##, we have an isomorphism ##u \mapsto \varphi_u##. Then, ##\varphi_u(v)## is the same as taking the inner product of ##u## with the vector ##v##, both of which lie in the space ##V##; i.e. ##\varphi_u(v) = \langle u | v \rangle##. I do suppose this is limited to finite dimensional spaces, and we of course need ##V## to be an inner product space.
This isomorphism holds for every real Hilbert space, including infinite-dimensional ones, but there you consider the topological dual instead of the algebraic one (in the finite-dimensional case the two coincide). This result is the Riesz representation theorem.

For complex vector spaces, the map no longer preserves scalar multiplication; instead it sends a scalar to its complex conjugate, i.e. it is conjugate-linear.
 
  • Like
Likes etotheipi
  • #9
etotheipi
Gold Member
2019 Award
2,703
1,615
This isomorphism holds for every real Hilbert space, including infinite-dimensional ones, but there you consider the topological dual instead of the algebraic one (in the finite-dimensional case the two coincide). This result is the Riesz representation theorem.

For complex vector spaces, the map no longer preserves scalar multiplication; instead it sends a scalar to its complex conjugate, i.e. it is conjugate-linear.
Thanks! Funnily enough, I was trying to learn this from Chapter 3; however, when I searched for "Riesz" I learned that Axler goes into a lot more detail in Chapter 6 (page 188 in my book). He says exactly what you and @Infrared did:

"Riesz Representation Theorem: Suppose ##V## is finite-dimensional and ##\varphi## is a linear functional on ##V##. Then there is a unique vector ##u \in V## s.t. $$\varphi(v) = \langle v, u \rangle$$for every ##v \in V##"

and the proof is given below. Interesting about the complex vector spaces, though! I'll see if I can find anything about that in the book... thanks!
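For anyone following along, the proof is short (essentially the argument Axler gives, with the inner product linear in the first slot): pick an orthonormal basis ##e_1,\dots,e_n## of ##V## and set ##u=\sum_i\overline{\varphi(e_i)}\,e_i##. Then for any ##v=\sum_i\langle v,e_i\rangle e_i##,
$$\varphi(v)=\sum_i\langle v,e_i\rangle\varphi(e_i)=\Big\langle v,\sum_i\overline{\varphi(e_i)}\,e_i\Big\rangle=\langle v,u\rangle,$$
and ##u## is unique because ##\langle v,u-u'\rangle=0## for all ##v## forces ##u=u'## (take ##v=u-u'##).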
 
  • Like
Likes Math_QED
  • #10
Math_QED
Science Advisor
Homework Helper
2019 Award
1,701
720
Thanks! Funnily enough, I was trying to learn this from Chapter 3; however, when I searched for "Riesz" I learned that Axler goes into a lot more detail in Chapter 6 (page 188 in my book). He says exactly what you and @Infrared did:

"Riesz Representation Theorem: Suppose ##V## is finite-dimensional and ##\varphi## is a linear functional on ##V##. Then there is a unique vector ##u \in V## s.t. $$\varphi(v) = \langle v, u \rangle$$for every ##v \in V##"

and the proof is given below. Interesting about the complex vector spaces, though! I'll see if I can find anything about that in the book... thanks!
You can verify it for yourself. Assuming the inner product is linear in the first component (as Axler's version of Riesz seems to suggest), we have ##\varphi_{\lambda u}=\overline{\lambda}\varphi_u##. Indeed, given ##v\in V##, we have
$$\varphi_{\lambda u}(v)= \langle v,\lambda u\rangle =\overline{\lambda}\langle v,u\rangle =\overline{\lambda}\varphi_u(v)$$ from which the claim follows.
 
  • Like
Likes etotheipi
  • #11
etotheipi
Gold Member
2019 Award
2,703
1,615
You can verify it for yourself. Assuming the inner product is linear in the first component (as Axler's version of Riesz seems to suggest), we have ##\varphi_{\lambda u}=\overline{\lambda}\varphi_u##. Indeed, given ##v\in V##, we have
$$\varphi_{\lambda u}(v)= \langle v,\lambda u\rangle =\overline{\lambda}\langle v,u\rangle =\overline{\lambda}\varphi_u(v)$$ from which the claim follows.
Thanks! I did have a follow-up question... why does the inner product need to be linear in one argument but conjugate-linear in the other? And furthermore, is it purely by convention linear in the first argument and conjugate-linear in the second, because I don't see any reason why it couldn't be defined to be the other way around?
 
  • #12
Infrared
Science Advisor
Gold Member
788
408
Suppose that your form is linear in both components. Then ##\langle iv,iv\rangle=-\langle v,v\rangle ## would pose a problem for the condition ##\langle x,x\rangle\geq 0## for all ##x##. For example, think about the standard form ##\langle x,y\rangle=\sum x_i\overline{y_i}## in ##\mathbb{C}^n##.
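With the conjugate-linear convention the problem goes away, since then
$$\langle x,x\rangle=\sum_i x_i\overline{x_i}=\sum_i|x_i|^2\geq 0,$$
with equality only when ##x=0##.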

It is a matter of convention which component you choose to be conjugate-linear.
 
  • Like
Likes etotheipi and Math_QED
  • #13
Math_QED
Science Advisor
Homework Helper
2019 Award
1,701
720
Thanks! I did have a follow-up question... why does the inner product need to be linear in one argument but conjugate-linear in the other? And furthermore, is it by convention always linear in the first argument and conjugate-linear in the second, because I don't see any reason why it couldn't be the other way around?
It is a convention. It can be the other way around as well. I have seen both conventions being used, although I feel linearity in the first component is more popular for mathematicians.

The answer to your other question is that bilinearity and positive definiteness don't go well together over the complex field, as @Infrared excellently demonstrated.
 
  • Like
Likes etotheipi
  • #14
etotheipi
Gold Member
2019 Award
2,703
1,615
Suppose that your form is linear in both components. Then ##\langle iv,iv\rangle=-\langle v,v\rangle ## would pose a problem for the condition ##\langle x,x\rangle\geq 0## for all ##x##.
That's a neat counter-example! Now I understand why it needs to take the form ##\langle x,y\rangle=\sum x_i\overline{y_i}##.

It is a convention. It can be the other way around as well. I have seen both conventions being used, although I feel linearity in the first component is more popular for mathematicians.
Cool, in any case I suppose it's probably clear from context.

I've run out of questions... thanks for your help! You guys are awesome :smile:
 
  • Like
Likes Infrared and Math_QED
  • #15
Well, first of all, I wouldn't use the term "Hilbert space" when you only mean to refer to finite-dimensional vector spaces.
I think that, mathematically, a Hilbert space can be finite-dimensional. I had never seen a physical system that had a finite-dimensional Hilbert space until just a few years ago. But a professor of mathematics writing mathematical physics models started using one to solve new physics problems. It took me a while to understand it, and then the technique seemed brilliant. If you want a challenge, look at causal-fermion-system.com. Very few physicists seem to be fluent in, or even aware of, this re-framing of physics in these new mathematical forms. Everyone I have talked to, even here in southern California, just says that it is incomprehensible, or not mainstream, or "not my specialty."
 
Last edited by a moderator:
  • #16
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
14,710
6,954
It is a convention. It can be the other way around as well. I have seen both conventions being used, although I feel linearity in the first component is more popular for mathematicians.
The Dirac notation relies on linearity in the second argument, which is why physicists prefer that!
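Concretely, in Dirac notation ##\langle u|## denotes the functional ##v\mapsto\langle u|v\rangle##, and writing ##\langle u|\lambda v\rangle=\lambda\langle u|v\rangle## only works if the inner product is linear in the second slot and conjugate-linear in the first.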
 
  • Like
Likes Math_QED, fresh_42 and etotheipi
  • #17
Infrared
Science Advisor
Gold Member
788
408
@Physics4Funn

Yes, of course a Hilbert space ##\textit{can}## be finite-dimensional, but there's no reason to say "Hilbert space" as opposed to just "inner product space" when your spaces are necessarily finite-dimensional, since then the extra condition of completeness is vacuous.

Anyway, I don't know enough physics to comment on the rest of your post.
 
  • Like
Likes Math_QED
  • #18
Office_Shredder
Staff Emeritus
Science Advisor
Gold Member
3,898
171
@Physics4Funn

Yes, of course a Hilbert space ##\textit{can}## be finite-dimensional, but there's no reason to say "Hilbert space" as opposed to just "inner product space" when your spaces are necessarily finite-dimensional, since then the extra condition of completeness is vacuous.
Hilbert space is much shorter to say and write.
 
