Isomorphism between a linear space and its dual

Summary
The discussion revolves around proving the bijectivity of a linear map between a finite-dimensional vector space and its dual. The map is defined as f(x) = (x|·), which is linear in the first argument and conjugate linear in the second. The user successfully demonstrated that f being bijective implies a specific inner product condition, but struggles with proving surjectivity. A key insight is that the proof hinges on the dimensionality of the spaces, where both X and X* have the same dimension, allowing the conclusion that injectivity implies surjectivity. The conversation also touches on the impact of different conventions for defining the inner product on the proof's structure.
Geofleur
I have been trying to prove the following theorem, for a finite dimensional vector space ## X ## and its dual ## X^* ##:

Let ## f:X\rightarrow X^* ## be given by ## f(x) = (x|\cdot) ##, where ## (x|\cdot) ## is linear in the first argument and conjugate linear in the second (so I am using the mathematicians' convention here). Then ## f ## is bijective if and only if [## (y|x)=0 ## for all ## x \in X ## implies that ## y = 0 ##].

I have been able to show the "only if" part - that ## f ## being bijective implies the statement about the inner product. For the "if" part of the proof (where we assume that [## (y|x)=0 ## for every ## x \in X ## implies ## y = 0 ##] and prove that ## f ## is bijective), I am able to prove that ## f ## is injective, but I am running into a problem in proving that ## f ## is surjective. Here is my attempt:

Let ##x^*## be an arbitrary linear functional in ## X^*##. Because the space ## X ## is finite dimensional, we may choose an orthonormal basis ## \{e_i\}_{i=1}^n ## and write an arbitrary vector in ## X ## as ## x = \sum_{i=1}^{n}x_i e_i ##. Then we have ## x^*(x) = x^*(\sum_{i=1}^{n} x_i e_i) = \sum_{i=1}^{n}x_i x^*(e_i) ##. Now we need to find a ## y \in X ## such that ## f(y) = x^*##, i.e., such that ## (y|x) = x^*(x) ## for all ## x \in X ##. For any ## y \in X ## we can write ## y = \sum_{j=1}^{n} y_j e_j ##, so maybe choosing the ## y_j ## appropriately will do the job. We have ## (y|x) = (\sum_{j=1}^{n} y_j e_j | \sum_{i=1}^{n} x_i e_i) = \sum_{j} \sum_{i} y_j \overline{x}_i (e_j|e_i) = \sum_{i} y_i \overline{x}_i ##. The problem is that I want this to equal ## x^*(x) = \sum_{i=1}^{n}x_i x^*(e_i) ##, but the former expression involves ## \overline{x}_i ## while the latter involves ## x_i ##. If I had used the physicists' convention and made the inner product conjugate linear in the first argument instead, this problem would not have occurred. But it seems strange that a theorem like this would be dependent upon a convention like that. Any insights?
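The mismatch can be seen concretely: with this convention the map ##y \mapsto (x|y)## pulls out ##\overline{\alpha}## rather than ##\alpha##, so ##(x|\cdot)## is conjugate linear and is not an element of ##X^*## at all. A minimal numerical sketch, using the standard inner product on ##\mathbb{C}^2## and made-up vectors (nothing here comes from the book):

```python
import numpy as np

# Mathematicians' convention: linear in the first slot,
# conjugate linear in the second.
def inner(x, y):
    return np.sum(x * np.conj(y))

x = np.array([1.0 + 2.0j, 3.0 - 1.0j])
y = np.array([0.5 - 1.0j, 2.0 + 0.0j])
a = 2.0 + 3.0j

# The map y -> (x|y) pulls out conj(a), not a, so it is conjugate
# linear in y rather than linear:
assert np.isclose(inner(x, a * y), np.conj(a) * inner(x, y))
assert not np.isclose(inner(x, a * y), a * inner(x, y))
```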
 
Samy_A
Isn't the problem that in ##(y|x) = x^*(x)##, the LHS is conjugate linear in ##x##, while the RHS is linear in ##x##?

In other words, shouldn't f be defined as ##f(x) = (\cdot|x)##?
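With ##f(x) = (\cdot|x)##, the surjectivity computation in the first post does go through; a sketch, in the same notation: given ## x^* \in X^* ## with ## c_i = x^*(e_i) ##, set ## y = \sum_{j=1}^{n} \overline{c}_j\, e_j ##. Then for any ## x = \sum_{i} x_i e_i ##,
$$ f(y)(x) = (x|y) = \sum_{i} x_i\,\overline{y}_i = \sum_{i} x_i\, c_i = x^*(x), $$
so ## f(y) = x^* ##. (Note that ## f ## defined this way is conjugate linear rather than linear, since ## f(\alpha x) = \overline{\alpha}\, f(x) ##.)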
 
micromass
Another way to get surjectivity is to note that ##X## and ##X^*## have the same dimension.
 
@Samy_A: That makes sense - it means I would need to make a correction in the book I am reading, though (Analysis, Manifolds, and Physics, Choquet-Bruhat).

@micromass: Do you mean I was wrongly assuming that ## X ## and ## X^* ## have the same dimension? I thought that all I had assumed is that the action of ## x^* ## is completely determined by its effect on the basis vectors of ## X ##. Did I sneak something else in without realizing it?
 
Geofleur said:
@micromass: Do you mean I was wrongly assuming that ## X ## and ## X^* ## have the same dimension? Did I sneak something else in without realizing it?

No, it's correct. In finite dimensions, ##X## and ##X^*## have the same dimension. But once you know that, the proof is done. See the alternative theorem. This says that a linear map between two spaces of the same finite dimension is an isomorphism iff it is surjective iff it is injective.
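One standard way to spell out that count is rank–nullity: for a linear map ## f: X \to X^* ## between spaces with ## \dim X = \dim X^* = n ##,
$$ \dim X = \dim \ker f + \dim \operatorname{ran} f. $$
If ## f ## is injective then ## \ker f = \{0\} ##, so ## \dim \operatorname{ran} f = n = \dim X^* ## and ## f ## is surjective. (The same count works if ## f ## is conjugate linear, since kernels and ranges are still subspaces.)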
 
Ah - I understand! Thanks :-D
 
If you're interested in quantum mechanics or functional analysis, you probably will want to think about how much this result generalizes to infinite dimensions...
 
Geofleur said:
I have been trying to prove the following theorem, for a finite dimensional vector space ## X ## and its dual ## X^* ##...
WWGD
Are you assuming a default inner product is defined?
 
WWGD said:
Are you assuming a default inner product is defined?

I think so? I am defining a sesquilinear mapping ## X \times X \rightarrow \mathbb{C} ##, ## (x,y) \mapsto (x|y) ##, where

## (x|y) = \overline{(y|x)} ##, and
## (\alpha x + \beta y|z) = \alpha(x|z) + \beta(y|z) ##.

I ended up just writing a note in the margin pointing out that ## (x|\cdot) \in X^* ## if one uses the physics convention; otherwise, we have ## (\cdot|x) \in X^* ## as Samy_A suggested.
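A small numerical sketch of those two axioms, plus the representing vector for a functional, using the standard product ##(x|y) = \sum_i x_i \overline{y}_i## on ##\mathbb{C}^3##; the vectors and the functional below are made-up test data, not anything from the book:

```python
import numpy as np

# (x|y) = sum_i x_i conj(y_i): linear in the first slot,
# conjugate linear in the second.
def inner(x, y):
    return np.sum(x * np.conj(y))

rng = np.random.default_rng(0)
x, y, z = (rng.normal(size=3) + 1j * rng.normal(size=3) for _ in range(3))
a, b = 2.0 - 1.0j, 0.5 + 3.0j

# The two defining properties quoted above:
assert np.isclose(inner(x, y), np.conj(inner(y, x)))           # (x|y) = conj((y|x))
assert np.isclose(inner(a * x + b * y, z),
                  a * inner(x, z) + b * inner(y, z))           # linear in first slot

# Representing a functional by a vector: with c_i = x*(e_i),
# choosing y_i = conj(c_i) gives (x|y) = x*(x) for every x.
c = rng.normal(size=3) + 1j * rng.normal(size=3)
x_star = lambda v: np.sum(c * v)
rep = np.conj(c)
assert np.isclose(inner(x, rep), x_star(x))
```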
 
