# Natural isomorphism of VxV* and End(V)

In summary: $V^*\otimes W$ admits a natural map to $\operatorname{Hom}(V,W)$, which is an isomorphism of functors of two variables when the spaces are finite dimensional.

Don Aman
I'm looking for help constructing the natural isomorphism between $V\otimes V^*$ and $\operatorname{End}(V)$, with V a vector space.

So far, I think I should have functors F and G which take $V \mapsto V\otimes V^*$ and $V \mapsto \operatorname{End}(V)$. I'm having a little trouble figuring out how the functors should act on morphisms though. For example, the only sensible thing that I can get F(f) to be is the morphism $v\otimes \sigma \mapsto f(v)\otimes (f^{-1})^*\sigma$. Only, here I have to assume that f is invertible, which I don't want. The functor should be defined for all morphisms, right?

thanks
-Don

Firstly this is only true for finite dimensional vector spaces.

You do not need a functor - you need an isomorphism.

Try sending the element $v\otimes \theta$ to an endomorphism of V, such as g, where $g(w) = \theta(w)v$. Now try some dimension counting.
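A quick numerical sanity check of this map (my own numpy sketch, not from the thread): identifying $\theta$ with a row vector, the endomorphism g is just the outer product $v\theta^{\mathsf T}$, and the dimension count gives $\dim(V\otimes V^*) = n^2 = \dim\operatorname{End}(V)$.

```python
import numpy as np

# The map v ⊗ θ ↦ (w ↦ θ(w) v).  Representing the functional θ by a
# row vector, the resulting endomorphism is the rank-one matrix v θᵀ.
def simple_tensor_to_endo(v, theta):
    return np.outer(v, theta)

v = np.array([1.0, 2.0, 3.0])
theta = np.array([4.0, 0.0, -1.0])
g = simple_tensor_to_endo(v, theta)

w = np.array([2.0, 1.0, 1.0])
# g(w) should equal θ(w) · v
assert np.allclose(g @ w, theta.dot(w) * v)

# Dimension count: both V ⊗ V* and End(V) have dimension n² when dim V = n.
n = len(v)
assert g.size == n * n
```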

There is a functor approach, but I don't think you should go down that road (tensor is adjoint with hom, plus Yoneda).

matt grime said:
Firstly this is only true for finite dimensional vector spaces.
right.

matt grime said:
You do not need a functor - you need an isomorphism.
OK, but I want to show that this is a natural isomorphism of functors.

matt grime said:
Try sending the element $v\otimes \theta$ to an endomorphism of V, such as g, where $g(w) = \theta(w)v$. Now try some dimension counting.
yep.

matt grime said:
There is a functor approach, but I don't think you should go down that road (tensor is adjoint with hom, plus Yoneda).
what's wrong with the functor approach? Why shouldn't I go down that road?

because it is unnecessarily complicated

to matt: thanks for your help

anyone else here willing to help me with my functors?

well, the other thing about the functor approach is that to show that there is a natural transformation one probably ought to start by showing that there is an isomorphism between the objects $V\otimes V^*$ and $\operatorname{End}(V)$.

To start on this approach without bothering to write down an obvious isomorphism, I would start by attempting to create a representable functor, and apply Yoneda to it.

The reason why I don't think this is a good idea is that duality and hom are only adjoint on one side for categories other than simple ones such as finite dimensional vector spaces (that is, if duality even exists), so a purely functor-theoretic approach may not work out very easily, and at some point you will implicitly use the isomorphism constructed above.

whether you want a functorial isomorphism or not, matt's advice is still correct. i.e. he is telling you how to write down an isomorphism between any two VALUES of the two functors. But that is exactly what an isomorphism of functors is:

i.e. an isomorphism of functors F,G means
1) that for every object V you give an isomorphism F(V)-->G(V).
2) THEN you show that these isomorphisms are natural with respect to changing the space V, i.e. that for every map V-->W, you have corresponding maps F(V)-->F(W), and G(V)-->G(W), and you show these commute with the maps F(V)-->G(V), and F(W)-->G(W) from part 1.

So in any case do what matt said first, i.e. find a natural looking map $V\otimes V^*\to\operatorname{Hom}(V,V)$, i.e. a bilinear map $V\times V^*\to\operatorname{Hom}(V,V)$.

later worry about the functorial aspect of it. oh by the way, as matt's adjointness remark implies, VtensV* is not really a functor of V, i.e. it is neither covariant nor contravariant. So you are going to have trouble with this problem.

I.e. the correct statement is probably that V*tensW admits a natural map to Hom(V,W), which is an isomorphism of functors of two variables, when the spaces are finite dimensional. But these functors of two variables are of opposite variance in each variable. of course there are artificial ways to hide that by working in the "opposite" category but they confuse me.

here is another fancy schmancy approach which matt is hinting at: the [linear] functor V*tens(W) = F(W) of W is characterized by three properties:
1) it is right exact.
2) it commutes with direct sums.
3) F(R) = V*

So if Hom(V,.) also has these properties, then it is isomorphic to the functor V*tens(.).
[This is an old theorem of Sammy Eilenberg and [independently] Charles Watts I believe, and whose proof is trivial, at least if you like this kind of thing.]

E.g. to show Hom(V,.) is right exact, let t:A-->B be a surjection. then we have to show that the map Hom(V,A)-->Hom(V,B) taking f to (tof) is also a surjection. But since V is a vector space, we can always lift any map out of it through a surjection.

The problem comes in asking whether Hom(V,.) commutes with direct sums, since it does not in general; rather, it commutes with direct products. That is why these functors actually are not the same in general.

Of course finite direct sums and finite direct products are the same thing, so maybe (?) this shows these functors agree on finite dimensional vector spaces, where only finite direct sums are allowed.

Finally, of course F(R) = Hom(V,R) = V*, so the last property is true.

Let's try this investigation a little differently: Let's ask whether the functor Hom(V*,W) = G(V) is isomorphic to F(V) = VtensW. I.e. property 2) asks if G commutes with direct sums. Again we have problems because the functor (.)* changes direct sums into direct products, although again they are ok for finite sums. is that ok??

Then we ask if G is right exact, i.e. given a surjection t:A-->B, and corresponding map

t*:B*-->A*, is the corresponding map Hom(A*,W)-->Hom(B*,W) taking f to (fot*) a surjection? Well t* is injective, so the point is whether every map out of the subspace B* extends to map out of the bigger space A*, which is ok in finite dimensions.

Finally we ask if G(R) is W? But G(R) = Hom(R*,W) is certainly W since R* = R.

So for me the very abstract functorial approach is iffy because of the unnatural restriction to the category of finite dimensional spaces, to which my store of general theorems on the category of modules would need modification.

I will say I enjoyed getting a less basic, hence more interesting, question. This seems to be sort of a first year grad algebra question, or upper undergrad, at a strong school, right? Anyway, thanks for the memories.

mathwonk said:
whether you want a functorial isomorphism or not, matt's advice is still correct. i.e. he is telling you how to write down an isomorphism between any two VALUES of the two functors. But that is exactly what an isomorphism of functors is:

i.e. an isomorphism of functors F,G means
1) that for every object V you give an isomorphism F(V)-->G(V).
yes. And that's the isomorphism matt gave me. I actually already knew the mapping.

2) THEN you show that these isomorphisms are natural with respect to changing the space V, i.e. that for every map V-->W, you have corresponding maps F(V)-->F(W), and G(V)-->G(W), and you show these commute with the maps F(V)-->G(V), and F(W)-->G(W) from part 1.
which is where I got stuck, since I could not construct the morphisms from F(V) to F(W) (nor those from G(V) to G(W)).

So in any case do what matt said first, i.e. find a natural looking map $V\otimes V^*\to\operatorname{Hom}(V,V)$, i.e. a bilinear map $V\times V^*\to\operatorname{Hom}(V,V)$.
Right. $u \mapsto \sigma(u)v$ is the mapping. As matt correctly stated.

later worry about the functorial aspect of it. oh by the way, as matt's adjointness remark implies, VtensV* is not really a functor of V, i.e. it is neither covariant nor contravariant. So you are going to have trouble with this problem.
Maybe this addresses my concern. Are these things even functors? Will we be able to satisfy F(fg) = F(f)F(g) if F is somehow "part covariant" and "part contravariant"? That's the trouble I was running into, trying to construct the functors.

I.e. the correct statement is probably that V*tensW admits a natural map to Hom(V,W), which is an isomorphism of functors of two variables, when the spaces are finite dimensional. But these functors of two variables are of opposite variance in each variable. of course there are artificial ways to hide that by working in the "opposite" category but they confuse me.
Hmm.. maybe that's what I need to do.

mathwonk said:
here is another fancy schmancy approach which matt is hinting at: the [linear] functor V*tens(W) = F(W) of W is characterized by three properties:
1) it is right exact.
2) it commutes with direct sums.
3) F(R) = V*

So if Hom(V,.) also has these properties, then it is isomorphic to the functor V*tens(.).
[This is an old theorem of Sammy Eilenberg and [independently] Charles Watts I believe, and whose proof is trivial, at least if you like this kind of thing.]
I'm not familiar with the theorem.

So for me the very abstract functorial approach is iffy because of the unnatural restriction to the category of finite dimensional spaces, to which my store of general theorems on the category of modules would need modification.
You know, if you dislike the restriction to finite dimensional spaces, I would be happy to just show a natural injection between functors, instead of a natural isomorphism. Which I think will exist in the arbitrary dimensional case, right?

I will say I enjoyed getting a less basic, hence more interesting, question.
I'm glad to hear you say it's not basic; that way I don't feel bad for not getting it right. I thought it was going to be a simple exercise in natural isomorphisms, but I think I was wrong.
This seems to be sort of a first year grad algebra question, or upper undergrad, at a strong school, right? Anyway, thanks for the memories.
yep.

It might help if you just take your head out of category land and put it back into linear algebra land -- sometimes just trying to refocus your brain helps immensely.

Your question seems to be that you need to find a way to relate End(V), End(W), and a linear map T:V-->W. I guess we know several ways to do this if T is invertible: for example, for S:V-->V, we have $T S T^{-1}:W \rightarrow W$. Or, for R:W-->W we have $T^{-1} R T:V \rightarrow V$...

T doesn't necessarily have to be an invertible map, though -- for example, an mxn matrix M (m < n) with full rank (i.e. m) will have a right inverse N such that MN = I. In fact, I think you can even go so far as to find a particular N such that MN = I and that NM is a diagonal matrix whose entries are m 1's and (n-m) 0's. N might even be uniquely determined by that criterion. Maybe you can replicate one of the above with these?
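This claim can be illustrated numerically. The sketch below is my own (not from the thread), using the Moore-Penrose pseudoinverse as one convenient choice of the right inverse N; it checks that MN = I and that NM is idempotent, i.e. a projection.

```python
import numpy as np

# A full-rank 2x3 matrix (m = 2 < n = 3).
M = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])

# The Moore-Penrose pseudoinverse is a right inverse when M has full row rank.
N = np.linalg.pinv(M)

assert np.allclose(M @ N, np.eye(2))   # MN = I
P = N @ M
assert np.allclose(P @ P, P)           # NM is idempotent, hence a projection
```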

I did not say it was not a "simple exercise" in natural isomorphisms, but simple is a relative term, and the concept of "natural isomorphisms" themselves, i.e. functors, are not as basic as most of what we see here. Actually probably no exercise in functorial isomorphism is simple, or at least not brief, but they all do follow the same pattern.

Well, actually I see that your problem is with the one aspect of the problem that is genuinely confusing, i.e. the fact that it is not true as stated, since the gadget is neither covariant nor contravariant in V. but i will indulge myself with some explanation anyway. please forgive me.

The fundamental rule to show things are isomorphic is to write down the most obvious map you can. then see if it is an isomorphism. after that the naturality part is usually automatic, if confusing.

As my superb teacher put it: "Write down the only map you can think of. If that isn't it, it takes a genius to come up with one. Then check that when you change spaces you get maps, and everything commutes. [thats naturality]. To show it is an isomorphism, try to write down a map in the other direction and do it all again, and show the two maps are inverses."

so here, we write down the obvious bilinear map V*xW-->Hom(V,W) taking (f,y) to the map taking x to f(x)y. This is bilinear hence induces a map V*tensW-->Hom(V,W). so we wrote down the most obvious maps.

Now to show it is an isomorphism is impossible unless we use some properties of finite dimensional spaces, and the most powerful one is the existence of bases, and the theory of dimension. I.e. there is not going to be a really natural map in the other direction. But for instance, both those spaces have the same dimension, so this map is an isomorphism if and only if it is surjective, or injective.

a basis for the target space is given by the maps g(i,j) taking xi to yj and all other xk to zero, where {xi} and {yj} are bases of V,W resp. (These are Hurkyl's matrices, having a 1 in only one entry.)

So try to show some element maps to this map g(i,j). Well, these things are always either trivial or impossible as my teacher said, so let's just close our eyes and write down the simplest possible element of V*tensW we can think of that involves xi and yj. hey, how about xi*(tens)yj? where xi* is the map taking xi to 1, and all other xk to zero.

Then the image map is the one taking v to xi*(v).yj, in particular taking xi to yj, and every other xk to 0.yj = 0. seems good. so the map is onto, hence an isomorphism. now we have checked it is an isomorphism in finite dimensions.
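This surjectivity computation is easy to verify numerically. Here is a small sketch of my own (not from the thread) checking that $x_i^*\otimes y_j$ lands on the matrix unit with a single 1 in position (j, i).

```python
import numpy as np

n, m = 3, 2   # dim V = 3, dim W = 2, with standard bases {x_k}, {y_l}
i, j = 1, 0   # pick the dual basis vector x_i* and the basis vector y_j

x_i_star = np.zeros(n); x_i_star[i] = 1.0   # x_i* : x_i ↦ 1, other x_k ↦ 0
y_j = np.zeros(m); y_j[j] = 1.0

# The map V*⊗W → Hom(V,W):  f ⊗ y  ↦  (x ↦ f(x)·y), i.e. the matrix y fᵀ.
g = np.outer(y_j, x_i_star)

# g should be the matrix unit: it sends x_i to y_j and every other x_k to 0.
E = np.zeros((m, n)); E[j, i] = 1.0
assert np.allclose(g, E)
```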

i realize you had no trouble with any of this.

now for the naturality part. this part has nothing to do with the finite dimensionality. this part is the "simple exercise" in natural maps.

so what happens if we "change spaces"? i.e. take any map f:V-->V'. then we get a natural map from V'*-->V*, and hence V'*(tens)W -->V*(tens)W, and a natural map Hom(V',W)-->Hom(V,W). Is that ok so far?

so we have checked that "when you change spaces, you get maps."

then these compose somehow or other. i.e. I guess we can compose

V'*(tens)W -->V*(tens)W-->Hom(V,W), and also

V'*(tens)W -->Hom(V',W)-->Hom(V,W).

the simple exercise is to check these two compositions are equal. i.e. we check that when you change spaces and get maps, that "everything commutes."

let me try one, so as not to be overly cavalier: say we start from the element

g(tens)w in V'*(tens)W, which goes to (gof)(tens)w in V*(tens)W.

Then it goes to the map taking v in V to g(f(v)).w.

Now in the other composition, we send g(tens)w to the map taking v' in V' to g(v').w, then to the map taking v in V to the image of f(v), i.e. sending v to

g(f(v)).w. this is the same result! (It always is in my experience. I.e. in this subject either it always checks out trivially like this, or you get stuck somewhere. It seldom comes out wrong.)
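The commuting square can also be checked numerically. In the sketch below (my own, with random data standing in for f, g, and w), path1 is "pull back along f first, then map to Hom(V,W)" and path2 is "map to Hom(V',W) first, then precompose with f".

```python
import numpy as np

rng = np.random.default_rng(0)
dimV, dimVp, dimW = 3, 4, 2

F = rng.standard_normal((dimVp, dimV))   # matrix of a map f : V → V'
g = rng.standard_normal(dimVp)           # g ∈ V'*, as a row vector
w = rng.standard_normal(dimW)            # w ∈ W

# Path 1: g⊗w ↦ (g∘f)⊗w ↦ the map v ↦ g(f(v))·w.
path1 = np.outer(w, g @ F)

# Path 2: g⊗w ↦ (v' ↦ g(v')·w), then precompose with f.
path2 = np.outer(w, g) @ F

assert np.allclose(path1, path2)         # the square commutes
```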

Now I suspect you had no trouble with this either and that your only question was the very valid one: "in what sense are these two guys functors of V?" and I say they are in fact not.

I.e. I say just do the exercise as a natural transformation of a pair of functors of two variables, with opposite variance in the two variables.

Of course that means you have to do the space - changing thing again by changing W instead of V.

And then if you look at it maybe, just maybe, you can interpret it somehow as a process of changing both spaces at once, but i doubt it in any meaningful sense.

Forgive me Hurkyl, I'm tired of matrices at the moment. As the great Emil Artin once wrote [roughly, in his book, Geometric Algebra]: "linear algebra should always be done insofar as possible without mentioning matrices. Proofs with matrices generally take twice as much space as those obtained by throwing the matrices out. Of course sometimes they cannot be dispensed with, e.g. one may need to compute a determinant."

Of course you weren't really using matrices, just the word "matrix", but any construction that requires the maps to be invertible is seldom "natural" in the sense of categories.

It just seemed more convenient to say that NM was a matrix whose diagonal entries are 1 and 0 than to say something like that NM was the identity on a subspace and the zero transformation on a complementary subspace... or that your basis elements were eigenvectors with eigenvalues 1 and 0.

Now that I think of it, though, I could just say NM is a projection. That would do it. Ah well.

mathwonk said:
Now I suspect you had no trouble with this either and that your only question was the very valid one: "in what sense are these two guys functors of V?" and I say they are in fact not.

I.e. I say just do the exercise as a natural transformation of a pair of functors of two variables, with opposite variance in the two variables.
OK. I see my mistake now. I think it will be easy to show naturality in the two arguments separately. Thank you.

And then if you look at it maybe, just maybe, you can interpret it somehow as a process of changing both spaces at once, but i doubt it in any meaningful sense.
I guess I would like to know if there is a way we can treat the two arguments in one go, maybe I'll poke around in my textbook a bit some time, but for now, I'm happy with the answer you've given. Thanks again.

what textbook are you using, if i may ask?

By the way here is another cute little approach to characterizing the functors Hom(.,W), taken from the class notes I handed out in my graduate algebra class. [Note to the guys in the "academic advice to students" thread: here is another little gem that the student who skips class would never have learned to this day, in all likelihood.]

Theorem: If F is any linear functor which turns cokernels into kernels, and sums into products, and such that F(R) = W, then F is isomorphic as a functor to Hom(.,W).

[the proof in my lecture notes, without checking all details is only a few lines long.]

In the case of F(V) = V*(tens)W, we have a linear functor such that F(R) = W, and again for finite dimensional V and finite direct sums, the other two properties hold as well, so presumably the proof shows that in this case F is isomorphic to Hom(.,W) on finite dimensional spaces.

By the way I could be motivated at some point to put this 60 page section of my notes, entitled "Hom, Duality and representable functors, tensor products and alternating products", on my website.

As to treating both arguments "at one go", you cannot, no matter how you stack it. You can pretend to, in the opposite category, by defining a map from V-->W to be a map from W to V, but that is just changing the names.

As my calc professor put it, "when faced with two things to check, check one, then the other - do not be like the ass between two bales of hay."

mathwonk said:
what textbook are you using, if i may ask?
We're using Jacobson Basic Algebra I, but most of the category theory is just lecture notes. I also have the second volume, which has a chapter on category theory.

As my calc professor put it, "when faced with two things to check, check one, then the other - do not be like the ass between two bales of hay."
I'm an ass!?

That's the way I felt too, when he said that. my apologies. In the old days, professors were more blunt than is considered politically correct today.

Some of my questions e.g. were greeted with "But that's the stupid way of looking at it!". How often has that been heard in a classroom in the last 20 years, no matter what the poser?

Jacobson is a great expert and his book is superb, but the category theoretic statement you were asked to prove seems actually wrong, i.e. End(V) is apparently not really a functor of V, so the notes you are using may be a bit flawed.

If you wish another source, (independently written hence presumably with different mistakes) I have just finished scanning my 1996 class notes on Hom, Duality, and Representable Functors, and can send them as a large pdf file if desired, to an email address, if one is provided.

(In case I forget later, on page 41 of my notes, Z^n(tens)T should have been Z^n(plus)T)

As an exercise, see if you can prove this:

Theorem: If F is any linear functor which turns cokernels into kernels, and sums into products, and such that F(R) = R, then F is isomorphic as a functor to Hom(.,R), (the dual functor taking V to V^*).

mathwonk said:
Jacobson is a great expert and his book is superb, but the category theoretic statement you were asked to prove seems actually wrong, i.e. End(V) is apparently not really a functor of V, so the notes you are using may be a bit flawed.
Oh, the mistake is all mine, not Jacobson's.

If you wish another source, (independently written hence presumably with different mistakes) I have just finished scanning my 1996 class notes on Hom, Duality, and Representable Functors, and can send them as a large pdf file if desired, to an email address, if one is provided.

(In case I forget later, on page 41 of my notes, Z^n(tens)T should have been Z^n(plus)T)
um sure, I'd like to take a look. I'll PM you my email address

mathwonk said:
As an exercise, see if you can prove this:

Theorem: If F is any linear functor which turns cokernels into kernels, and sums into products, and such that F(R) = R, then F is isomorphic as a functor to Hom(.,R), (the dual functor taking V to V^*).
I'm working on this problem. I have to bone up on some of the concepts first, but it looks like a nice problem. Give me some time?

mathwonk said:
As an exercise, see if you can prove this:

Theorem: If F is any linear functor which turns cokernels into kernels, and sums into products, and such that F(R) = R, then F is isomorphic as a functor to Hom(.,R), (the dual functor taking V to V^*).

I sort of forgot about this problem you gave me a while back, what with finals and stuff. I wanted to come back to it, but I didn't feel equipped enough, so I'm reading through your notes, which were quite enjoyable by the way, and I think I just got to the page where this result is proved. I guess I'll stop reading so I can think about it or try it.

I've also been thinking about another problem you assigned in those notes, where you ask the reader to specify a basis for $\operatorname{Hom}(\mathbb{Z}_2^\omega,\mathbb{Z}_2)$.

I think I've decided that the space doesn't have a constructible basis; we can only prove its existence by invoking AC, we can't actually write down the basis. Is that right?

i never said i could do those exercises!

## 1. What is natural isomorphism?

Natural isomorphism is a concept in mathematics that refers to an isomorphism between two structures that is compatible with the underlying structure of the objects, rather than being defined by an arbitrary choice of coordinates or bases.

## 2. What does VxV* represent in the context of natural isomorphism?

VxV* represents the tensor product of a vector space V and its dual space V*, where V* is defined as the set of all linear maps from V to its field of scalars.

## 3. How is End(V) related to natural isomorphism?

End(V) is the set of all endomorphisms of a vector space V, which are linear transformations from V to itself. When V is finite dimensional, End(V) is naturally isomorphic to the tensor product of V with its dual: the simple tensor $v\otimes\sigma$ corresponds to the rank-one endomorphism $w \mapsto \sigma(w)v$.

## 4. Why is natural isomorphism important in mathematics?

Natural isomorphism is important because it allows for the comparison and connection of different mathematical structures, providing a deeper understanding of their underlying relationships and properties.

## 5. How is natural isomorphism applied in other fields of science?

Natural isomorphism has applications in various fields of science, such as physics, computer science, and biology. In physics, it is used to describe the symmetries and conservation laws of physical systems. In computer science, it is used in programming language design and type theory. In biology, it is used to study the structure and function of biomolecules.
