# Example of an algebra tensor product

1. Dec 4, 2015

### Geofleur

On pages 67 & 68 of Hassani's mathematical physics book, he gives the following definition:

"Let $\mathcal{A}$ and $\mathcal{B}$ be algebras. Then the vector space tensor product $\mathcal{A} \otimes \mathcal{B}$ becomes an algebra tensor product if we define the product

$(\mathbf{a}_1 \otimes \mathbf{b}_1)(\mathbf{a}_2\otimes\mathbf{b_2})=\mathbf{a}_1\mathbf{a}_2\otimes\mathbf{b}_1\mathbf{b}_2$

on $\mathcal{A}\otimes\mathcal{B}$."

He goes on to say that, because the spaces $\mathcal{A}\otimes\mathcal{B}$ and $\mathcal{B}\otimes\mathcal{A}$ are isomorphic, we require that $\mathbf{a}\otimes\mathbf{b} = \mathbf{b}\otimes\mathbf{a}$ for all $\mathbf{a}\in\mathcal{A}$ and $\mathbf{b}\in\mathcal{B}$.

So far, so good. But then he goes on to say that this last requirement is important when an algebra $\mathcal{A}$ is written as the tensor product of two of its subalgebras $\mathcal{B}$ and $\mathcal{C}$; also, that $\otimes$ in such a case is identified with the multiplication in $\mathcal{A}$.

I have been trying for a week now to come up with an example to help me make sense of these remarks. Firstly, does anyone know of an example of an algebra that can be written as the tensor product of two of its own subalgebras? Secondly, I have always thought that $\mathbf{a}\otimes\mathbf{b}$ is just shorthand for an element, $(|a\rangle, |b\rangle)$, of the Cartesian product of the underlying vector spaces. So how can $\otimes$ be the same as the multiplication in $\mathcal{A}$?

2. Dec 4, 2015

### Staff: Mentor

That's not an easy task. The tensor product is a kind of universal construction. (Please don't urge me to prove it in the strict homological sense; I think it is true even there, but I'm not sure.) To make it useful one usually factors something out. Even your example is, strictly speaking, a quotient: $\mathcal{A} \otimes \mathcal{B} / \langle a \otimes b - b \otimes a \rangle$. So what comes to mind for your first question is a trivial example: the span of all tensor products of two vectors is the whole matrix algebra, and these products can also be embedded in it. The tensor product of just two vectors can be seen as the matrix product of a column vector with a row vector (or vice versa). (I remember we had an example in a previous thread where the quotient represented Pauli's exclusion principle; I just forget whether it was a tensor algebra or a Graßmann algebra, to which similar universality properties apply.)
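A minimal numerical sketch of that remark (in NumPy; the specific vectors are arbitrary choices for illustration): the outer product of two single vectors is a rank-one matrix, and the outer products of the standard basis vectors give the matrix units, which span the whole $2\times 2$ matrix algebra.

```python
import numpy as np

# Outer product of a column vector with a row vector: a rank-1 matrix.
u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])
M = np.outer(u, v)                      # the 2x2 matrix u v^T
assert np.linalg.matrix_rank(M) == 1

# The outer products e_i e_j^T of standard basis vectors are the
# matrix units E_ij, which span the entire 2x2 matrix algebra.
E = [np.outer(np.eye(2)[i], np.eye(2)[j]) for i in range(2) for j in range(2)]
flat = np.array([e.flatten() for e in E])
assert np.linalg.matrix_rank(flat) == 4  # 4 = dim of the 2x2 matrices
```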
To your second question: a tensor product is bilinear (think matrix multiplication), whereas the Cartesian product is by definition just a concatenation of components, so, yes, there is a difference.
The tensor product $\otimes$ can be the multiplication in an algebra, e.g. if there is no other multiplication, either because none is defined or because none is being considered. In the case $\mathcal{A} = \mathcal{B} \otimes \mathcal{C}$ for subalgebras of $\mathcal{A}$, it looks natural to consider the multiplication defined by the tensor product.

I'm sure you wanted to hear something more exciting. But excitement comes through quotients or subalgebras.

3. Dec 5, 2015

### lavinia

Let $P[X,Y]$ be the ring of polynomials in the commuting indeterminates $X$ and $Y$ over a commutative ring $R$ (with unit). $P[X,Y]$ is an $R$-algebra under polynomial multiplication. The $R$-algebras $P[X]$ and $P[Y]$ are subalgebras.

Let $H: P[X] \otimes P[Y] \rightarrow P[X,Y]$ be the $R$-algebra homomorphism $H(p(X) \otimes q(Y)) = p(X)q(Y)$. $H$ is clearly injective. $H$ is onto since every polynomial in $P[X,Y]$ can be written as a product of the form $p(X)q(Y)$.
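A small symbolic sketch of the map $H$ (in sympy; the particular polynomials are arbitrary). On elementary tensors, $H$ just multiplies the two factors, and it respects the algebra product $(\mathbf{a}_1 \otimes \mathbf{b}_1)(\mathbf{a}_2 \otimes \mathbf{b}_2) = \mathbf{a}_1\mathbf{a}_2 \otimes \mathbf{b}_1\mathbf{b}_2$ defined earlier in the thread.

```python
from sympy import symbols, expand

X, Y = symbols('X Y')

def H(p, q):
    # Image of the elementary tensor p(X) ⊗ q(Y): the product p(X) q(Y).
    return expand(p * q)

p1, q1 = 1 + 2*X + X**2, 3*Y - Y**3
p2, q2 = X, Y

# (p1 ⊗ q1)(p2 ⊗ q2) = p1 p2 ⊗ q1 q2, and H of that elementary tensor
# equals the product of the images H(p1 ⊗ q1) H(p2 ⊗ q2).
lhs = expand(H(p1, q1) * H(p2, q2))
rhs = H(expand(p1 * p2), expand(q1 * q2))
assert lhs == rhs
```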

In the Cartesian product of vector spaces, $(rV,W)$ and $(V,rW)$ are different. But $(rV)⊗_{R}W$ and $V⊗_{R}rW$ are the same.

Last edited: Dec 5, 2015
4. Dec 5, 2015

### Geofleur

@lavinia: Let's see if I understand correctly what you said. If we take the commutative ring, $R$, to be $\mathbb{Z}$, then $P[X,Y]$ has elements such as $1 + 2XY + X^2 + 5Y^3 = 1 + 2YX + X^2 + 5Y^3$. If this is right, then it's clear to me that $P[X]$ and $P[Y]$ are subalgebras of $P[X,Y]$. When you say that every polynomial in $P[X,Y]$ can be written as a product $p(X)q(Y)$, does that mean, e.g., that $p(X,Y) = X^2 + 2XY + Y^3$ can be written as a product of a polynomial in $X$ times one in $Y$? Or were you just saying that each individual term, such as $2XY$, can be written in this way?

I am not sure I know what the $rV$ in $(rV,W)$ means. If the tensor product in the polynomial example just amounts to multiplication of polynomials in the commuting indeterminates $X$ and $Y$, I think I see how the order in the tensor product would not matter. The homomorphism $H$ converts the tensor products into multiplications, so that reversing the order in the product has no effect on the result.

5. Dec 5, 2015

### Samy_A

I don't understand this either. Maybe it should be: "every polynomial in $P[X,Y]$ can be written as a sum of products of the form $p(X)q(Y)$"?

6. Dec 5, 2015

### lavinia

You are right. What I said is wrong. What is correct is that every polynomial can be written as a sum, $p(X,Y) = \sum_{m,n} r_{m,n} X^m Y^n$, so $H\left(\sum_{m,n} r_{m,n}\, X^m \otimes Y^n\right) = p$.
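The corrected statement can be checked symbolically (a sketch in sympy, using the polynomial from earlier in the thread): extract the coefficients $r_{m,n}$ of each monomial $X^m Y^n$, and verify that summing them back recovers the polynomial, which is exactly why $H$ is onto.

```python
from sympy import symbols, Poly

X, Y = symbols('X Y')

# Any p in P[X, Y] is a finite sum  Σ r_{m,n} X^m Y^n,  so p is the
# image under H of  Σ r_{m,n} (X^m ⊗ Y^n):  H is onto.
p = X**2 + 2*X*Y + Y**3
terms = Poly(p, X, Y).terms()            # [((m, n), r_{m,n}), ...]

# Reassembling the monomials from the coefficient data recovers p exactly.
assert (sum(r * X**m * Y**n for (m, n), r in terms) - p).expand() == 0
```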

7. Dec 5, 2015

### lavinia

Yes. My error.

8. Dec 5, 2015

### lavinia

$(rV,W)$ and $(V,rW)$ are not in the tensor product but in the Cartesian product. I was trying to illustrate the difference between the Cartesian product and the tensor product. Here $r$ is an element of the ring $R$.

9. Dec 5, 2015

### Geofleur

Ah - I understand!

10. Dec 5, 2015

### Staff: Mentor

This is especially important when it comes to coordinates. It means that, e.g. for multiples of $1$ (the center of the matrix algebra), you cannot say which factor the scalar belongs to. It can move freely from one factor to the other, or "outside". In a Cartesian product you have to decide.

11. Dec 6, 2015

### mathwonk

In algebraic geometry one sets up an equivalence of categories between affine algebraic varieties and certain commutative rings, or between abstract affine schemes and all commutative rings. This correspondence is arrow-reversing: if $X$, $Y$ are the varieties whose rings of functions are $R$ and $S$ respectively, then a geometric map $X \to Y$ corresponds to a ring map $S \to R$, by pulling back polynomial functions on $Y$ to polynomial functions on $X$.

Notice that the universal property of the tensor product $R \otimes S$ is that a ring map out of $R \otimes S$ is determined uniquely by a pair of ring maps, one out of $R$ and one out of $S$. Reversing the arrows, if $Z$ is the variety corresponding to the tensor product ring, we see the corresponding geometric maps determine a unique map into $Z$ for every pair of maps, one into $X$ and one into $Y$. Does that sound familiar? That is the mapping property of a geometric product space. Thus $Z$ must be the product space, $Z \cong X \times Y$.

I.e. the tensor product of the coordinate rings of two affine algebraic varieties is the coordinate ring of the product of those varieties. Since the coordinate ring of affine $n$-space is the polynomial ring in $n$ variables, and the product of the affine spaces $k^n$ and $k^m$ is $k^{n+m}$, a special case of this is the fact that the tensor product of polynomial rings in $n$ and $m$ variables is just the polynomial ring in $n+m$ variables.
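The dimension bookkeeping behind that special case can be sanity-checked numerically (a sketch; the degree bound and variable counts are arbitrary choices): the number of monomials of total degree $d$ in $n+m$ variables equals the sum, over all ways of splitting $d$ between the two tensor factors, of the products of the corresponding monomial counts.

```python
from math import comb

def monomials(num_vars, deg):
    """Number of monomials of total degree `deg` in `num_vars` variables."""
    return comb(num_vars + deg - 1, deg)

# Graded dimension count for k[x_1..x_n] ⊗ k[y_1..y_m] ≅ k[x_1..x_n, y_1..y_m]:
# the degree-d piece of the (n+m)-variable ring matches the sum over ways
# to split the degree between the two tensor factors.
n, m = 2, 3
for d in range(6):
    lhs = monomials(n + m, d)
    rhs = sum(monomials(n, i) * monomials(m, d - i) for i in range(d + 1))
    assert lhs == rhs
```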

So to answer the request for insight into when an algebra $A$ is the tensor product of two subalgebras $B$ and $C$: in the commutative case this occurs precisely when the affine variety determined by $A$ is the geometric product of two of its affine subvarieties.

Here is a writeup of a related exercise in Hartshorne:

http://sierra.nmsu.edu/morandi/oldwebpages/Math683Fall2013/Product.pdf

The following notes are more detailed; see especially pages 12-15:

https://www.math.upenn.edu/~siegelch/Notes/ag.pdf

Note further that if $A$ and $B$ are algebras over a third ring $C$, then the tensor product over $C$ is a "fibered product", i.e. a product in which all the varieties concerned are equipped with a map into the variety of $C$, and we take a product of these maps. In the elementary case we assumed the ring $C$ is a field, hence its variety is a point, so nothing more is obtained from this additional structure.

Last edited: Dec 6, 2015
12. Dec 7, 2015

### Geofleur

Your post is forcing me to learn some algebraic geometry, which I view as a good thing! I will see if I can come up with a good concrete example that illustrates what you said :-)

EDIT: I'm going to have to work harder than I initially thought, but I'm determined

Last edited: Dec 7, 2015
13. Dec 7, 2015

### Geofleur

OK, I ordered a copy of Miles Reid's book. Maybe I'll just start a new thread once I think I've got something - this is gonna take a while.

14. Dec 8, 2015

### mathwonk

Miles Reid's Undergraduate Algebraic Geometry is a good book, but it will not give you this categorical notion of affine products in relation to tensor products. The first 15 pages of the second link I gave above will, though: the notes from Charles Siegel at UPenn, based on the terse little book Algebraic Varieties by George Kempf.
https://www.amazon.com/Algebraic-Va...-1&keywords=george+kempf,+algebraic+varieties

I first learned it from Mumford's "red book" of algebraic varieties.

https://www.amazon.com/Red-Book-Var...1449594259&sr=1-1&keywords=mumford's+red+book

Here's a cheaper used copy, but maybe without the nice little addition of his lectures on curves from Michigan:

http://www.abebooks.com/servlet/SearchResults?an=david+mumford&sts=t&tn=red+book

15. Dec 8, 2015

### Staff: Mentor

Every time I have a look in mine, I find it rather "categorical", more like a textbook on local rings. Apart from Strassen's work on algorithmic manifolds, I've never seen an application of it. Can you give me some hints on where this theory is of any use?

16. Dec 8, 2015

### mathwonk

Well, what I meant by not categorical was that Miles Reid treats varieties and products as embedded, not in terms of their mapping properties as Mumford does. Miles's book is a mixture of abstract and concrete approaches, as he says in his "woffle", and he does discuss the category of affine varieties: e.g. he discusses the static relationship between subvarieties of affine space and ideals of polynomial rings early on, and on page 69 he gives the dynamic correspondence between maps in the two categories, varieties and algebras. But since he does not assume a knowledge of tensor products on the part of the reader, he cannot give the categorical treatment of products using them.

As to what algebraic geometry is useful for, it depends on who you are and what uses interest you. Reid refers to applications in number theory, computer algebra, string theory, and so on, and I for instance just used it to answer the OP's question about the structure of tensor product algebras. When I lectured on Jacobian varieties at the International Centre for Theoretical Physics in 1987, I understood the physicists there intended to use what they learned in string theory. The first use of varieties Reid refers to in his woffle that caught my eye is at the bottom of page 4, line -5, where he says we can have quite a lot of fun with them. I am not familiar with Strassen's use of it in algorithmic manifolds, if that is the area of interest to you. Is this the algorithm for fast matrix multiplication? Or an algorithm for classifying topological manifolds? In my experience, one can find a use for a subject one is aware of, but not for one of which one is ignorant. I at least have found that to be the case when someone more sophisticated or better educated than me used something unfamiliar to me to solve a problem I was working on unsuccessfully. I.e. as a general rule, one is advised to learn a bit of the topics generally thought to be important even before knowing how one will apply them, at least if one wants to be early in the hunt for new results. But the main use for me personally is to understand nature and enjoy the intellectual process.

At bottom, algebraic geometry teaches one how to apply geometric intuition to algebra (as in the answer to the OP's question), and in the other direction, how to use algebra to make geometric intuition precise.

If you are interested in codes, this paper may interest you:
http://www.win.tue.nl/~ruudp/paper/14.pdf

Last edited: Dec 8, 2015
17. Dec 8, 2015

### Staff: Mentor

As far as I remember, he used it to improve the exponent $ω$ in the runtime estimate $O(n^ω)$ for matrix multiplication. Instead of searching for single algorithms (as within a closed subset) he considered their generic (open sets) behavior. But it's too long ago for me to remember details. Mainly I remember my astonishment at how he managed to get something very concrete out of a concept which in the beginning looked purely mathematical and abstract.

A bracingly optimistic point of view. E.g. I once found a funny invariant on Lie algebras, unfortunately $\{0\}$ for the semisimple ones, but I'm not bright enough to make something out of it. (Ignore that, please, if I have already breached the rules with that remark.)

I totally agree. The reason for my question was the desire to narrow the gap between those coordinate rings and physical coordinate systems. My textbook doesn't supply many examples, mostly ones of the kind $V(x^2 - y^3)$ in $ℂ^2$, or derivations as tangent spaces.

18. Dec 8, 2015

### mathwonk

I find it quite hard to extract much geometry from commutative ring theory myself, and am still not at all expert on that topic, in spite of being a functioning (retired) algebraic geometer. I got most of my mileage out of analytic and purely geometric tools, but kept trying to fathom the abstract algebra. The first book that taught algebraic geometry with a good grip on the geometry was, for me, Shafarevich's Basic Algebraic Geometry, which I still like. It's not all commutative algebra and sheaf theory, but has lots of geometric insight and examples. I recommend taking a look there.

When I teach it I start by giving my class a feel for the topology of a complex plane curve in terms of its algebraic equation. The basic fact is that the genus of the compact surface underlying a complex projective plane curve of degree $d$ equals $\frac{1}{2}(d-1)(d-2)$, which of course is the triangular number $1+2+3+\cdots+(d-2)$.

The reason for this is that the genus of a surface is constant under deformation, so we deform our curve into a union of $d$ general lines, which meet in such a way as to form that number of "holes". I.e. each "line" is homeomorphic to a sphere, so we have a certain number of spheres meeting each other, which can be smoothed into a smooth surface of the appropriate genus, and which arose conversely from such a smooth surface by pinching off a certain number of loops in the surface. I have some notes in pdf form I could email if desired. Or one could consult the beautiful and extremely careful and detailed book Algebraic Plane Curves, by Brieskorn and Knörrer. Another lovely book is Riemann Surfaces and Algebraic Curves by Rick Miranda. On one especially important topic, there are notes on my webpage on the classical, and some more modern, versions of the Riemann-Roch theorem:
http://alpha.math.uga.edu/~roy/rrt.pdf
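The genus formula above, and its identity with the triangular numbers, is easy to check in a few lines (a sketch; the degree range is an arbitrary choice):

```python
# Genus of a smooth complex projective plane curve of degree d:
# g = (d-1)(d-2)/2, the triangular number 1 + 2 + ... + (d-2).
def genus(d):
    return (d - 1) * (d - 2) // 2

for d in range(2, 10):
    # Compare the closed form against the explicit sum 1 + 2 + ... + (d-2).
    assert genus(d) == sum(range(1, d - 1))

# Lines and conics are spheres (g = 0); cubics are tori (g = 1).
assert genus(1) == 0 and genus(2) == 0 and genus(3) == 1
```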

Another good place to learn to combine polynomial algebra and geometry is the theory of toric varieties, discussed in a nice book by William Fulton, and in a somewhat scarce and now rather expensive book by Oda, Convex Bodies and Algebraic Geometry. (One used book seller in Maine, who I think ought to be ashamed, is asking over \$550 for a copy.) So one should probably only consult this last work in a library.

Last edited: Dec 8, 2015