Norm Satisfying the Parallelogram Law

jbunniii
Let V be a vector space over the complex field.

If V has an inner product <\cdot,\cdot>, and ||\cdot|| is the induced norm, then it's easy to show that the norm must satisfy the parallelogram law, to wit:

||x+y||^2 + ||x-y||^2 = 2||x||^2 + 2||y||^2
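(For completeness, this is just a matter of expanding both squared norms as inner products:

||x+y||^2 + ||x-y||^2 = <x+y,x+y> + <x-y,x-y> = 2<x,x> + 2<y,y> = 2||x||^2 + 2||y||^2

since the cross terms <x,y> + <y,x> from the first summand cancel against -<x,y> - <y,x> from the second.)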

Much more interestingly, given an arbitrary norm on V, there exists an inner product that induces that norm IF AND ONLY IF the norm satisfies the parallelogram law.

Horn and Johnson's "Matrix Analysis" contains a proof of the "IF" part, which is trickier than one might expect. I'm trying to produce a simpler proof.

It's not too hard to show that if there exists an inner product <\cdot,\cdot> on V which induces the norm ||\cdot||, then the two must be related by the polarization identity:

<u,v> = \frac{1}{4}\left(||u+v||^2 - ||u-v||^2 + i||u+iv||^2 - i||u-iv||^2\right)
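(Here and below I'm taking the inner product to be linear in its first argument and conjugate-linear in its second; with that convention, a quick sketch of the identity:

||u+v||^2 - ||u-v||^2 = 2\left(<u,v> + <v,u>\right) = 4\,\mathrm{Re}<u,v>

||u+iv||^2 - ||u-iv||^2 = -2i\left(<u,v> - <v,u>\right) = 4\,\mathrm{Im}<u,v>

so the right-hand side above equals \mathrm{Re}<u,v> + i\,\mathrm{Im}<u,v> = <u,v>.)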

So, to solve my problem, I assume that V is equipped with a norm that satisfies the parallelogram law, and then I define

p(u,v) = \frac{1}{4}\left(||u+v||^2 - ||u-v||^2 + i||u+iv||^2 - i||u-iv||^2\right)

and my goal is to show that p satisfies all the requirements of an inner product:

(1) p(u,u) \geq 0 for all u \in V, with equality iff u = 0
(2) p(u+v,w) = p(u,w) + p(v,w) for all u,v,w \in V
(3) p(\alpha v, w) = \alpha p(v,w) for all \alpha \in \mathbb{C}, and all v,w \in V
(4) p(v,w) = \overline{p(w,v)}

It's easy to show that (1) and (4) are true. With a bit of crafty manipulation, (2) is also not too hard.
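(For the record, the check for (1) is immediate from the definition:

p(u,u) = \frac{1}{4}\left(||2u||^2 - 0 + i||(1+i)u||^2 - i||(1-i)u||^2\right) = \frac{1}{4}\left(4||u||^2 + 2i||u||^2 - 2i||u||^2\right) = ||u||^2

which is nonnegative and vanishes iff u = 0 by the norm axioms; (4) follows similarly from the observations ||v+iw|| = ||w-iv|| and ||v-iw|| = ||w+iv||.)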

Horn and Johnson use (2) to establish (3) first for integer values of \alpha and then for rational values. Then they resort to a limiting argument to show that (3) must also hold for all real (and then complex) \alpha.
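In outline, as I understand their argument:

(a) From (2) and induction, p(nv,w) = n\,p(v,w) for positive integers n; the definition also gives p(0,w) = 0 and p(-v,w) = -p(v,w), so this holds for all integers.
(b) Writing v = n(v/n) then gives p(v/n,w) = \frac{1}{n}p(v,w), hence p(qv,w) = q\,p(v,w) for all rational q.
(c) For real \alpha, take rationals q_k \to \alpha; continuity of the norm makes t \mapsto p(tv,w) continuous, so p(\alpha v,w) = \alpha\,p(v,w).
(d) Finally, p(iv,w) = i\,p(v,w) comes straight from the definition, using ||iv \pm w|| = ||v \mp iw|| and ||iv \pm iw|| = ||v \pm w||; combining (c) and (d) handles all complex \alpha.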

I suspect that there is a more elementary way to obtain (3) directly from the polarization identity, the parallelogram law, and the norm axioms. I suspect this in part because this problem shows up as an exercise in Axler's "Linear Algebra Done Right," and if the Horn and Johnson solution really is the simplest, then that means this exercise is at least an order of magnitude harder than most of Axler's other exercises.

But I've been banging my head against the problem of trying to find such an elementary proof of (3), without success so far.

I have a hunch that there's a clever change of variables that will convert the parallelogram law

||x+y||^2 + ||x-y||^2 = 2||x||^2 + 2||y||^2

into a form that expresses

||x+y||^2 - ||x-y||^2 (note the sign change)

in a nice way that will allow me to "pull out" the \alpha from the various terms of the polarization identity:

p(\alpha x, y) = \frac{1}{4}\left(||\alpha x + y||^2 - ||\alpha x - y||^2 + i ||\alpha x + iy||^2 - i ||\alpha x - iy||^2\right)

but I haven't been able to find a trick that works. Any ideas?
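For what it's worth, the cleanest "sign-changed" identity I can extract from the parallelogram law comes from applying it to the pairs (x+y, z) and (x-y, z) and subtracting:

||x+y+z||^2 + ||x+y-z||^2 - ||x-y+z||^2 - ||x-y-z||^2 = 2\left(||x+y||^2 - ||x-y||^2\right)

but that seems to lead back toward additivity (2) rather than letting me pull a scalar \alpha out of the terms in (3).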

P.S. Using Google Books, I found another book, "Elements of Operator Theory," by Carlos S. Kubrusly, which proves it the same way as Horn and Johnson, so maybe that really IS the simplest way?
 
I know this is an old thread, but I happened to come across it and saw no answers.

The method you described is indeed the well-known one. When I did this exercise in Axler, I too thought it was way out of place in the text: pretty much all of the exercises there amount to writing out definitions or applying theorems directly, while this one needs non-trivial facts from analysis (continuity of the norm, density of the rationals in the reals).

You're probably going to have to use some analytic argument somewhere, although I know nothing about the possibility of even defining norms/inner products over fields other than R or C; at least an ordering is needed. Someone at MathOverflow asked the same question as you. Maybe you'll like one of the two proofs described there:
The first is your proof, and the second involves first proving that for fixed u and v, ||u + tv||^2 is a degree-2 polynomial in t (this is where continuity is used, together with arithmetic sequences). This is followed by an algebraic manipulation showing that the linear term of the polynomial is an inner product.
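Roughly, the first step of that second proof goes like this (sketching from memory): fix u, v and set g(t) = ||u+tv||^2 for real t. The parallelogram law applied to u+tv and sv gives

g(t+s) + g(t-s) - 2g(t) = 2s^2||v||^2

so the second difference of g is independent of t. Subtracting ||v||^2 t^2 leaves a continuous function satisfying h(t+s) + h(t-s) = 2h(t), which forces it to be affine; hence g(t) = ||v||^2 t^2 + bt + c with c = g(0) = ||u||^2, and the linear coefficient b is where the inner product comes from.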
 