1. Jun 20, 2009

### evilpostingmong

Inner products don't make much sense to me (I mean, even the basic properties don't). If
<x+y, z> is <x,z>+<y,z>, what is the purpose of doing this?
I'm almost completely clueless about inner products.

2. Jun 20, 2009

### Phrak

I'm not a fan of that notation. Are you familiar with upper and lower indices?

3. Jun 20, 2009

### tiny-tim

Hi evilpostingmong!

It's the same rule as for dot-products of ordinary 3D vectors …

(a + b).c = a.c + b.c

4. Jun 20, 2009

### evilpostingmong

No, but if it would help, I'm willing to hear about them, since I'm really stuck.
Don't mean to sound pushy, just in a desperate mood.

5. Jun 20, 2009

### HallsofIvy

What they are saying is that the inner product is linear in the first variable. Don't you think "linear" is an important property in Linear algebra?

6. Jun 20, 2009

### evilpostingmong

Yes, it is important.

7. Jun 20, 2009

### Hurkyl

Staff Emeritus
It's hard to give a good answer without more context (e.g. what is your background? What are you actually studying?), but I'll make a try anyways.

The big overarching incredible point about algebra is that it's not just something you do with numbers. You also do it with vectors, matrices, sets, geometric shapes -- pretty much anything you would ever want to study can be studied with some algebraic technique.

Linear algebra has a special role, because it is the simplest kind of algebra, and something we understand really, really, really well -- and yet it is powerful enough to be useful in a wide variety of situations.

In order to do algebra, we need to know how to manipulate equations. The reason you learn the law $\langle \vec{x}+\vec{y}, \vec{z} \rangle = \langle \vec{x}, \vec{z}\rangle + \langle \vec{y}, \vec{z} \rangle$ for manipulating vectors is exactly the same as the reason you learn the law $a(b+c) = ab + ac$ for manipulating numbers.
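That manipulation law can be checked numerically. Here is a minimal sketch in Python (the example vectors are my own), using the standard dot product on R^3 as one concrete inner product:

```python
# Check <x+y, z> = <x, z> + <y, z> for the standard dot
# product on R^3 (one concrete instance of an inner product).

def inner(u, v):
    """Standard dot product: sum of componentwise products."""
    return sum(ui * vi for ui, vi in zip(u, v))

def add(u, v):
    """Componentwise vector addition."""
    return [ui + vi for ui, vi in zip(u, v)]

x, y, z = [1, 2, 3], [4, 5, 6], [7, 8, 9]

lhs = inner(add(x, y), z)        # <x+y, z>
rhs = inner(x, z) + inner(y, z)  # <x, z> + <y, z>
print(lhs, rhs)  # both print 172
```

The same check works for any vectors you substitute, which is exactly what "linear in the first variable" promises.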

8. Jun 20, 2009

### evilpostingmong

Thanks for the input. But what confuses me (I was so confused that
I couldn't even figure out what exactly was confusing... I know) is
that the inner product is supposed to be a scalar, but when
computing it, <x+y, z>=<x,z>+<y,z>, the <x,z> and <y,z> look like
bases containing vectors, not scalars. I guess that's why
Phrak doesn't like this notation. Those don't look like scalars, so I
don't really know what x or y or z are. Unless that "," means multiplication.
Oh, btw, to answer Hurkyl's question: as far as linear algebra is concerned,
I know matrix arithmetic, vector spaces, linear transformations, and "eigenstuff".

9. Jun 20, 2009

### Hurkyl

Staff Emeritus
If inner products are scalar-valued...
And <x,z> denotes an inner product of x with z...
Then <x,z> is a scalar.

I'm not sure what you mean by "bases containing vectors".

10. Jun 20, 2009

### evilpostingmong

It's alright. You know how, when you want to write out a basis
for a vector space of dim n, you'd put <v1...vn>?
Oh, hold on, x+y and z are vectors but you performed the cross
product to get xy and xz, which are scalars.
But why couldn't they write it as <x*y> or <x*z>? Isn't that more obvious?
I'm not blaming you or anyone else on this forum, so don't worry about that.
Wait, I thought of something. Take vector u to be a row [1 2] and v to be a column [3 4],
so the result is [4 6] after multiplication, so I end up with (for <u, v>) u is the scalar 4 and v
is the scalar 6.

Last edited: Jun 20, 2009
11. Jun 21, 2009

### Hurkyl

Staff Emeritus
(Convention for this post: elements of Rn are treated as if they were nx1 matrices)

There aren't that many symbols useful for a binary operation. Asterisks (i.e. '*') are annoying to draw by hand, if you have to do it a lot. The dot (i.e. '$\cdot$') is good, but people often reserve that for one specific inner product: the dot product on Rn; i.e. the one given by $\vec{v} \cdot \vec{w} = v^T w$.
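That formula $\vec{v} \cdot \vec{w} = v^T w$ can be spelled out with plain lists-of-lists as matrices. A small sketch (example vectors are my own), treating each vector as an n×1 column matrix per the convention above:

```python
# The dot product as the 1x1 matrix product v^T w, with each
# vector represented as an n x 1 column matrix (list of rows).

def transpose(m):
    """Swap rows and columns of a matrix."""
    return [list(row) for row in zip(*m)]

def matmul(a, b):
    """Naive matrix product of a (p x q) and b (q x r)."""
    return [[sum(x * y for x, y in zip(row, col))
             for col in zip(*b)] for row in a]

v = [[1], [2]]   # 2x1 column vector
w = [[3], [4]]   # 2x1 column vector

vt_w = matmul(transpose(v), w)  # 1x1 matrix [[11]]
print(vt_w[0][0])  # prints the scalar 1*3 + 2*4 = 11
```

Note the result is a 1×1 matrix, which is why it's natural to identify it with a scalar.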

It's awkward to write the dot product as if it were ordinary multiplication, because 'ordinary multiplication' notation is already overused in linear algebra. e.g. if r is a real number, $\vec{v}$ is a vector, and A,B are matrices of the right shape, and T a linear transformation of the right domain, we have

* Scalar multiplication: $r \vec{v}$
* Scalar multiplication: $r A$
* Matrix-matrix product: $A B$
* Matrix-vector product: $A v$
* Applying a transformation: $T v$

I think it's preferable to let ordinary multiplication denote only those 'products' that are either scalar multiplication or come from matrix arithmetic (or similar).

Another advantage to some sort of bracket notation (e.g. $\{ a, b \}$ or $\langle a, b \rangle$ or $[ a, b ]$) is that it can be more easily annotated. If you're working with two different inner products, we can label one G and the other H, and write $\langle \vec{v}, \vec{w} \rangle_G$ and $\langle \vec{v}, \vec{w} \rangle_H$ to tell them apart.

How did the notation actually originate? I don't know. I can speculate, though: I bet it was originally written as an ordinary binary function, e.g. g(x,y). People got tired of writing g all the time, because you often work with only one inner product at a time, so it got shortened to (x,y). But it can be confusing to use parentheses, so they switched to angle brackets $\langle x, y \rangle$.

12. Jun 21, 2009

### HallsofIvy

I hope you are not confusing <x, y> with {x, y}! If so, you need to have your vision checked. x, y, and z are vectors here. So is x+ y. <x+y, z>, <x, z>, and <y, z> are all inner products of vectors and so are scalars. <x, z>+ <y, z> is a sum of scalars, equal to the scalar <x+ y, z>.

I presume that the first statement in the definition of "inner product" in your textbook is that the inner product "is a function from VxV to the underlying field": that is, that the inner product, symbolized by < , >, takes two vectors, say x and y, and changes them to the scalar <x, y>. From that <x, z>, <y, z>, and <x+ y, z> certainly should "look like scalars"! Anything of the form <u, v> is a scalar.
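A quick illustration of that "function from VxV to the underlying field" viewpoint (the vectors here are made up for the example): the inner product takes two vectors and returns exactly one number.

```python
# An inner product is a function V x V -> scalars: it takes
# two vectors and returns a single number, never a vector.

def inner(u, v):
    """Standard dot product on R^n."""
    return sum(ui * vi for ui, vi in zip(u, v))

u = [1, 2]
v = [3, 4]

result = inner(u, v)
print(type(result).__name__, result)  # prints: int 11
```

Whatever u and v are, `inner(u, v)` has scalar type, which is the sense in which <u, v> "is a scalar".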

Last edited by a moderator: Jun 21, 2009
13. Jun 21, 2009

### tiny-tim

Hi Hurkyl!

I always thought Dirac invented the angle bracket notation, so that he could put straight dividers inside, and called them bra and ket …

or did they already exist, and he only gave them the humorous name?

14. Jun 21, 2009

### Hurkyl

Staff Emeritus
This is how I thought history went. (And also other bracket operators pre-existed, like the Poisson bracket) But I don't have great confidence in my knowledge of such things. Your version is entirely plausible.

15. Jun 21, 2009

### evilpostingmong

Oh, it's ok, I have a book that uses <> for bases as well, but I guess it was a stupid
idea for the book to use <> for bases since it would confuse students who are
learning linear algebra. But you have shown me what they are normally used for,
thank you very much! And thanks Hurkyl for solving my "," and * dilemma!
Thank you all for responding!
Oh, btw Hurkyl, when you say "normal multiplication" you mean take two vectors and multiply them
like this, $uv$, as opposed to the dot product, which is $u^T v$, which is what "," denotes, right?

Last edited: Jun 21, 2009