# Inner product

Hi, I'm stuck on the following question and have little idea how to proceed.

Note: I only know how to calculate the eigenvalues of a matrix; I don't know many applications of them (apart from finding powers of matrices). Also, I will denote the inner product by <a, b> rather than with the circular brackets used in the question, and I will incorporate this change in the wording of the question.

a) Suppose that <u,v> is an inner product on R^n. Define the n by n matrix $$A = \left[ {a_{ij} } \right]$$ by $$a_{ij} = \left\langle {e_i ,e_j } \right\rangle$$. Show that, if we regard $$\vec{u} = \left( {b_1 ,...,b_n } \right)$$ and $$\vec{v} = \left( {c_1 ,...,c_n } \right)$$ as column vectors, then:

$$\left\langle {u,v} \right\rangle = \vec{u}^{\,T} A \vec{v} = \left[ {b_1 ,...,b_n } \right]\left[ {\begin{array}{*{20}c} {a_{11} } & \cdots & {a_{1n} } \\ \vdots & \ddots & \vdots \\ {a_{n1} } & \cdots & {a_{nn} } \\ \end{array}} \right]\left[ {\begin{array}{*{20}c} {c_1 } \\ \vdots \\ {c_n } \\ \end{array}} \right] \quad \text{(equation 1)}$$

b) Explain why the matrix A in part (a) is symmetric. Can the eigenvalues of A be complex, negative, or zero? Justify your answer.

c) For any two vectors u = (a_1, a_2, a_3) and v = (b_1, b_2, b_3) in R^3, define a function:

$$g\left\langle {u,v} \right\rangle = 4a_1 b_1 + 2a_2 b_2 + 3a_3 b_3 + \sqrt 2 a_2 b_3 + \sqrt 2 a_3 b_2$$

Determine whether g<u,v> is an inner product on R^3. Justify your answer, either directly or by appealing to your answers to the previous parts (a) and (b).

a) I'm thinking that, with the way things have been defined in the question, every off-diagonal entry of A is zero, since by definition $$i \ne j \Rightarrow \left\langle {e_i ,e_j } \right\rangle = 0$$. It looks like A is the identity matrix.

Carrying out the matrix multiplication (the bit with the three matrices):

$$\left[ {b_1 ,...,b_n } \right]\left[ {\begin{array}{*{20}c} {a_{11} } & \cdots & {a_{1n} } \\ \vdots & \ddots & \vdots \\ {a_{n1} } & \cdots & {a_{nn} } \\ \end{array}} \right]\left[ {\begin{array}{*{20}c} {c_1 } \\ \vdots \\ {c_n } \\ \end{array}} \right] = \left[ {b_1 ,...,b_n } \right]\left[ {\begin{array}{*{20}c} {c_1 } \\ \vdots \\ {c_n } \\ \end{array}} \right] = \left[ {b_1 c_1 + ... + b_n c_n } \right]$$

This is a 1 by 1 matrix, so it can be regarded as a real number, right? I still don't see how this shows that equation 1 is true; I mean, the inner product isn't necessarily the dot product.

b) Why is A symmetric? It just is? The identity matrix is symmetric (assuming I haven't gotten my definitions mixed up), so why would an explanation be needed? I'm thinking I've interpreted A incorrectly. I'm not sure about the eigenvalues; isn't there some relationship between eigenvalues and symmetric matrices? Are the eigenvalues the diagonal entries? If A is the identity matrix, then the only eigenvalue is 1?

c) As I indicated earlier, I don't know enough about eigenvalues to use them to answer this question. Can someone help me out with this question and the others as well? Any help would be great, thanks.


Hurkyl
Benny said:
a) I'm thinking that, with the way things have been defined in the question, every off-diagonal entry of A is zero, since by definition $$i \ne j \Rightarrow \left\langle {e_i ,e_j } \right\rangle = 0$$. It looks like A is the identity matrix.
That definition is only for the Euclidean inner product (a.k.a. dot product) -- it need not hold for an arbitrary inner product.
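To see this concretely, here is a small numerical sketch with a made-up inner product on R^2 (the coefficients are arbitrary, chosen only so the form is positive definite); the matrix of $\langle e_i, e_j \rangle$ entries is not the identity, yet $\vec{u}^T A \vec{v}$ still reproduces the inner product:

```python
# Hypothetical non-Euclidean inner product on R^2 (illustration only):
# <u, v> = u1*v1 + u1*v2 + u2*v1 + 3*u2*v2
def ip(u, v):
    return u[0]*v[0] + u[0]*v[1] + u[1]*v[0] + 3*u[1]*v[1]

# Build A exactly as in part (a): a_ij = <e_i, e_j>.
e = [(1, 0), (0, 1)]
A = [[ip(ei, ej) for ej in e] for ei in e]
print(A)  # [[1, 1], [1, 3]] -- symmetric, but not the identity

# Check <u, v> = u^T A v for a sample pair of vectors:
u, v = (2, -1), (3, 5)
uAv = sum(u[i] * A[i][j] * v[j] for i in range(2) for j in range(2))
assert uAv == ip(u, v)
```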

I won't comment on (b) until you take another shot at (a).

As for (c), there are things that are very naturally associated with eigenvalues, are there not? So if you opted to play with the eigenvalues, you would probably want to play with those things too.

Hmm... the e_i made me automatically think of the dot product. Anyway, seeing as it is much easier to write the following out on paper than it is to TeX it, I'll just state what I would do and what I get.

Ok, so just use the definition of A and its entries to perform the indicated matrix multiplication (the one on the rightmost side, with the three matrices including A). Doing this I get a 1 by 1 matrix, which I will just write as an algebraic expression.

I can't recall all of the properties of the inner product (although I should be able to!), and coupled with other time constraints, the following is probably littered with errors, so please bear with me.

$$stuff = \left[ {\sum\limits_{i = 1}^n {b_i \left\langle {e_i ,e_1 } \right\rangle } + ... + \sum\limits_{i = 1}^n {b_i \left\langle {e_i ,e_n } \right\rangle } } \right]\left[ {\begin{array}{*{20}c} {c_1 } \\ \vdots \\ {c_n } \\ \end{array}} \right]$$

$$= c_1 \sum\limits_{i = 1}^n {b_i \left\langle {e_i ,e_1 } \right\rangle } + ... + c_n \sum\limits_{i = 1}^n {b_i \left\langle {e_i ,e_n } \right\rangle }$$

$$= c_1 \sum\limits_{i = 1}^n {\left\langle {b_i e_i ,e_1 } \right\rangle } + ... + c_n \sum\limits_{i = 1}^n {\left\langle {b_i e_i ,e_n } \right\rangle }$$ ... (linearity?)

At the moment I can't recall which other manipulations are required. If I haven't made any errors yet, I'm thinking maybe I should rearrange the expression above? I'll think about this further, but if you have any suggestions for the stage I'm currently at, please reply.

Edit: Part c, eigenvectors? I know that x is an eigenvector of a matrix A if $$A\vec{x} = \lambda \vec{x}$$, but I can't think of a way to apply this. Perhaps I'll continue working on the first two parts before trying this one.
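For what it's worth, here is a quick numerical sketch (not a proof) of the eigenvalue approach to part (c), assuming the matrix representation from part (a), i.e. writing $a_{ij} = g\langle e_i, e_j \rangle$:

```python
import math

# Part (c): write a_ij = g<e_i, e_j> as in part (a). The sqrt(2) a2 b3 and
# sqrt(2) a3 b2 terms give the off-diagonal (2,3) and (3,2) entries.
r2 = math.sqrt(2)
A = [[4.0, 0.0, 0.0],
     [0.0, 2.0, r2],
     [0.0, r2, 3.0]]

# g is symmetric in u and v, so A is symmetric:
assert all(A[i][j] == A[j][i] for i in range(3) for j in range(3))

# A is block diagonal: a 1x1 block [4] and the 2x2 block [[2, r2], [r2, 3]].
# The 2x2 block's eigenvalues solve lambda^2 - (trace)*lambda + det = 0.
tr = 2 + 3                       # trace of the 2x2 block
det = 2 * 3 - 2                  # det = 6 - sqrt(2)*sqrt(2) = 4
disc = math.sqrt(tr**2 - 4*det)  # sqrt(25 - 16) = 3
eigs = sorted([4.0, (tr + disc) / 2, (tr - disc) / 2])
print(eigs)  # [1.0, 4.0, 4.0] -- all positive
```

The point of the check: the eigenvalues being real follows from A being symmetric, and their all being positive is what certifies $g\langle u, u \rangle > 0$ for $u \neq 0$; together with symmetry and bilinearity (automatic from the matrix form), that would make g an inner product.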

Also, is A symmetric because of the symmetry property of the inner product? So that $$a_{1n} = \left\langle {e_1 ,e_n } \right\rangle = \left\langle {e_n ,e_1 } \right\rangle = a_{n1}$$?

Hurkyl
Well, good notation is always helpful! First, I'll correct your mistake -- in your first line of LaTeX, I assume you did not mean to put +'s between the components of the vector.

It might help to write things like

$$\left[ \vec{u}^T A \right]_j = \vec{u}^T A_{\cdot j} = \sum_{i=1}^n b_i \langle e_i, e_j \rangle$$

to more compactly (and precisely) represent the product. (BTW, look at my source to see some time-saving shortcuts!) ($u_k$ means the k-th component. $A_{\cdot j}$ means the j-th column. Similarly, $A_{i \cdot} = A_i$ denotes the i-th row. Square brackets added for clarity as necessary)

Incidentally, it may be easier to transform both $\vec{u}^T A \vec{v}$ and $\langle \vec{u}, \vec{v} \rangle$ into the same thing, rather than transforming one into the other.
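For instance, expanding both expressions into the same double sum (a sketch, using bilinearity of the inner product and $a_{ij} = \langle e_i, e_j \rangle$):

$$\vec{u}^{\,T} A \vec{v} = \sum_{j = 1}^n \left( \sum_{i = 1}^n b_i \, a_{ij} \right) c_j = \sum_{i = 1}^n \sum_{j = 1}^n b_i c_j \left\langle {e_i ,e_j } \right\rangle$$

$$\left\langle {\vec{u} ,\vec{v} } \right\rangle = \left\langle {\sum_{i = 1}^n b_i e_i \,,\, \sum_{j = 1}^n c_j e_j } \right\rangle = \sum_{i = 1}^n \sum_{j = 1}^n b_i c_j \left\langle {e_i ,e_j } \right\rangle$$

so the two expressions agree term by term.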

You have the symmetry of A correct -- it does indeed follow directly from the symmetry of the inner product.