Solving Inner Product Questions with Eigenvalues

In summary, the conversation discusses how an inner product on R^n can be represented by a matrix, why that matrix is symmetric, and what can be said about its eigenvalues. The final question asks whether a given function is an inner product on R^3, and eigenvalues are suggested as one way of finding the answer.
  • #1
Benny
Hi I'm stuck on the following question and I have little idea as to how to proceed.

Note: I only know how to calculate eigenvalues of a matrix; I don't know many applications of them (apart from finding powers of matrices). Also, I will denote the inner product by <a,b> rather than with the round brackets used in the question. I will incorporate these changes in the wording of the question.

a) Suppose that <u,v> is an inner product on R^n. Define the n by n matrix [tex]A = \left[ {a_{ij} } \right][/tex] by [tex]a_{ij} = \left\langle {e_i ,e_j } \right\rangle [/tex]. Show that, if we regard [tex]\vec{u} = \left( {b_1 ,...,b_n } \right)[/tex] and [tex]\vec{v} = \left( {c_1 ,...,c_n } \right)[/tex] as column vectors, then:

[tex]
\left\langle {u,v} \right\rangle = \vec{u}^{\,T} A \vec{v} = \left[ {b_1 , \ldots ,b_n } \right]\left[ {\begin{array}{*{20}c}
{a_{11} } & \cdots & {a_{1n} } \\
\vdots & \ddots & \vdots \\
{a_{n1} } & \cdots & {a_{nn} } \\
\end{array}} \right]\left[ {\begin{array}{*{20}c}
{c_1 } \\
\vdots \\
{c_n } \\
\end{array}} \right]
[/tex] ... equation 1

b) Explain why the matrix A in part (a) is symmetric. Can the eigenvalues of A be complex, negative or zero? Justify your answer.

c) For any two vectors u = (a_1, a_2, a_3) and v = (b_1, b_2, b_3) in R^3, define a function:

[tex]
g\left\langle {u,v} \right\rangle = 4a_1 b_1 + 2a_2 b_2 + 3a_3 b_3 + \sqrt 2 a_2 b_3 + \sqrt 2 a_3 b_2
[/tex]

Determine whether g<u,v> is an inner product on R^3. Justify your answer, either directly or by appealing to the answers to the previous parts (a) and (b).

a) I'm thinking that, with the way things have been defined in the question, every off-diagonal entry of A is zero, since by definition [tex]i \ne j \Rightarrow \left\langle {e_i ,e_j } \right\rangle = 0[/tex]. It looks like A is the identity matrix.

Carrying out the matrix multiplication (the bit with the three matrices):

[tex]
\left[ {b_1 , \ldots ,b_n } \right]\left[ {\begin{array}{*{20}c}
{a_{11} } & \cdots & {a_{1n} } \\
\vdots & \ddots & \vdots \\
{a_{n1} } & \cdots & {a_{nn} } \\
\end{array}} \right]\left[ {\begin{array}{*{20}c}
{c_1 } \\
\vdots \\
{c_n } \\
\end{array}} \right] = \left[ {b_1 , \ldots ,b_n } \right]\left[ {\begin{array}{*{20}c}
{c_1 } \\
\vdots \\
{c_n } \\
\end{array}} \right] = \left[ {b_1 c_1 + ... + b_n c_n } \right]
[/tex]

This is a 1 by 1 matrix, so it can be regarded as a real number, right? I still don't see how this means that equation 1 is true. I mean, the inner product isn't necessarily the dot product.

b) Why is A symmetric? It just is? The identity matrix is symmetric (assuming I haven't got my definitions mixed up), so why would an explanation be needed? I'm thinking I've interpreted A incorrectly. I'm not sure about the eigenvalues either. Isn't there some relationship between eigenvalues and symmetric matrices? Are the eigenvalues the diagonal entries? If A is the identity matrix, then the only eigenvalue is 1?

c) As I indicated earlier I don't know enough about eigenvalues to use them to answer this question. Can someone help me out with this question and the others as well? Any help would be great thanks.
 
  • #2
Benny said:
a) I'm thinking that, with the way things have been defined in the question, every off-diagonal entry of A is zero, since by definition [tex]i \ne j \Rightarrow \left\langle {e_i ,e_j } \right\rangle = 0[/tex]. It looks like A is the identity matrix.
That definition is only for the Euclidean inner product (a.k.a. dot product) -- it need not hold for an arbitrary inner product.

I won't comment on (b) until you take another shot at (a).

As for (c), there are things that are very naturally associated with eigenvalues, are there not? So if you opted to play with the eigenvalues, you would probably want to play with those things too.
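In case it helps to see one concrete way of acting on that hint, here is a rough numerical sketch (only one possible route, and not a substitute for the justification the question asks for): take the symmetric matrix whose entries are [itex]a_{ij} = g\left\langle e_i, e_j \right\rangle[/itex] and look at its eigenvalues.

[code]
# Sketch: represent g by the symmetric matrix A with a_ij = g(e_i, e_j)
# and inspect its eigenvalues numerically (they can also be found by hand).
import numpy as np

r2 = np.sqrt(2.0)
A = np.array([[4.0, 0.0, 0.0],
              [0.0, 2.0, r2],
              [0.0, r2, 3.0]])

# eigvalsh is for real symmetric matrices and returns real eigenvalues
print(np.linalg.eigvalsh(A))
# For a symmetric A, u^T A u > 0 for every u != 0 exactly when all
# eigenvalues are positive, which is the positive-definiteness an
# inner product requires.
[/code]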
 
  • #3
Hmm... the e_i made me automatically think of the dot product. Anyway, seeing as it is much easier to write the following out on paper than it is to 'TeX' it, I'll just state what I would do and what I get.

OK, so I just use the definition of A and its entries to perform the indicated matrix multiplication (the one on the rightmost side with the three matrices, including A). Doing this I get a 1 by 1 matrix, which I will just write as an algebraic expression.

I can't recall all of the properties of the inner product (although I should be able to!), and coupled with other time constraints the following is probably littered with errors, so please bear with me.

[tex]
stuff = \left[ {\sum\limits_{i = 1}^n {b_i \left\langle {e_i ,e_1 } \right\rangle } + ... + \sum\limits_{i = 1}^n {b_i \left\langle {e_i ,e_n } \right\rangle } } \right]\left[ {\begin{array}{*{20}c}
{c_1 } \\
\vdots \\
{c_n } \\
\end{array}} \right]
[/tex]

[tex]
= c_1 \sum\limits_{i = 1}^n {b_i \left\langle {e_i ,e_1 } \right\rangle } + ... + c_n \sum\limits_{i = 1}^n {b_i \left\langle {e_i ,e_n } \right\rangle }
[/tex]

[tex]
= c_1 \sum\limits_{i = 1}^n {\left\langle {b_i e_i ,e_1 } \right\rangle } + ... + c_n \sum\limits_{i = 1}^n {\left\langle {b_i e_i ,e_n } \right\rangle } \qquad \text{(linearity?)}
[/tex]

At the moment I can't recall which other manipulations are required. If I haven't made any errors yet, I'm thinking maybe I should rearrange the expression above? I'll think about this further, but if you have any suggestions for the stage I'm currently at, please reply.

Edit: Part c, eigenvectors? I know that x is an eigenvector of a matrix A if [tex]A\vec{x} = \lambda \vec{x}[/tex], but I can't think of a way to apply this. Perhaps I'll continue working on the first two parts before trying this one.

Also, is A symmetric because of the symmetry property of the inner product? So that [a_1n] = <e_1, e_n> = <e_n, e_1> = [a_n1], where the [brackets] denote matrix entries.
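One possible link between the eigenvector definition above and this symmetry question (just a sketch, not necessarily the intended argument): if A is symmetric and [itex]A\vec{x} = \lambda \vec{x}[/itex] with [itex]\vec{x}[/itex] a unit vector, then

[tex]
\lambda = \lambda \, \vec{x}^T \vec{x} = \vec{x}^T \left( A \vec{x} \right) = \vec{x}^T A \vec{x} ,
[/tex]

so each eigenvalue is a value taken by the quadratic form [itex]\vec{x}^T A \vec{x}[/itex] at a unit vector, which ties the eigenvalues to the quantities [itex]\left\langle u,u \right\rangle[/itex] appearing in the inner product axioms.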
 
  • #4
Well, good notation is always helpful! First, I'll correct your mistake -- in your first line of LaTeX, I assume you did not mean to put +'s between the components of the vector. :smile:

It might help to write things like

[tex]
\left[ \vec{u}^T A \right]_j = \vec{u}^T A_{\cdot j} = \sum_{i=1}^n b_i \langle e_i, e_j \rangle
[/tex]

to more compactly (and precisely) represent the product. (BTW, look at my source to see some time-saving shortcuts!) ([itex]u_k[/itex] means the k-th component. [itex]A_{\cdot j}[/itex] means the j-th column. Similarly, [itex]A_{i \cdot} = A_i[/itex] denotes the i-th row. Square brackets added for clarity as necessary)

Incidentally, it may be easier to transform both [itex]\vec{u}^T A \vec{v}[/itex] and [itex]\langle \vec{u}, \vec{v} \rangle[/itex] into the same thing, rather than transforming one into the other.
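For example, here is a sketch of where that can lead, using linearity of the inner product in each slot (writing [itex]\vec{u} = \sum_i b_i e_i[/itex] and [itex]\vec{v} = \sum_j c_j e_j[/itex]):

[tex]
\left\langle \vec{u}, \vec{v} \right\rangle = \left\langle \sum_{i=1}^n b_i e_i , \; \sum_{j=1}^n c_j e_j \right\rangle = \sum_{i=1}^n \sum_{j=1}^n b_i c_j \left\langle e_i , e_j \right\rangle = \sum_{i=1}^n \sum_{j=1}^n b_i \, a_{ij} \, c_j ,
[/tex]

and the same double sum is exactly what the matrix product [itex]\vec{u}^T A \vec{v}[/itex] expands to.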

You have the symmetry of A correct -- it does indeed follow directly from the symmetry of the inner product.
 

Related to Solving Inner Product Questions with Eigenvalues

1. What is an inner product?

An inner product is a mathematical operation that takes two vectors and produces a scalar value. It is often denoted as ⟨u, v⟩ and must satisfy symmetry, linearity in each argument, and positive-definiteness.
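Written out for a real vector space, the requirements are (for all vectors u, v, w and scalars a, b):

[tex]
\left\langle u, v \right\rangle = \left\langle v, u \right\rangle , \qquad \left\langle a u + b w ,\, v \right\rangle = a \left\langle u, v \right\rangle + b \left\langle w, v \right\rangle , \qquad \left\langle u, u \right\rangle \ge 0 \ \text{with } \left\langle u, u \right\rangle = 0 \text{ only when } u = 0 .
[/tex]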

2. How are eigenvalues used in solving inner product questions?

Eigenvalues come in through the matrix that represents the inner product. On R^n, an inner product can be written as ⟨u, v⟩ = u^T A v for a symmetric matrix A (as in part (a) of the thread above), and a symmetric matrix defines a genuine inner product exactly when all of its eigenvalues are positive. Computing the eigenvalues is therefore a systematic way to decide whether a candidate formula really is an inner product.

3. What is the relationship between eigenvalues and inner products?

If a symmetric matrix A represents a candidate inner product via ⟨u, v⟩ = u^T A v, then for a unit eigenvector x with Ax = λx we have ⟨x, x⟩ = x^T A x = λ. The eigenvalues are thus the values the form takes on unit eigenvectors, and they must all be real and positive for the form to be a true inner product.

4. How are inner product questions useful in science?

Inner product questions are useful in a variety of scientific fields, including physics, engineering, and computer science. They allow us to calculate the angle between two vectors, find the length of a vector, and solve optimization problems.

5. Can inner product questions be solved without using eigenvalues?

Yes, inner product questions can be solved without using eigenvalues: one can check the defining properties directly, for instance by completing the square (or by applying a test such as Sylvester's criterion to the leading principal minors) to verify positive-definiteness. Eigenvalues are simply a convenient and systematic way of settling that last condition.
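For example, for the form g discussed in the thread above, positive-definiteness can be checked directly by completing the square:

[tex]
g\left\langle u, u \right\rangle = 4a_1^2 + 2a_2^2 + 3a_3^2 + 2\sqrt{2}\, a_2 a_3 = 4a_1^2 + \left( \sqrt{2}\, a_2 + a_3 \right)^2 + 2a_3^2 ,
[/tex]

which is non-negative and equals zero only when [itex]a_1 = a_2 = a_3 = 0[/itex].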
