Prove A is symmetric iff x*Ay = Ax*y

1. Nov 6, 2009

s_j_sawyer

Hello everyone. This is my first official post here but I have been lurking around for about a year now.

1. The problem statement, all variables and given/known data
Prove that a matrix A is symmetric if and only if x*Ay = Ax*y for all x,y of R^n, where * denotes the dot product.

2. Relevant equations

3. The attempt at a solution

So I was able to prove the forward direction, as follows:

Assume A is symmetric. Then

x*Ay = x^T A y
= x^T A^T y (since A = A^T)
= (Ax)^T y
= Ax * y, as required.
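That chain of equalities can also be sanity-checked numerically. A quick NumPy sketch (not part of the original exercise; A is built symmetric by construction):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M + M.T                  # symmetric by construction: A.T == A
x = rng.standard_normal(4)
y = rng.standard_normal(4)

lhs = x @ (A @ y)            # x * Ay
rhs = (A @ x) @ y            # Ax * y
assert np.isclose(lhs, rhs)  # equal for symmetric A
```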

However, I am completely stumped for the other direction. I.e., assuming x*Ay = Ax*y and then showing A = A^T.

Any suggestions?

2. Nov 6, 2009

lanedance

have you tried writing out the products in subscript notation?

3. Nov 6, 2009

lanedance

oh, and g'day & welcome to PF ;)

4. Nov 6, 2009

s_j_sawyer

What is this subscript notation you speak of?

And thanks :)

5. Nov 7, 2009

lanedance

actually for your case, you probably don't even need to look at subscripts... how about just taking the transpose of one side of the equation and working towards the other, similar to what you did for the first direction...

Last edited: Nov 7, 2009
6. Nov 7, 2009

s_j_sawyer

Oh I think I got it.

Here's what I did:

Assuming x*Ay = Ax*y,

(x*Ay)
= (Ax)^T y (by assumption)
= x^T A^T y
= x* (A^T y)

So we have (x * Ay) = x * (A^T y),
which implies A = A^T
and so A is symmetric.

Does this seem correct?

7. Nov 7, 2009

D H

Staff Emeritus
You are mixing and matching notation, and this is getting you in trouble.

Suggested route: Any matrix can be written as a sum of a symmetric matrix and a skew symmetric matrix (prove this). With this, you can express A as A=B+C, where B is symmetric and C is skew symmetric. Given that [itex]x\cdot(Ay) = (Ax)\cdot y[/itex], show that C must be identically zero.

8. Nov 7, 2009

s_j_sawyer

Can you explain? I don't see what is wrong with what I did.

9. Nov 7, 2009

D H

Staff Emeritus
What you did wrong was to make the step from x^T A y = x^T A^T y to A = A^T without justification. Each side of the first equality is a single scalar (for a 3×3 matrix, a sum of nine terms). The second equality, A = A^T, is in reality many distinct equations (six off-diagonal ones for a 3×3 matrix). That is a mighty big leap; you are going to need to justify it.

10. Nov 7, 2009

s_j_sawyer

So you're saying x*Ay = x*By does not imply A = B?

Also, I've proven that any square matrix can be written as a sum of a symmetric matrix and a skew symmetric matrix (i.e. A = B + C, where A is nxn and B is symmetric, C is skew symmetric) but I can't seem to prove C = 0 using my original assumption x*Ay = Ax*y. I subbed in A = (B + C) and using B = B^T, C = -C^T but I can't get anywhere with that. Any hints?
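The decomposition itself is easy to check numerically. A minimal NumPy sketch (the explicit formulas B = (A + A^T)/2 and C = (A - A^T)/2 are the standard construction, not something given in the thread):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

B = (A + A.T) / 2            # symmetric part:      B.T ==  B
C = (A - A.T) / 2            # skew-symmetric part: C.T == -C

assert np.allclose(B, B.T)
assert np.allclose(C, -C.T)
assert np.allclose(A, B + C)
```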

Btw this isn't a homework problem. We were just given the theorem to use and the proof was an "exercise." How nice.

11. Nov 7, 2009

D H

Staff Emeritus
Of course not. It's just a bit of a leap. Using Einstein sum notation, this says [itex]x_i A_{ij} y_j = x_i B_{ij} y_j[/itex] for all x, y. So how to make this leap? One way is to choose [itex]x = e_r[/itex] and [itex]y = e_s[/itex] (unit vectors whose components are zero except for the rth and sth elements, respectively, which are one). With this choice the sum simplifies to [itex]A_{rs} = B_{rs}[/itex]. Repeating this selection for all n² combinations of r and s yields A = B.
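The unit-vector trick can be seen directly in NumPy: taking x = e_r and y = e_s collapses the double sum to a single matrix entry (a sketch, using rows of the identity as the basis vectors):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))
I = np.eye(n)                # I[r] is the unit vector e_r

# e_r . (A e_s) picks out exactly the single entry A[r, s]
for r in range(n):
    for s in range(n):
        assert np.isclose(I[r] @ (A @ I[s]), A[r, s])
```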

Again using Einstein sum notation, one arrives at [itex]x_i(C_{ij} - C_{ji})y_j = 2 x_i C_{ij} y_j = 0[/itex]. Using a similar selection mechanism as above, this says C = 0.
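Numerically, the same selection mechanism shows how the skew part is pinned down: for any A, the difference x·(Ay) − (Ax)·y evaluated on unit vectors equals 2C[r, s], so the hypothesis forces every entry of C to vanish. A sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
A = rng.standard_normal((n, n))
C = (A - A.T) / 2            # skew-symmetric part of A
I = np.eye(n)                # I[r] is the unit vector e_r

for r in range(n):
    for s in range(n):
        x, y = I[r], I[s]
        # x.(Ay) - (Ax).y  =  x^T (A - A^T) y  =  2 x^T C y  =  2 C[r, s]
        diff = x @ (A @ y) - (A @ x) @ y
        assert np.isclose(diff, 2 * C[r, s])
# so if x.(Ay) == (Ax).y for all x, y, every entry of C is zero
```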

12. Nov 7, 2009

lanedance

I think you guys are pretty much there, but thought I would add that if you're a bit more careful setting up your dot products as matrix multiplication, you can save some time...

so if x & y are real n x 1 column vectors, the dot product in matrix multiplication is
$$\textbf{x} \bullet \textbf{y} = \textbf{x}^T \textbf{y}$$

the equality is
$$\textbf{x} \bullet (A\textbf{y}) = (A\textbf{x}) \bullet \textbf{y}$$
in matrix multiplication
$$\textbf{x}^T (A\textbf{y}) = (A\textbf{x})^T \textbf{y}$$
which becomes (transpose & associativity)
$$\textbf{x}^T A\textbf{y} = \textbf{x}^T A^T \textbf{y}$$

Along with D H's comments I think that's pretty much it...

The subscript notation discussed with Einstein summation is definitely worth learning & really makes life easier, particularly for vector product identities and the like; another tool in the toolbox & it really saves time.
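For what it's worth, NumPy's einsum makes the subscript notation executable; a small sketch showing that x·(Ay) sums x_i A_ij y_j while (Ax)·y effectively involves A^T:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)
y = rng.standard_normal(n)

# x . (A y)  =  x_i A_ij y_j
lhs = np.einsum('i,ij,j->', x, A, y)
# (A x) . y  =  A_ij x_j y_i, i.e. A^T appears when written as x_? A_? y_?
rhs = np.einsum('ij,j,i->', A, x, y)

assert np.isclose(lhs, x @ A @ y)      # matches x^T A y
assert np.isclose(rhs, x @ A.T @ y)    # matches x^T A^T y
```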
