Explain how any square matrix A can be written as

macaholic

Homework Statement


a) Explain how any square matrix A can be written as

A = QS
where Q is orthogonal and S is symmetric positive semidefinite.

b) Is it possible to write

A = S_1 Q_1
where Q_1 is orthogonal and S_1 is symmetric positive definite?

Homework Equations



A = U \Sigma V^T

The Attempt at a Solution



For a) I've gotten to the point where I've written:

A = U V^T V \Sigma V^T

Which is just a rearrangement of the singular value decomposition. From this I believe there is some logic as to why U V^T is orthogonal and V\Sigma V^T is symmetric positive semidefinite, but I can't seem to figure out the reasoning. Any pointers?

For b) I've surmised this is possible, since it is simply the "left polar decomposition" (http://en.wikipedia.org/wiki/Polar_decomposition), but again I can't see how to show it mathematically.
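As a sanity check on these factor choices, here is a minimal numerical sketch (assuming numpy is available; the random test matrix and variable names are just for illustration):

Code:
import numpy as np

# Numerical check of the right polar form A = QS built from the SVD A = U Sigma V^T
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

U, sigma, Vt = np.linalg.svd(A)                 # sigma holds the nonnegative singular values
Q = U @ Vt                                      # candidate orthogonal factor
S = Vt.T @ np.diag(sigma) @ Vt                  # candidate symmetric positive semidefinite factor

print(np.allclose(A, Q @ S))                    # A = QS
print(np.allclose(Q.T @ Q, np.eye(4)))          # Q^T Q = I, so Q is orthogonal
print(np.allclose(S, S.T))                      # S is symmetric
print(np.all(np.linalg.eigvalsh(S) >= -1e-12))  # eigenvalues of S are nonnegative (up to roundoff)

# Left polar form for b): A = S_1 Q_1 with S_1 = U Sigma U^T and Q_1 = U V^T
S1 = U @ np.diag(sigma) @ U.T
print(np.allclose(A, S1 @ Q))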
 

Try and show the product of two orthogonal transformations is orthogonal. What's the definition of orthogonal? Ditto for the second part. What's the definition of positive semidefinite? Definitions are a great place to start.
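A minimal sketch of that general fact, using the definition that Q is orthogonal when Q^T Q = I: if Q_1 and Q_2 are both orthogonal, then

(Q_1 Q_2)^T (Q_1 Q_2) = Q_2^T Q_1^T Q_1 Q_2 = Q_2^T Q_2 = I,

so the product Q_1 Q_2 is orthogonal as well.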
 
Dick said:
Try and show the product of two orthogonal transformations is orthogonal. What's the definition of orthogonal? Ditto for the second part. What's the definition of positive semidefinite? Definitions are a great place to start.
Okay, well can I start by saying U and V^T are both orthogonal? I think they are, right? If so, does this work?
(UV^T)^T = VU^T
(UV^T)(UV^T)^T = UV^T VU^T = U I U^T = U U^T = I

I'm not sure what to do about a good definition for positive semidefinite :/. The only one I know is that the eigenvalues are all zero or positive. How can I use that on the second factor?
 
macaholic said:
Okay, well can I start by saying U and V^T are both orthogonal? I think they are, right? If so, does this work?
(UV^T)^T = VU^T
(UV^T)(UV^T)^T = UV^T VU^T = U I U^T = U U^T = I

I'm not sure what to do about a good definition for positive semidefinite :/. The only one I know is that the eigenvalues are all zero or positive. How can I use that on the second factor?

Yes, that shows your first factor is orthogonal. For the second one, the definition I'm thinking of is that a matrix Q is positive semidefinite if x^T Q x \geq 0 for all vectors x.
 
Dick said:
Yes, that shows your first factor is orthogonal. For the second one, the definition I'm thinking of is that a matrix Q is positive semidefinite if x^T Q x \geq 0 for all vectors x.

So I have to show that x^T(V\Sigma V^T)x \geq 0?
Hmm... I'm not sure how to show this.
I know \Sigma is diagonal... and I know that V and V^T are orthogonal. I can't seem to think of how that helps me though :/
 
macaholic said:
So I have to show that x^T(V\Sigma V^T)x \geq 0?
Hmm... I'm not sure how to show this.
I know \Sigma is diagonal... and I know that V and V^T are orthogonal. I can't seem to think of how that helps me though :/

The singular value decomposition tells you that you can take \Sigma to be positive semidefinite, since it is diagonal with all nonnegative entries. Try to use the definition I gave you.
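Spelled out as a short worked step (using the definition above): if \Sigma = diag(\sigma_1, ..., \sigma_n) with each \sigma_i \geq 0, then for any vector x,

x^T \Sigma x = \sigma_1 x_1^2 + \sigma_2 x_2^2 + ... + \sigma_n x_n^2 \geq 0,

so \Sigma itself satisfies the positive semidefinite condition.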
 
Oh, right. So it would become a matter of showing that V\Sigma V^T yields a matrix that's also positive semidefinite?
 
macaholic said:
Oh, right. So it would become a matter of showing that V\Sigma V^T yields a matrix that's also positive semidefinite?

That would be correct.
 
Well, I don't know how to show this, but I have a feeling that the product of an orthogonal matrix and a positive semidefinite matrix is positive semidefinite... But alas, I can't think of a way to reason this out. I'm no good at these matrix manipulation proofs, it seems.
 
macaholic said:
Well, I don't know how to show this, but I have a feeling that the product of an orthogonal matrix and a positive semidefinite matrix is positive semidefinite... But alas, I can't think of a way to reason this out. I'm no good at these matrix manipulation proofs, it seems.

You want to show x^T(V\Sigma V^T)x \geq 0. Regroup that. I think you can do it. V^T x is just 'any vector'.
 
Oh wait, I think I may have something...
I know that V\Sigma V^T = \Sigma (I can't figure out how to show this though...)
So that makes my inequality x^T(\Sigma x) \geq 0
Was that at least one of the right steps? Though I can't justify the first part...
 
macaholic said:
Oh wait, I think I may have something...
I know that V\Sigma V^T = \Sigma (I can't figure out how to show this though...)
So that makes my inequality x^T(\Sigma x) \geq 0
Was that at least one of the right steps? Though I can't justify the first part...

No, they aren't equal. But if x^T(\Sigma x) \geq 0 is true for ANY x, then it must also be true if you replace x with V^T x, right?
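Written out, that substitution finishes the argument: set y = V^T x, so that

x^T (V \Sigma V^T) x = (V^T x)^T \Sigma (V^T x) = y^T \Sigma y \geq 0,

and since (V \Sigma V^T)^T = V \Sigma^T V^T = V \Sigma V^T, the factor V \Sigma V^T is symmetric positive semidefinite.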
 
macaholic said:
Okay, well can I start by saying U and V^T are both orthogonal? I think they are, right? If so, does this work?
(UV^T)^T = VU^T
(UV^T)(UV^T)^T = UV^T VU^T = U I U^T = U U^T = I

I'm not sure what to do about a good definition for positive semidefinite :/. The only one I know is that the eigenvalues are all zero or positive. How can I use that on the second factor?

The *definition* of positive semidefinite is that x^T A x \geq 0 for any x in R^n. One *consequence* of that is: a symmetric matrix is positive semidefinite if and only if its eigenvalues are \geq 0. This is a theorem, not a definition.
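For reference, a short sketch of why that theorem holds for a symmetric matrix S: by the spectral theorem S = P \Lambda P^T with P orthogonal and \Lambda = diag(\lambda_1, ..., \lambda_n), so for any x, writing y = P^T x,

x^T S x = y^T \Lambda y = \lambda_1 y_1^2 + ... + \lambda_n y_n^2,

which is \geq 0 for every x exactly when every eigenvalue \lambda_i is \geq 0.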
 