Why are all the singular values of A equal to 1?


Homework Help Overview

The discussion revolves around understanding the singular values of a matrix A that transforms one orthonormal basis into another in R^n. The original poster questions why all singular values of A are equal to 1, given that A is constructed from orthonormal bases.

Discussion Character

  • Conceptual clarification, Assumption checking, Mathematical reasoning

Approaches and Questions Raised

  • Participants explore the implications of orthonormal bases on the properties of matrix A. There are attempts to connect the singular value decomposition (SVD) of A to the identity matrix, with some questioning the circular reasoning in the arguments presented. Others suggest that A could be a projection matrix, while also considering the implications of A being a transformation between bases.

Discussion Status

The discussion is ongoing, with participants providing insights and challenging each other's reasoning. Some guidance has been offered regarding the properties of orthonormal bases and the implications for singular values, but no consensus has been reached on the underlying reasons for the singular values being equal to 1.

Contextual Notes

Participants are navigating the definitions and properties of singular values and orthonormal bases, with references to textbook definitions and theorems. There is an acknowledgment of potential confusion regarding the relationship between the singular values and the structure of matrix A.

rpthomps
Problem:

Suppose ##u_1,\dots,u_n## and ##v_1,\dots,v_n## are orthonormal bases for ##\mathbb{R}^n##. Construct the matrix ##A## that transforms each ##v_j## into ##u_j##, so that ##Av_1=u_1,\dots,Av_n=u_n##.

The answer key says ##A=UV^T## since all ##\sigma_j=1##. Why are all ##\sigma_j=1##?
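Not part of the original question, but here is a quick numerical check of the claim, sketched in NumPy (the random bases are just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# QR decomposition of a random matrix gives an orthogonal Q,
# whose columns form an orthonormal basis of R^n.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))

# The matrix that sends each v_j to u_j: A V = U, so A = U V^T.
A = U @ V.T

# Every singular value comes out as 1 (up to floating-point error).
singular_values = np.linalg.svd(A, compute_uv=False)
print(np.round(singular_values, 6))  # [1. 1. 1. 1.]
```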
 
What does the fact that the two bases are orthonormal tell you about A? Can you use the answer to that somehow?
 
I regrouped and thought about this...

If ##A=U\Sigma V^T##, then ##AV=(U\Sigma V^T)V = U\Sigma##, since ##V^TV=I##.

If ##\Sigma=I##, then ##AV=U## and each ##\sigma_j=1##.

I think that is it. Thoughts? Much thanks.
 
This is correct, but your argument looks kind of circular. It looks like you had been told that ##\Sigma=I## because all the ##\sigma_i## are 1, and wanted to know why the ##\sigma_i## are all 1. Now you're saying that they're all 1 because ##\Sigma=I##. You can use ##\sigma_i=1## to explain ##\Sigma=I##, or you can use ##\Sigma=I## to explain ##\sigma_i=1##, but you can't do both.

You're using a theorem that tells you that ##A=U\Sigma V^T##, and also defines ##U##, ##\Sigma## and ##V##, right? How does that theorem define ##\Sigma##? If it defines it as a diagonal matrix with the singular values of ##A## (=eigenvalues of ##\sqrt{A^*A}##) on the diagonal, then I would recommend that you ignore the theorem until you have figured out what the problem statement is telling you about A (and about ##A^*A##).

By the way, the appropriate place for a question about a textbook-style problem about linear algebra is the calculus & beyond homework forum.
 
Sorry about the misplaced post. I wasn't working with the idea that ##\Sigma=I## and thus ##\sigma_j=1##, or vice-versa. I was working with the idea that ##AV## has to equal ##U##, and the only way that could occur is if ##\Sigma=I##. Even so, I accept your challenge of trying to dig deeper into the problem, and so far I have come up with the following:

##Av_1=u_1## means that ##u_1## is a linear combination of the columns of ##A##

but ##u_1## is part of a basis set, so by definition it cannot be written as a combination of the other basis vectors. Therefore, ##A## is a projection matrix. (Not sure about this, got to think about it a little more...just thought I would include my thoughts.)
 
rpthomps said:
##Av_1=u_1## means that ##u_1## is a linear combination of the columns of ##A##
It does, but I don't think this will help you find the property of A that I had in mind.

rpthomps said:
but ##u_1## is part of a basis set, so by definition it cannot be written as a combination of the other basis vectors. Therefore, ##A## is a projection matrix. (Not sure about this, got to think about it a little more...just thought I would include my thoughts.)
This is not correct. In fact, the only projection that takes an orthonormal basis to another is the identity map.

I'll give you another (small) hint: Start by writing down the formula that says that ##\{u_i\}## is orthonormal.
 
Fredrik said:
It does, but I don't think this will help you find the property of A that I had in mind.

This is not correct. In fact, the only projection that takes an orthonormal basis to another is the identity map.

I'll give you another (small) hint: Start by writing down the formula that says that ##\{u_i\}## is orthonormal.

Okay, here is another kick at the can. If I want a matrix ##A## that takes ##V## and transforms it into ##U##, such that ##AV=U##, where ##V## and ##U## are matrices whose columns are the orthonormal vectors ##v_1,\dots,v_n## and ##u_1,\dots,u_n## respectively, then I can isolate ##A## by taking the inverse of ##V##, so that ##A=UV^{-1}##. But because ##V## is orthogonal, ##V^{-1}=V^T##, so ##A=UV^T##. So if ##U## and ##V## are rotation matrices, then ##A## would be the combination of those rotation matrices. I checked it with some 2x2 matrices and it seemed to work. Thoughts?
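The 2x2 check described above can be reproduced numerically. Here ##U## and ##V## are taken to be rotation matrices (an illustrative choice of angles, not from the thread), and ##A=UV^T## comes out as the rotation by the difference of the two angles:

```python
import numpy as np

def rotation(theta):
    """2x2 rotation matrix; its columns are an orthonormal basis of R^2."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

U = rotation(0.7)   # columns u_1, u_2
V = rotation(-1.3)  # columns v_1, v_2

# V is orthogonal, so V^{-1} = V^T and A = U V^{-1} = U V^T.
A = U @ V.T

print(np.allclose(A @ V, U))                   # True: A sends each v_j to u_j
print(np.allclose(A, rotation(0.7 - (-1.3))))  # True: A rotates by the angle difference
```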
 
This is correct. (Not exactly the method I had in mind, but close enough. We can discuss my method later). Now what does ##A=UV^T## tell you about the singular values of ##A##? What is your book's definition of a singular value?
 
Fredrik said:
This is correct. (Not exactly the method I had in mind, but close enough. We can discuss my method later). Now what does ##A=UV^T## tell you about the singular values of ##A##? What is your book's definition of a singular value?

Well, my book and other readings from the internet seem to suggest that a singular value is similar to an eigenvalue. However, a singular value seems to be a scalar that relates one orthonormal basis vector to the other through ##A##, as in ##Av_j=\sigma_j u_j##. ##A=UV^T## seems to suggest the singular values are all 1. Question: are the singular values always 1 in an SVD? It seems to be the case, because the basis vectors in ##U## and ##V## are always orthonormal.
 
According to "Linear algebra done wrong" by Sergei Treil (which can be downloaded for free online), the singular values of ##A## are the eigenvalues of the operator ##|A|##, defined by ##|A|=\sqrt{A^*A}##, where ##A^*## is the adjoint of ##A##. Knowing this makes it very easy to find the singular values in your problem (where ##A## takes one orthonormal basis to another).

If this isn't how your book defines them, then it must have provided some other definition, or at least a way to calculate them.
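With Treil's definition the whole computation fits in a few lines: ##A^*A = VU^TUV^T = VV^T = I##, so ##|A| = \sqrt{A^*A} = I##, and every eigenvalue of ##|A|## is 1. The same computation can be sketched in NumPy (the random orthonormal bases are just for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
U, _ = np.linalg.qr(rng.standard_normal((n, n)))  # orthonormal basis u_1..u_n
V, _ = np.linalg.qr(rng.standard_normal((n, n)))  # orthonormal basis v_1..v_n
A = U @ V.T

# A^*A = V U^T U V^T = V V^T = I, because U and V are orthogonal.
AtA = A.T @ A
print(np.allclose(AtA, np.eye(n)))  # True

# The singular values are the eigenvalues of sqrt(A^*A), i.e. the
# square roots of the eigenvalues of the symmetric matrix A^*A.
singular_values = np.sqrt(np.linalg.eigvalsh(AtA))
print(np.round(singular_values, 6))  # [1. 1. 1.]
```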
 
