Why are all the singular values of A equal to 1?

  • Thread starter rpthomps
  • Start date
  • Tags
    Bases Svd
In summary, the goal of the conversation was to find the matrix A that transforms one orthonormal basis into another. Through the conversation, it was determined that A=UV^T, where U and V are orthogonal matrices. This led to the realization that the singular values of A must all be 1, since the columns of U and V are orthonormal. The definition of a singular value was also discussed, with the conclusion that it resembles an eigenvalue and can be found by calculating the eigenvalues of the operator |A|, defined by |A| = √(A^*A).
  • #1
rpthomps
Problem:

Suppose u1, ..., un and v1, ..., vn are orthonormal bases for R^n. Construct the matrix A that transforms each vj into uj, giving Av1 = u1, ..., Avn = un.

The answer key says A=UV^T since all σj=1. Why are all the σj equal to 1?
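A quick numerical check of the answer key's claim (my own illustrative construction, not from the thread): build two random orthonormal bases of R^n via QR, form A = UV^T, and confirm that every singular value of A is 1.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# QR of a random matrix gives an orthogonal Q, whose columns
# form an orthonormal basis of R^n.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))

A = U @ V.T

# A maps each v_j to u_j, since V^T V = I.
print(np.allclose(A @ V, U))

# And all of its singular values are 1 (up to floating-point error).
sigma = np.linalg.svd(A, compute_uv=False)
print(np.allclose(sigma, 1.0))
```

Both checks print `True` for any choice of the random seed, which is what the thread goes on to explain.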
 
  • #2
What does the fact that the two bases are orthonormal tell you about A? Can you use the answer to that somehow?
 
  • #3
I regrouped and thought about this...

if ##A=U\Sigma V^T##, then ##AV=(U\Sigma V^T)V = U\Sigma##.

If ##\Sigma=I##, then ##AV=U## and every ##\sigma_j=1##.

I think that is it. Thoughts? Much thanks.
 
  • #4
This is correct, but your argument looks kind of circular. It looks like you had been told that ##\Sigma=I## because all the ##\sigma_i## are 1, and wanted to know why the ##\sigma_i## are all 1. Now you're saying that they're all 1 because ##\Sigma=I##. You can use ##\sigma_i=1## to explain ##\Sigma=I##, or you can use ##\Sigma=I## to explain ##\sigma_i=1##, but you can't do both.

You're using a theorem that tells you that ##A=U\Sigma V^T##, and also defines ##U##, ##\Sigma## and ##V##, right? How does that theorem define ##\Sigma##? If it defines it as a diagonal matrix with the singular values of ##A## (=eigenvalues of ##\sqrt{A^*A}##) on the diagonal, then I would recommend that you ignore the theorem until you have figured out what the problem statement is telling you about A (and about ##A^*A##).

By the way, the appropriate place for a question about a textbook-style problem about linear algebra is the calculus & beyond homework forum.
 
  • #5
Sorry about the misplaced post. I wasn't working with the idea that Σ=I and thus σ=1 or vice-versa. I was working with the idea that AV has to equal U and the only way that could occur is if Σ=I. Even so, I accept your challenge of trying to dig deeper into the problem and so far, I have come up with the following:

Av1=u1 means that u1 is a linear combination of the columns of A

but u1 is part of a basis set, so by definition it cannot be made up of other bases. Therefore, A is a projection matrix. (Not sure about this, got to think about it a little more...just thought I would include my thoughts.)
 
  • #6
rpthomps said:
Av1=u1 means that u1 is a linear combination of the columns of A
It does, but I don't think this will help you find the property of A that I had in mind.

rpthomps said:
but u1 is part of a basis set, so by definition it cannot be made up of other bases. Therefore, A is a projection matrix. (Not sure about this, got to think about it a little more...just thought I would include my thoughts.)
This is not correct. In fact, the only projection that takes an orthonormal basis to another is the identity map.

I'll give you another (small) hint: Start by writing down the formula that says that ##\{u_i\}## is orthonormal.
 
  • #7
Fredrik said:
It does, but I don't think this will help you find the property of A that I had in mind.

This is not correct. In fact, the only projection that takes an orthonormal basis to another is the identity map.

I'll give you another (small) hint: Start by writing down the formula that says that ##\{u_i\}## is orthonormal.

Okay, here is another kick at the can. If I want a matrix A that takes V and transforms it into U, such that AV=U, where V and U are matrices whose columns are the orthonormal vectors v1, ..., vn and u1, ..., un respectively, then I can isolate A by taking the inverse of V, so that ##A=UV^{-1}##; but because V is orthogonal, ##V^{-1}=V^T##, so ##A=UV^T##. So, if U and V are rotation matrices, then A is the composition of those rotations. I checked it with some 2x2 matrices and it seemed to work. Thoughts?
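A sketch of the kind of 2x2 check described above (my own construction, not the poster's actual matrices): take two rotation matrices as the orthonormal bases, form A = UV^T, and confirm that AV = U and that A is itself a rotation by the difference of the angles.

```python
import numpy as np

def rotation(theta):
    """2x2 rotation matrix; its columns are an orthonormal basis of R^2."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

U = rotation(0.7)
V = rotation(-1.2)

# A = U V^{-1}, and V^{-1} = V^T because V is orthogonal.
A = U @ V.T

print(np.allclose(A @ V, U))                   # A takes the V basis to the U basis
print(np.allclose(A, rotation(0.7 - (-1.2))))  # A is the rotation by the angle difference
```

Both checks print `True`: composing a rotation with the inverse of another is again a rotation, which is why A sends one orthonormal basis to the other without any stretching.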
 
  • #8
This is correct. (Not exactly the method I had in mind, but close enough. We can discuss my method later). Now what does ##A=UV^T## tell you about the singular values of ##A##? What is your book's definition of a singular value?
 
  • #9
Fredrik said:
This is correct. (Not exactly the method I had in mind, but close enough. We can discuss my method later). Now what does ##A=UV^T## tell you about the singular values of ##A##? What is your book's definition of a singular value?

Well, my book and other readings from the internet seem to suggest that a singular value is similar to an eigenvalue. However, a singular value is a factor that scales one orthonormal basis vector into a multiple of another, as in ##Av_j=\sigma_j u_j##. ##A=UV^T## seems to suggest the singular values are all 1. Question: Are the singular values always 1 in an SVD? It seems to be the case because the columns of U and V are always orthonormal.
 
  • #10
According to "Linear algebra done wrong" by Sergei Treil (which can be downloaded for free online), the singular values of ##A## are the eigenvalues of the operator ##|A|##, defined by ##|A|=\sqrt{A^*A}##, where ##A^*## is the adjoint of ##A##. Knowing this makes it very easy to find the singular values in your problem (where ##A## takes one orthonormal basis to another).

If this isn't how your book defines them, then it must have provided some other definition, or at least a way to calculate them.
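Treil's definition can be checked numerically (a sketch, assuming real matrices so that ##A^*=A^T##): for A = UV^T with U and V orthogonal, ##A^TA = VU^TUV^T = VV^T = I##, so ##|A|=\sqrt{A^TA}=I## and all its eigenvalues are 1.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U @ V.T

# A^T A = V U^T U V^T = V V^T = I.
AtA = A.T @ A
print(np.allclose(AtA, np.eye(n)))

# The singular values are the eigenvalues of |A| = sqrt(A^T A),
# i.e. the square roots of the eigenvalues of A^T A.
singular_values = np.sqrt(np.linalg.eigvalsh(AtA))
print(np.allclose(singular_values, 1.0))
```

Both checks print `True`, which is the whole content of the problem: a map between orthonormal bases satisfies ##A^TA=I##, so its singular values cannot be anything but 1.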
 

What is SVD and how is it used in scientific research?

SVD stands for Singular Value Decomposition and it is a mathematical technique that decomposes a matrix into three components: a left singular matrix, a diagonal matrix of singular values, and a right singular matrix. SVD is commonly used in scientific research to reduce the dimensionality of data and to extract important features from large datasets.
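The three components can be seen directly in code (a generic example of my own, not tied to the thread's problem):

```python
import numpy as np

M = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# Left singular matrix, singular values, right singular matrix (transposed).
U, s, Vt = np.linalg.svd(M)

# The factors multiply back to M, with s on the diagonal of Sigma.
print(np.allclose(U @ np.diag(s) @ Vt, M))

# For this symmetric matrix the singular values are its eigenvalues, 4 and 2.
print(np.allclose(s, [4.0, 2.0]))
```

Both checks print `True`; note that `np.linalg.svd` returns the singular values in decreasing order.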

What is an orthonormal basis and why is it important in linear algebra?

An orthonormal basis is a set of vectors that are mutually perpendicular and have a length of 1 unit. It is important in linear algebra because it simplifies the process of solving equations and performing calculations. An orthonormal basis also allows for easy transformation of vectors and matrices.

What is the relationship between SVD and orthonormal bases?

SVD and orthonormal bases are closely related. In fact, SVD can be thought of as finding orthonormal bases for the domain and the range of a matrix: the columns of the left and right singular matrices are those basis vectors, and the singular values measure how much the matrix stretches space along each pair of corresponding directions.

How does SVD help in data compression and reconstruction?

SVD is useful in data compression because a large matrix can be approximated by keeping only its largest singular values and the corresponding columns of the singular matrices. This reduces the amount of space needed to store the data while losing little important information. Multiplying the truncated factors back together reconstructs an approximation of the original data, making SVD a useful tool in data compression and storage.
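A minimal sketch of this truncation idea (illustrative data of my own): keep only the k largest singular values to obtain a rank-k approximation. Here the matrix has rank 3 by construction, so k = 3 reconstructs it exactly.

```python
import numpy as np

rng = np.random.default_rng(2)

# An 8x10 matrix of rank at most 3, built as a product of thin factors.
M = rng.standard_normal((8, 3)) @ rng.standard_normal((3, 10))

U, s, Vt = np.linalg.svd(M, full_matrices=False)

k = 3
# Rank-k reconstruction: sum of the k largest sigma_i * u_i * v_i^T.
M_k = (U[:, :k] * s[:k]) @ Vt[:k, :]

print(np.allclose(M, M_k))  # exact here, since rank(M) <= 3
```

Storing the truncated factors takes 8·k + k + k·10 numbers instead of 8·10, which is where the compression comes from when k is small relative to the matrix dimensions.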

What are some applications of SVD and orthonormal bases in scientific fields?

SVD and orthonormal bases have a wide range of applications in scientific fields such as image and signal processing, data mining, and machine learning. They are also used in areas such as genetics, neuroscience, and bioinformatics to analyze and interpret large datasets. In addition, SVD and orthonormal bases are used in engineering for tasks such as noise reduction and system identification.
