Linear Algebra - Singular Value Decomposition Problem

In summary, the conversation discusses finding the SVD of a given matrix, the use of orthonormal bases, and the singular values. The main question is why only the positive square root is taken when finding the singular values; the answer is that singular values are defined as the nonnegative square roots of the eigenvalues of ##A^*A##, and that using a negative root would still reproduce the original matrix, provided the sign of the corresponding singular vector is flipped as well.
  • #1
YoshiMoshi

Homework Statement



Find the SVD of

[attached image: equation 1.PNG]


Homework Equations

The Attempt at a Solution


I'm stuck
[attached image: equation 2.PNG]

[attached image: equation 3.PNG]

[attached image: equation 4.PNG]


My question is: why does the solution originally find u_2 = [1/5, -2/5]' but then say u_2 = [1/sqrt(5), -2/sqrt(5)]'? I don't see what math was done in the solution to change the denominator from 5 to sqrt(5).

General question: when finding a singular value, if (sigma_1)^2 = constant, why do we only consider the positive root, sigma_1 = sqrt(constant), when the equation is actually solved by sigma_1 = +/- sqrt(constant)?

Thanks for any help you can provide me.
 
  • #2
YoshiMoshi said:
My question is: why does the solution originally find u_2 = [1/5, -2/5]' but then say u_2 = [1/sqrt(5), -2/sqrt(5)]'? I don't see what math was done in the solution to change the denominator from 5 to sqrt(5).
You're looking for an orthonormal basis, so ##u_2## has to have length 1.
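A quick numerical check of that normalization (a minimal NumPy sketch; the vector is the direction from your attempt):

```python
import numpy as np

# Direction found for u_2 in the attempt (any nonzero scalar multiple works).
v = np.array([1.0, -2.0])

# Normalizing means dividing by the Euclidean length, here sqrt(1 + 4) = sqrt(5).
u2 = v / np.linalg.norm(v)

print(u2)                  # [ 0.4472136  -0.89442719] = [1/sqrt(5), -2/sqrt(5)]
print(np.linalg.norm(u2))  # 1.0 -- unit length, as an orthonormal basis requires
```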

General question: when finding a singular value, if (sigma_1)^2 = constant, why do we only consider the positive root, sigma_1 = sqrt(constant), when the equation is actually solved by sigma_1 = +/- sqrt(constant)?

Thanks for any help you can provide me.
What's the definition you're using of a singular value?
 
  • #3
Sorry, I don't understand. It has to be orthonormal to u_1, so the dot product of u_1 and u_2 has to be zero, and that's where it comes from?

I'm asking about when it finds sigma_1 and sigma_2: why don't we consider the negative square root in our calculation when we find the SVD? When we form the sigma matrix, it's the singular values in a diagonal matrix, so I just don't understand why we don't consider the negative root.
 
  • #4
YoshiMoshi said:
Sorry, I don't understand. It has to be orthonormal to u_1, so the dot product of u_1 and u_2 has to be zero, and that's where it comes from?
No.

I'm asking about when it finds sigma_1 and sigma_2: why don't we consider the negative square root in our calculation when we find the SVD? When we form the sigma matrix, it's the singular values in a diagonal matrix, so I just don't understand why we don't consider the negative root.
Look up the definition of a singular value.
 
  • #5
YoshiMoshi said:
why don't we consider the negative square root in our calculation when we find the SVD?
The singular values of a matrix ##A## are defined to be the nonnegative square roots of the eigenvalues of ##A^*A##.
Nevertheless, if you choose to use negative ones, the product will still give the same original matrix ##A##, provided you also flip the sign of the corresponding column of ##U## (or of ##V##).
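To see this concretely, here is a small NumPy sketch (the matrix is an arbitrary example, not the one from the homework): flipping the sign of a singular value together with the corresponding column of ##U## leaves the product unchanged.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

U, s, Vt = np.linalg.svd(A)   # NumPy returns the nonnegative singular values

# Reconstruction with the standard (nonnegative) singular values.
assert np.allclose(U @ np.diag(s) @ Vt, A)

# Flip the sign of sigma_1 AND of the first column of U: the product is still A.
U2 = U.copy();  U2[:, 0] *= -1
s2 = s.copy();  s2[0] *= -1
assert np.allclose(U2 @ np.diag(s2) @ Vt, A)
```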
 

1. What is the purpose of Singular Value Decomposition (SVD) in Linear Algebra?

SVD is a useful tool in linear algebra for decomposing a matrix ##A## into a product ##U\Sigma V^T## of three simpler matrices: ##U## and ##V## have orthonormal columns, and ##\Sigma## is diagonal with the nonnegative singular values. The factorization is used for solving linear and least-squares problems, data compression, and other applications, and it is particularly useful for finding the best low-rank approximation of a matrix, i.e. for identifying the most important directions or patterns in the data.
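A minimal NumPy sketch of the decomposition (the matrix here is an arbitrary rectangular example):

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 0.0]])    # 2 x 3, i.e. non-square

U, s, Vt = np.linalg.svd(A, full_matrices=False)

print(U.shape, s.shape, Vt.shape)  # (2, 2) (2,) (2, 3)
print(s)                           # singular values, sorted largest first
assert np.allclose(U @ np.diag(s) @ Vt, A)   # A = U * Sigma * V^T
```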

2. How is SVD different from other matrix factorization techniques?

SVD can be applied to any matrix, including rectangular (non-square) and complex ones, which many other factorizations, such as LU or Cholesky, cannot handle in general. The singular values it produces are uniquely determined, and the factors ##U## and ##V## are always orthogonal (unitary in the complex case), which makes the decomposition numerically well behaved.
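For instance, the same NumPy routine handles a complex matrix, and the returned factors are unitary (a sketch with an arbitrary complex example):

```python
import numpy as np

A = np.array([[1 + 1j, 2.0],
              [0.0,    3 - 2j]])

U, s, Vh = np.linalg.svd(A)

print(s)                                         # singular values are real and nonnegative
assert np.allclose(U.conj().T @ U, np.eye(2))    # U is unitary
assert np.allclose(Vh @ Vh.conj().T, np.eye(2))  # V is unitary
assert np.allclose(U @ np.diag(s) @ Vh, A)       # A = U * Sigma * V^H
```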

3. What is the significance of the singular values in SVD?

The singular values in SVD measure how strongly the matrix acts along the corresponding singular vectors. They determine the rank of a matrix (the number of nonzero singular values) and identify the most important features or patterns in the data: the larger a singular value, the more of the matrix's action is captured by its pair of singular vectors.
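A small sketch of the rank connection, using an arbitrary rank-deficient matrix:

```python
import numpy as np

# Third row is the sum of the first two, so the rank is 2, not 3.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])

s = np.linalg.svd(A, compute_uv=False)
print(s)                          # one singular value is (numerically) zero

rank = np.sum(s > 1e-10 * s[0])   # count singular values above a tolerance
print(rank)                       # 2, matching np.linalg.matrix_rank(A)
```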

4. How is SVD used in data compression?

SVD can be used for data compression by reducing the dimensionality of a dataset while preserving the important information. This is done by keeping only the largest singular values and the corresponding columns of ##U## and ##V##. The resulting low-rank approximation can be stored or analyzed with far fewer numbers, without losing too much information.
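A sketch of truncated-SVD compression on synthetic data (the matrix and the cutoff k are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# A 100 x 50 "dataset" that is approximately rank 5 plus a little noise.
A = (rng.standard_normal((100, 5)) @ rng.standard_normal((5, 50))
     + 0.01 * rng.standard_normal((100, 50)))

U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 5                                       # keep only the k largest singular values
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Storage drops from 100*50 values to k*(100 + 50 + 1), with small relative error.
print(np.linalg.norm(A - A_k) / np.linalg.norm(A))
```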

5. Can SVD be used for solving systems of linear equations?

Yes. Writing ##A = U\Sigma V^T## turns ##Ax = b## into ##x = V\Sigma^{+}U^T b##, where ##\Sigma^{+}## inverts the nonzero singular values; this gives the exact solution when ##A## is invertible and the least-squares solution otherwise. The special properties of the factors make the computation numerically robust, but SVD is not always the most efficient method for solving systems of equations, so it is often used alongside other techniques.
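A minimal sketch of solving a linear system through the SVD (the matrix and right-hand side are arbitrary examples):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

U, s, Vt = np.linalg.svd(A)

# x = V * Sigma^{-1} * U^T * b.  If A were singular or rectangular, inverting only
# the nonzero singular values would give the least-squares solution instead.
x = Vt.T @ ((U.T @ b) / s)
print(x)                         # same answer as np.linalg.solve(A, b): [2. 3.]
assert np.allclose(A @ x, b)
```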
