Stuck on singular value decomposition problem

In summary, the process for finding a singular value decomposition of A requires finding the eigenvalues and eigenvectors of A^T A, using them to determine the singular values, and constructing the matrices U and V. The third column u_3 should be orthogonal to both u_1 and u_2, and can be found using the Gram-Schmidt method. The columns of U and V should each form an orthonormal basis, making both matrices orthogonal.
  • #1
SpiffyEh

Homework Statement



Find a singular value decomposition of A.
A^T=
[7 0 5
1 0 5]

Homework Equations



A = U[tex]\Sigma[/tex]V^T

The Attempt at a Solution


I started by doing A^T*A =
[ 74 32
32 26]

Then I went and found the two eigenvalues lambda_1 = 90 and lambda_2 = 10 and the eigenvectors v_1 = [2 1]^T and v_2 = [-1 2]^T, which normalize to (1/sqrt(5))[2 1]^T and (1/sqrt(5))[-1 2]^T.
So, I have V and V^T.

From this the singular values are sigma_1 = sqrt(90) and sigma_2 = sqrt(10)
So, [tex]\Sigma[/tex] in this decomposition would be
[ sqrt(90) 0
0 sqrt(10)
0 0]

Now to figure out U.
u_1 = (1/sigma_1) A v_1 (using the normalized v_1), which is
= [1/sqrt(2) 0 1/sqrt(2)]^T
and I did the same thing for u_2 to get
[-1/sqrt(2) 0 1/sqrt(2)]^T

Now, this is where I get stuck. I know I need U to be 3x3 for the matrix multiplication to work out. My book says to find an orthogonal vector and use the Gram-Schmidt method to get u_3. Do I need it to be orthogonal to u_1, to u_2, or to both? Also, I can't figure out the Gram-Schmidt step. If someone could please clarify this for me, that would really help. I'm so close (if what I already did is correct), but I can't figure it out.
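(Editor's note: a quick numerical cross-check of the attempt above, using NumPy rather than hand computation. The matrix A and the eigenvalue route are from the post; everything else is just for verification.)

```python
import numpy as np

# A is the transpose of the given A^T
A = np.array([[7.0, 1.0],
              [0.0, 0.0],
              [5.0, 5.0]])

# Eigen-decompose A^T A: eigenvalues are the squared singular values,
# eigenvector columns are the columns of V
evals, V = np.linalg.eigh(A.T @ A)
print(np.sqrt(evals))   # ascending: sqrt(10), sqrt(90)

# Library SVD for comparison; note U here is 3x3, so a third column
# u_3 (orthogonal to u_1 and u_2) is indeed required
U, s, Vt = np.linalg.svd(A)
print(s)                # descending: sqrt(90), sqrt(10)
```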
 
  • #3
Thank you, I'll try that.
 

1. What is singular value decomposition (SVD)?

Singular value decomposition is a matrix decomposition method used in linear algebra. It decomposes a matrix into three matrices, U, Σ, and V, where U and V are orthogonal matrices and Σ is a diagonal matrix. SVD is commonly used in data analysis and machine learning for dimensionality reduction, data compression, and feature extraction.
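As a minimal sketch of the three-factor form described above (the example matrix is made up for illustration):

```python
import numpy as np

M = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# Compact SVD: U is 3x2 with orthonormal columns, s holds the
# singular values, and Vt is V^T
U, s, Vt = np.linalg.svd(M, full_matrices=False)

# The three factors multiply back to the original matrix
print(np.allclose(U @ np.diag(s) @ Vt, M))   # True
```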

2. Why is SVD important?

SVD is important because it allows us to break down a complex matrix into simpler components, making it easier to analyze and manipulate. It is also a useful tool for reducing the dimensionality of data while preserving important information, which can improve the performance of machine learning algorithms.

3. What is the relationship between SVD and PCA?

SVD and principal component analysis (PCA) are closely related. Applying SVD to a centered data matrix yields its principal components: the right singular vectors are the eigenvectors of the covariance matrix, and PCA uses these components to reduce the dimensionality of the data. The singular values also give the variance explained by each principal component.
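The PCA-via-SVD connection can be sketched as follows (random data stands in for a real dataset; the variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))        # 100 samples, 3 features

Xc = X - X.mean(axis=0)              # PCA requires centered data
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Rows of Vt are the principal directions; s**2 / (n - 1) gives the
# variance explained by each component (they sum to the total variance)
explained_var = s**2 / (len(X) - 1)

# Dimensionality reduction: project onto the first 2 components
scores = Xc @ Vt[:2].T
```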

4. How is SVD calculated?

The calculation of SVD involves a series of matrix operations and can be done using various algorithms, such as the Golub-Reinsch algorithm or the Jacobi algorithm. Conceptually, the process involves finding the eigenvalues and eigenvectors of A^T A, which are then used to construct the three component matrices U, Σ, and V.
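The eigenvalue route just described can be sketched by hand for a small matrix (a teaching illustration only, not a numerically robust algorithm; production routines like Golub-Reinsch work on A directly). It follows the same steps as the thread above, using the same A:

```python
import numpy as np

A = np.array([[7.0, 1.0],
              [0.0, 0.0],
              [5.0, 5.0]])

# Step 1: eigen-decompose A^T A; eigh returns orthonormal eigenvectors
evals, V = np.linalg.eigh(A.T @ A)
order = np.argsort(evals)[::-1]          # sort descending by convention
evals, V = evals[order], V[:, order]

# Step 2: singular values are the square roots of the eigenvalues
s = np.sqrt(evals)

# Step 3: u_i = (1/sigma_i) A v_i for each nonzero sigma_i
U = (A @ V) / s

# The compact factors reconstruct A
print(np.allclose(U @ np.diag(s) @ V.T, A))   # True
```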

5. What are the applications of SVD?

SVD has numerous applications in various fields, including data analysis, machine learning, signal processing, image compression, and recommendation systems. It is used for dimensionality reduction, feature extraction, noise reduction, and data visualization. SVD is also used in collaborative filtering algorithms for making personalized recommendations.
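The compression use mentioned above amounts to truncating the SVD to its largest singular values. A sketch, with random data standing in for a grayscale image:

```python
import numpy as np

rng = np.random.default_rng(1)
img = rng.random((64, 64))           # stand-in for a 64x64 grayscale image

U, s, Vt = np.linalg.svd(img, full_matrices=False)

k = 8                                # keep only the 8 largest singular values
approx = U[:, :k] * s[:k] @ Vt[:k]   # best rank-k approximation (Eckart-Young)

# Storage drops from 64*64 values to k*(64 + 64 + 1)
err = np.linalg.norm(img - approx) / np.linalg.norm(img)
```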
