Derivative of SVD V and U matrices

In summary, the conversation discusses the use of a rotation matrix to describe the relationship between two sets of atomic coordinates. The rotation matrix is defined using the singular value decomposition of matrix K, which is formed by combining the coordinate vectors of each atom. The conversation also mentions the need for the derivative of each element of the rotation matrix in order to calculate analytic forces. A paper on this topic is provided as a helpful resource.
  • #1
saulg
I find a rotation matrix which best describes how to get from one set of atomic coordinates (molecular geometry) to another by just a pure rotation.

The rotation matrix R is defined as

[tex]R=V \left( \begin{array}{ccc}
1 & & \\
& 1 & \\
& & \det\left(VU^T\right) \end{array} \right)U^T
[/tex]

where V and U are from the SVD of matrix K:

[tex]K=V\Lambda U^T[/tex]

K is formed by summing over i the outer products of the coordinate vectors of atom i in the first and second geometries (I follow the method described at http://homepages.inf.ed.ac.uk/rbf/CVonline/LOCAL_COPIES/MARBLE/high/pose/least.htm ).

That is, K depends on the atomic coordinates, and so do U and V.
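For concreteness, the construction above can be sketched in NumPy (a minimal sketch, not the poster's actual code; `P` and `Q` are assumed to hold the two geometries' atomic coordinates as rows):

```python
import numpy as np

def rotation_matrix(P, Q):
    """Best pure rotation taking geometry P onto geometry Q.

    P, Q: (n_atoms, 3) arrays of atomic coordinates (assumed centered).
    """
    # K = sum_i q_i p_i^T, the sum of outer products over atoms
    K = Q.T @ P
    # SVD: K = V Lambda U^T
    V, _, Ut = np.linalg.svd(K)
    # det(V U^T) is +1 or -1; the diagonal factor excludes reflections
    d = np.sign(np.linalg.det(V @ Ut))
    D = np.diag([1.0, 1.0, d])
    return V @ D @ Ut
```

Applied to a geometry and a purely rotated copy of it, this recovers the rotation exactly; for distorted geometries it gives the least-squares best rotation.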

I require the derivative of each element of R with respect to the atomic coordinates. (The rotation matrix is used in an energy approximation, and I need analytic forces.)


Any answers or hints much appreciated.
 

1. What is the purpose of finding the derivatives of the SVD V and U matrices?

The derivatives of the SVD V and U matrices are used to understand how small changes in the input data affect the output of the singular value decomposition (SVD) algorithm. This can help in optimizing the SVD algorithm and improving its performance.

2. How are the derivatives of the SVD V and U matrices calculated?

The derivatives of the SVD V and U matrices are calculated using the chain rule and the derivatives of the individual components of the SVD decomposition, such as the singular values and the left and right singular vectors.
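A quick way to validate any analytic expression for these derivatives is a central finite difference on the rotation matrix itself. Below is a minimal NumPy sketch; the `rotation` helper is a hypothetical stand-in for the Kabsch-style construction discussed in the thread, not code from it:

```python
import numpy as np

def rotation(P, Q):
    # Kabsch-style rotation: K = sum_i q_i p_i^T, K = V Lambda U^T
    K = Q.T @ P
    V, _, Ut = np.linalg.svd(K)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(V @ Ut))])
    return V @ D @ Ut

def dR_numeric(P, Q, atom, axis, h=1e-6):
    """Central difference of R with respect to one coordinate of P."""
    Pp, Pm = P.copy(), P.copy()
    Pp[atom, axis] += h
    Pm[atom, axis] -= h
    return (rotation(Pp, Q) - rotation(Pm, Q)) / (2.0 * h)
```

Because R remains orthogonal as the coordinates vary, R^T dR must be antisymmetric; this property is a useful sanity check on both numeric and analytic derivatives.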

3. Can the derivatives of the SVD V and U matrices be negative?

Yes, individual entries of the derivatives of the V and U matrices can be negative. A negative entry simply means that the corresponding element of V or U decreases as that input entry increases. Both signs must be retained when assembling the derivative of a quantity built from the SVD, since its entries mix contributions from U and V.

4. How can the derivatives of the SVD V and U matrices be used in machine learning?

The derivatives of the SVD V and U matrices can be used in machine learning tasks such as dimensionality reduction and data compression. They can also be used in optimization algorithms for training machine learning models.
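As a concrete example of SVD-based dimensionality reduction, a rank-k approximation keeps only the k largest singular values (a generic sketch, assuming `X` is an arbitrary data matrix):

```python
import numpy as np

def truncated_svd(X, k):
    """Rank-k approximation of X via its SVD (a simple form of
    dimensionality reduction / data compression)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
```

By the Eckart-Young theorem this is the best rank-k approximation of X in the spectral and Frobenius norms.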

5. Are there any limitations or assumptions when calculating the derivatives of the SVD V and U matrices?

Yes. The SVD is differentiable only where the singular values are distinct (and, for some formulas, nonzero); at a repeated singular value the singular vectors are not uniquely defined and the derivatives do not exist. In addition, U and V are only determined up to simultaneous sign changes of paired columns, so a consistent sign convention must be fixed before differentiating.
