Derivative of the Log Determinant of a Matrix w.r.t. a Parameter

In summary, the thread concerns the identity for the derivative of the log of the determinant of a symmetric matrix. The original poster states the theorem and works toward a proof via the eigendecomposition, but gets stuck on a leftover second term. A reply confirms the theorem is true (symmetry is not even required, only invertibility) and shows it follows from the chain rule together with the partial-derivative formula for the log determinant of a matrix.
  • #1
CuppoJava
Hi,
I'm trying to see why the following theorem is true. It concerns the derivative of the log of the determinant of a symmetric matrix.

Here's the theorem as stated:

For a symmetric matrix A:
[tex]\frac{d}{dx} ln |A| = Tr[A^{-1} \frac{dA}{dx}][/tex]
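As a quick numerical sanity check of the identity, here is a sketch in NumPy. The one-parameter symmetric family A(x) = B + xC and the particular matrices B, C are purely illustrative; any smooth symmetric family would do:

```python
import numpy as np

# Fixed symmetric matrices defining an illustrative one-parameter family
# A(x) = B + x*C; the exact derivative of A(x) is then dA/dx = C.
B = np.array([[4.0, 1.0, 0.0],
              [1.0, 5.0, 2.0],
              [0.0, 2.0, 6.0]])
C = np.array([[1.0, 0.0, 1.0],
              [0.0, 2.0, 0.0],
              [1.0, 0.0, 3.0]])

def A(x):
    return B + x * C

def check_trace_identity(x=0.7, h=1e-6):
    """Compare d/dx ln|A(x)| (central finite difference) with Tr[A^{-1} dA/dx]."""
    dA_dx = C
    # slogdet returns (sign, ln|det|); we only need the log-magnitude part.
    lhs = (np.linalg.slogdet(A(x + h))[1] - np.linalg.slogdet(A(x - h))[1]) / (2 * h)
    # Tr[A^{-1} dA/dx], computed via a linear solve instead of forming A^{-1}.
    rhs = np.trace(np.linalg.solve(A(x), dA_dx))
    return lhs, rhs
```

The two numbers agree to finite-difference accuracy.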

Here's what I have so far, I'm almost at the answer, except I can't get rid of the second term at the end:

[tex]A = \sum_{i} \lambda_{i} u_{i} u_{i}^{T}[/tex]
[tex]A^{-1} = \sum_{i} \frac{1}{\lambda_{i}} u_{i} u_{i}^{T}[/tex]

So
[tex]A^{-1} \frac{dA}{dx} = \sum_{i} \frac{1}{\lambda_{i}} u_{i} u_{i}^{T} \frac{d}{dx}(\sum_{j}\lambda_{j} u_{j} u_{j}^{T})
=\sum_{i}\sum_{j}\frac{1}{\lambda_{i}}\frac{d\lambda_{j}}{dx}u_{i} u_{i}^{T}u_{j} u_{j}^{T} + \sum_{i}\sum_{j}\frac{\lambda_{j}}{\lambda_{i}}u_{i} u_{i}^{T}\frac{d}{dx}u_{j} u_{j}^{T}
=\sum_{i}\frac{1}{\lambda_{i}}\frac{d\lambda_{i}}{dx}u_{i} u_{i}^{T} + \sum_{i}\sum_{j}\frac{\lambda_{j}}{\lambda_{i}}u_{i} u_{i}^{T}\frac{d}{dx}u_{j} u_{j}^{T}[/tex]

And this would be just perfect if the second term was equal to zero. But I can't see how that could be made to happen.
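(One way to finish this eigenvector route: the second term need not vanish as a matrix, but its trace does, and only the trace is needed for the theorem. Since each [tex]u_{j}[/tex] is a unit vector, differentiating [tex]u_{j}^{T}u_{j} = 1[/tex] gives [tex]u_{j}^{T}\frac{du_{j}}{dx} = 0[/tex], and using [tex]u_{i}^{T}u_{j} = \delta_{ij}[/tex],

[tex]Tr\left[\sum_{i,j}\frac{\lambda_{j}}{\lambda_{i}} u_{i} u_{i}^{T}\frac{d}{dx}(u_{j} u_{j}^{T})\right] = \sum_{i,j}\frac{\lambda_{j}}{\lambda_{i}} u_{i}^{T}\left(\frac{du_{j}}{dx}u_{j}^{T} + u_{j}\frac{du_{j}^{T}}{dx}\right)u_{i} = \sum_{i}\frac{d}{dx}\left(u_{i}^{T}u_{i}\right) = 0[/tex]

so taking the trace of the first term alone gives the claimed identity.)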

Thanks a lot for your help
-Patrick
 
  • #2
This theorem is indeed true, and doesn't even need A to be symmetric, only invertible.

Using :
[tex] \frac{\partial}{\partial x} \ln \det A = \sum_{i,j} \frac{\partial a_{ij}}{\partial x} \frac{\partial}{\partial a_{ij}} \ln \det A[/tex]

with :
[tex]\frac{\partial }{\partial a_{ij}} \ln \det A = (A^{-1})_{ji}[/tex]

you get :
[tex] \frac{\partial}{\partial x} \ln \det A = Tr\left(A^{-1}\frac{\partial A}{\partial x}\right) = Tr\left(\frac{\partial A}{\partial x}A^{-1}\right)[/tex]
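(A quick numerical check of the entrywise formula, as a NumPy sketch; the invertible, non-symmetric matrix below is arbitrary and purely illustrative:)

```python
import numpy as np

def jacobi_entry_check(i=0, j=1, h=1e-6):
    """Verify d(ln det A)/d a_ij = (A^{-1})_{ji} for one entry by finite differences."""
    # Arbitrary invertible, non-symmetric matrix (illustrative only).
    A = np.array([[5.0, 1.0, 0.5],
                  [0.0, 4.0, 1.0],
                  [2.0, 0.5, 6.0]])
    E = np.zeros_like(A)
    E[i, j] = 1.0  # perturb only the single entry a_ij
    fd = (np.linalg.slogdet(A + h * E)[1] - np.linalg.slogdet(A - h * E)[1]) / (2 * h)
    exact = np.linalg.inv(A)[j, i]  # note the transposed indices (j, i)
    return fd, exact
```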

I hope that will help...

Canag
 

1. What is the purpose of calculating the derivative of the log determinant of a matrix with respect to a parameter?

The derivative of the log determinant of a matrix with respect to a parameter is widely used in machine learning and statistics to optimize model parameters. It supplies the gradient terms needed to find the parameter values that maximize the likelihood of the data.
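As a sketch of that use: for a zero-mean Gaussian with a parameterized covariance, the log-likelihood contains a -½ ln|Σ(θ)| term, and the trace identity gives its gradient in closed form. The one-parameter family Σ(θ) = I + θS below is a hypothetical example:

```python
import numpy as np

def neg_half_logdet_grad(theta=0.5, h=1e-6):
    """Gradient of the -0.5*ln|Sigma| term of a Gaussian log-likelihood,
    for an illustrative one-parameter covariance Sigma(theta) = I + theta*S."""
    S = np.array([[2.0, 1.0],
                  [1.0, 2.0]])  # fixed symmetric positive semidefinite matrix
    Sigma = lambda t: np.eye(2) + t * S
    dSigma_dtheta = S
    # Trace identity: d/dtheta (-0.5 * ln|Sigma|) = -0.5 * Tr(Sigma^{-1} dSigma/dtheta)
    grad = -0.5 * np.trace(np.linalg.solve(Sigma(theta), dSigma_dtheta))
    # Finite-difference check of the same quantity:
    fd = -0.5 * (np.linalg.slogdet(Sigma(theta + h))[1]
                 - np.linalg.slogdet(Sigma(theta - h))[1]) / (2 * h)
    return grad, fd
```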

2. How is the derivative of the log determinant of a matrix with respect to a parameter calculated?

It can be derived using the chain rule and the rules of matrix calculus, as in the identity above. Numerically, it can be computed with software packages such as MATLAB or Python libraries such as NumPy and SciPy.
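A minimal NumPy helper for that computation might look as follows. The design choice worth noting: solving A X = dA/dx and taking the trace is cheaper and better conditioned than explicitly forming A⁻¹:

```python
import numpy as np

def d_logdet(A, dA_dx):
    """d/dx ln|det A| = Tr(A^{-1} dA/dx), computed by solving A X = dA/dx
    rather than forming the explicit inverse."""
    return np.trace(np.linalg.solve(A, dA_dx))
```

For example, with A = 2I (2x2) and dA/dx = I this returns Tr(0.5·I) = 1.0.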

3. What is the significance of the log determinant of a matrix in machine learning?

The log determinant appears directly in Gaussian log-likelihoods and is also used as a regularization or barrier term, for example in sparse inverse covariance estimation. In that role it discourages ill-conditioned or degenerate estimates and can improve the generalization ability of the model.

4. Can the derivative of the log determinant of a matrix with respect to a parameter be negative?

Yes. A negative derivative means that increasing the parameter decreases the log determinant of the matrix, so the sign of the derivative must be taken into account when using it for optimization.

5. Are there any practical applications of the derivative of the log determinant of a matrix with respect to a parameter?

Yes. It is commonly used in fields such as machine learning, signal processing, and physics for parameter estimation, model selection, and dimensionality reduction.
