Taylor expansion for matrix logarithm

Backpacker
A paper I'm reading states that, for positive hermitian matrices A and B, the Taylor expansion of \log(A+tB) at t=0 is

$$\log(A+tB)=\log(A) + t\int_0^\infty \frac{1}{A+zI}\,B\,\frac{1}{A+zI}\, dz + \mathcal{O}(t^2).$$

However, there is no source or proof given, and I cannot seem to find a derivation of this identity anywhere! Any help would be appreciated. Thanks.
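For what it's worth, here is a quick numerical sanity check (a sketch with randomly generated positive definite test matrices; note the first-order term must be linear in ##B##, so the integrand should read ##(A+zI)^{-1}B(A+zI)^{-1}##). It compares the integral against a finite-difference derivative of ##\log(A+tB)## at ##t=0##:

```python
import numpy as np
from scipy.linalg import logm
from scipy.integrate import quad_vec

rng = np.random.default_rng(0)

def rand_pd(n):
    # Random symmetric positive definite test matrix (hypothetical data).
    X = rng.standard_normal((n, n))
    return X @ X.T + n * np.eye(n)

n = 4
A, B = rand_pd(n), rand_pd(n)

def integrand(z):
    # (A + zI)^{-1} B (A + zI)^{-1}, the claimed first-order coefficient.
    R = np.linalg.inv(A + z * np.eye(n))
    return R @ B @ R

# Integrate elementwise over z in [0, inf).
D, _ = quad_vec(integrand, 0.0, np.inf)

# Finite-difference derivative of logm(A + tB) at t = 0.
t = 1e-6
FD = (logm(A + t * B) - logm(A)) / t

print(np.max(np.abs(D - FD)))  # should be small (finite-difference error ~ t)
```

If the ##A##'s and ##B##'s in the integrand are swapped, the check fails badly, which is one way to tell the two variants apart.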
 
Backpacker said:
A paper I'm reading states that, for positive hermitian matrices A and B, the Taylor expansion of \log(A+tB) at t=0 is

$$\log(A+tB)=\log(A) + t\int_0^\infty \frac{1}{A+zI}\,B\,\frac{1}{A+zI}\, dz + \mathcal{O}(t^2).$$

However, there is no source or proof given, and I cannot seem to find a derivation of this identity anywhere! Any help would be appreciated. Thanks.

Welcome to PF, Backpacker! :smile:

I don't recognize your formula, but:

$$\log(A+tB)=\log(A(I+tA^{-1}B))= \log A + \log(I+tA^{-1}B) = \log A + tA^{-1}B + \mathcal{O}(t^2)$$
 
I like Serena said:
$$\log(A(I+tA^{-1}B))= \log A + \log(I+tA^{-1}B) $$
This doesn't seem quite right, unless ## A ## and ## B ## commute.
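Right, and a sketch of where the integral comes from in the non-commuting case (assuming the first-order term is ##\int_0^\infty (A+zI)^{-1}B(A+zI)^{-1}\,dz##, i.e. with ##B## between resolvents of ##A##): write ##A = U \operatorname{diag}(a_1,\dots,a_n) U^*## and ##\tilde B = U^* B U##. Then entrywise in that basis
$$\left[\int_0^\infty \frac{1}{A+zI}\,B\,\frac{1}{A+zI}\, dz\right]_{ij} = \tilde B_{ij}\int_0^\infty \frac{dz}{(a_i+z)(a_j+z)} = \tilde B_{ij}\,\frac{\log a_i - \log a_j}{a_i - a_j},$$
where the quotient is read as ##1/a_i## when ##a_i = a_j##. These divided differences are exactly the Fréchet derivative of the matrix logarithm at ##A## applied to ##B##; only when ##A## and ##B## commute (so ##\tilde B## is diagonal in the same basis) does this collapse to ##A^{-1}B##, matching the formula above.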
 
There are probably loads of proofs of this online, but I do not want to cheat. Here is my attempt: Convexity says that $$f(\lambda a + (1-\lambda)b) \leq \lambda f(a) + (1-\lambda) f(b)$$ $$f(b + \lambda(a-b)) \leq f(b) + \lambda (f(a) - f(b))$$ We know from the mean value theorem that there exists a ##c \in (b,a)## such that $$\frac{f(a) - f(b)}{a-b} = f'(c).$$ Hence $$f(b + \lambda(a-b)) \leq f(b) + \lambda (a - b) f'(c)$$ $$\frac{f(b + \lambda(a-b)) - f(b)}{\lambda(a-b)}...
