How to Prove the Logarithm Property for Matrices with Convergent Power Series?

  • Context: Undergrad 
  • Thread starter: Euge
  • Tags: 2016
SUMMARY

The logarithm of an $n \times n$ matrix $A$ is defined by the power series $$\sum_{k = 1}^\infty \frac{(-1)^{k-1}(A - I)^k}{k},$$ which converges when $\|A - I\| < 1$. The discussion confirms that for all $n \times n$ matrices $A$ and $B$ satisfying $\|A - I\| < 1$, $\|B - I\| < 1$, and $\|AB - I\| < 1$, together with the commutativity condition $AB = BA$, the identity $$\log(AB) = \log A + \log B$$ holds. This extends the familiar scalar law $\log(xy) = \log x + \log y$ to matrices whose logarithms are given by this convergent power series.

PREREQUISITES
  • Understanding of matrix norms, specifically the standard matrix norm.
  • Familiarity with power series and their convergence criteria.
  • Knowledge of matrix logarithms and their properties.
  • Basic concepts of linear algebra, including matrix multiplication and commutativity.
NEXT STEPS
  • Study the convergence of power series in the context of matrices.
  • Explore the properties of matrix logarithms in more depth.
  • Learn about the implications of matrix commutativity on logarithmic properties.
  • Investigate applications of matrix logarithms in systems of linear equations and differential equations.
USEFUL FOR

Mathematicians, students studying linear algebra, and researchers interested in advanced matrix theory and its applications in various fields such as control theory and quantum mechanics.

Euge
Here is this week's POTW:

-----
Define the logarithm of an $n\times n$ matrix $A$ by the power series

$$\sum_{k = 1}^\infty \frac{(-1)^{k-1}(A - I)^k}{k}$$

which converges for $\|A - I\| < 1$ (the standard matrix norm is being used here). Prove that for all $n\times n$ matrices $A$ and $B$ with $\|A - I\| < 1$, $\|B - I\| < 1$, $\|AB - I\| < 1$, and $AB = BA$,

$$\log(AB) = \log A + \log B$$

-----
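Before attempting a proof, it can help to sanity-check the statement numerically. The following is a minimal sketch (not part of the original thread) that truncates the defining power series and tests the identity on a pair of commuting $2\times 2$ matrices close to the identity. It assumes NumPy; the helper `mat_log`, the number of terms, and the specific matrices are illustrative choices, not anything prescribed by the problem.

```python
import numpy as np

def mat_log(A, terms=60):
    """Truncated series sum_{k>=1} (-1)^(k-1) (A - I)^k / k  (illustrative helper)."""
    n = A.shape[0]
    M = A - np.eye(n)
    total = np.zeros((n, n))
    P = np.eye(n)
    for k in range(1, terms + 1):
        P = P @ M                         # P = (A - I)^k
        total += (-1) ** (k - 1) * P / k
    return total

# Two commuting matrices close to I (both are polynomials in the same nilpotent N).
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])
A = np.eye(2) + 0.3 * N
B = np.eye(2) + 0.2 * N

# The hypotheses: ||A - I||, ||B - I||, ||AB - I|| < 1 in the spectral norm.
for name, Mtx in (("A", A), ("B", B), ("AB", A @ B)):
    print(name, np.linalg.norm(Mtx - np.eye(2), 2))

print(np.allclose(mat_log(A @ B), mat_log(A) + mat_log(B)))  # expected: True
```

This only checks one convenient example, of course; the problem asks for a proof valid for all matrices satisfying the stated norm and commutativity conditions.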

Remember to read the POTW Procedure and Guidelines (http://www.mathhelpboards.com/showthread.php?772-Problem-of-the-Week-%28POTW%29-Procedure-and-Guidelines) to find out how to submit your solution (http://www.mathhelpboards.com/forms.php?do=form&fid=2)!
 
This week's problem was solved correctly by Opalg. You can read his solution below.
Let $D = \{z\in\mathbb{C}: |z-1|<1\}$ and $\exp D = \{e^z:z\in D\}.$ Neither of these sets intersects the negative real axis, so the logarithm is defined as a holomorphic function on both sets and satisfies $$\log z = \sum_{k=1}^\infty \frac{(-1)^{k-1}(z-1)^k}{k}$$ for $z\in D.$

The holomorphic functional calculus says that given an $n\times n$ matrix $T$ whose spectrum [set of eigenvalues] lies in some domain $U$, there is a continuous unital homomorphism $\phi_T$ from the algebra of holomorphic functions on $U$ (with the topology of uniform convergence on compact subsets) to the $n\times n$ matrices, with $\phi_T(z) = T.$ In particular, if $U = D$ or $U = \exp D$, this mapping takes the logarithm function to a matrix $\log T$. The continuity of $\phi_T$ ensures that when $U = D$ this definition of $\log T$ agrees with the one in the statement of the problem. Also, the exponential and logarithm functions are inverses of each other between the spaces $D$ and $\exp D$.

Given $A$ and $B$ as in the problem, the conditions $\|A - I\|<1$ and $\|B - I\|<1$ ensure that the spectrum of each matrix lies in $D$. Let $X = \log A$ and $Y = \log B.$ Then $YX = XY$, because polynomials in $A$ and $B$ commute. Thus, expanding $(X+Y)^n$ by the binomial theorem (valid because $XY = YX$),
$$\exp(X+Y) = \sum_{n=0}^\infty \frac{(X+Y)^n}{n!} = \sum_{n=0}^\infty \sum_{k=0}^n \frac{X^kY^{n-k}}{k!(n-k)!} = \sum_{s=0}^\infty \frac{X^s}{s!} \sum_{t=0}^\infty \frac{Y^t}{t!} = \exp X \exp Y,$$
the change in the order of summation being justified by the uniform convergence of the series on compact sets.

Therefore $\exp (\log A + \log B) = \exp(X+Y) = \exp X\exp Y = AB$. The condition $\|AB - I\|<1$ ensures that the spectrum of $AB\;(=\exp (\log A + \log B))$ lies in $D$, so we can apply the logarithm function to both sides to conclude that $\log A + \log B = \log(AB)$.
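As a small numerical companion to the key step $\exp(X+Y) = \exp X \exp Y$ for commuting $X$ and $Y$, here is a sketch (again not from the original thread) using a truncated exponential series. The helper `mat_exp` and the test matrices are illustrative assumptions, chosen to mimic $X = \log A$, $Y = \log B$ from the solution.

```python
import numpy as np

def mat_exp(X, terms=40):
    """Truncated exponential series sum_{k>=0} X^k / k!  (illustrative helper)."""
    n = X.shape[0]
    total = np.eye(n)
    P = np.eye(n)
    for k in range(1, terms + 1):
        P = P @ X / k                  # P = X^k / k!
        total += P
    return total

# Commuting X and Y, standing in for X = log A and Y = log B in the solution.
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])
X, Y = 0.3 * N, 0.2 * N

# The binomial/Cauchy-product rearrangement gives exp(X + Y) = exp(X) exp(Y).
print(np.allclose(mat_exp(X + Y), mat_exp(X) @ mat_exp(Y)))  # expected: True
```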
 
