Integrate Matrix: Can Matrices Have Antiderivatives?

daudaudaudau
Hi. Is it possible to calculate the antiderivative of
$$\frac{1}{Ax+B}$$
if A and B are matrices? If they were scalars the result would be ##\frac{1}{A}\log(Ax+B)##.
 
Well... I may be talking complete nonsense here, but... why not? After one of my professors blew my mind by saying that it is possible to calculate exponentials, sines, etc., of matrices, I believe anything is possible in math...

His calculations were like this:

$$e^x=1+x+\frac{1}{2!}x^2+\frac{1}{3!}x^3+\cdots$$

So, if M is a matrix and I is the identity matrix:

$$e^M=I+M+\frac{1}{2!}M^2+\frac{1}{3!}M^3+\cdots$$
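This series can be checked numerically. The sketch below (the helper name `expm_taylor` is my own, not a library routine) sums the truncated series and compares it against SciPy's built-in matrix exponential:

```python
import numpy as np
from scipy.linalg import expm

def expm_taylor(M, terms=30):
    """Matrix exponential via the truncated Taylor series I + M + M^2/2! + ..."""
    result = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k      # term is now M^k / k!
        result = result + term
    return result

M = np.array([[0.0, 1.0], [-1.0, 0.0]])  # generator of 2D rotations
print(np.allclose(expm_taylor(M), expm(M)))  # → True
```

For this M the result is the rotation matrix by one radian, which is why the exponential of a matrix is so useful in practice.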

 
Nearly! As coelho says, a lot of things carry over from real and complex arithmetic and calculus to apply to matrices, BUT you have to be extremely careful.

One reason for care is that matrices do not necessarily commute with one another, so the order in which you write things down matters. For example, if we differentiate 1/f(x) for f a real or complex function of x, we get ##-f'(x)/f(x)^2##, and the order of the factors is irrelevant. But if f is a matrix-valued function, only one order will do: ##\frac{\mathrm{d}}{\mathrm{d}x}f(x)^{-1}=-f(x)^{-1}f'(x)f(x)^{-1}## (you can check this by differentiating ##f(x)f(x)^{-1}=I##). Similar things apply when differentiating things like exponentials, for which you get one of my favourite formulae; very nontrivial and yet so easy to remember:

$$\frac{\mathrm{d}}{\mathrm{d}x}\exp(f(x)) = \int_0^1 \exp(t f(x))\, f'(x)\, \exp((1-t) f(x))\, \mathrm{d}t$$
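The order-sensitive product rule for the inverse is easy to verify numerically. Here is a quick sketch (the particular matrices and the name `f` are my own choices) comparing a central finite difference of ##x \mapsto f(x)^{-1}## against the formula:

```python
import numpy as np

# f(x) = A + x*B is a matrix-valued function with f'(x) = B (constant).
A = np.array([[2.0, 1.0], [0.0, 3.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])
f = lambda x: A + x * B
finv = lambda x: np.linalg.inv(f(x))

x0, h = 0.5, 1e-6
# central finite difference of x -> f(x)^{-1}
numeric = (finv(x0 + h) - finv(x0 - h)) / (2 * h)
# the matrix product rule; the order -f^{-1} f' f^{-1} is essential
formula = -finv(x0) @ B @ finv(x0)
print(np.allclose(numeric, formula, atol=1e-6))  # → True
```

Swapping the factors to ##-f'(x) f(x)^{-2}## gives a different matrix here, because B does not commute with f(x0).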

The other problem you have to contend with is multivaluedness. log, defined as an inverse to exp, is already multivalued for real and complex arguments, but the problem gets worse for matrices. The wiki article on the matrix logarithm is pretty good, so look there for details.

Now I've got these caveats out of the way, I'll give you an answer. I only did a quick calculation where I assumed ##A^{-1}B## exists and is diagonal, but with more care I'm sure it can be proven in more generality. I got an antiderivative ##\log(x+A^{-1}B)\,A^{-1}=A^{-1}\log(x+BA^{-1})##. This is certainly a form we recognise from ordinary real/complex numbers. Bear in mind that stuff like ##\log(ab)=\log(a)+\log(b)## doesn't necessarily carry through even in the complex case, let alone for matrices.
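That candidate antiderivative can be checked numerically as well. The sketch below (matrices chosen by me so that ##xI+A^{-1}B## has positive real eigenvalues, keeping the matrix log well-defined) differentiates ##F(x)=\log(xI+A^{-1}B)\,A^{-1}## by finite differences and compares with ##(Ax+B)^{-1}##:

```python
import numpy as np
from scipy.linalg import logm

A = np.array([[2.0, 1.0], [0.0, 3.0]])
B = np.array([[1.0, 0.0], [0.0, 2.0]])
Ainv = np.linalg.inv(A)
I = np.eye(2)

# candidate antiderivative F(x) = logm(x*I + A^{-1}B) @ A^{-1}
F = lambda x: logm(x * I + Ainv @ B) @ Ainv

x0, h = 1.0, 1e-6
numeric = (F(x0 + h) - F(x0 - h)) / (2 * h)   # F'(x0) by central difference
target = np.linalg.inv(A * x0 + B)            # should equal (A*x0 + B)^{-1}
print(np.allclose(numeric, target, atol=1e-5))  # → True
```

The key step in a proof is that the derivative of ##xI+A^{-1}B## is the identity, which commutes with everything, so differentiating the log term-by-term is safe there.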
 
That's very interesting. Thank you both.
 
Yes, it is possible to calculate e^x, ln(x), sin(x), etc. for x any object that you can add and multiply.

Just use the Taylor series for the function.
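The same recipe works for any function with a Taylor series, not just exp. As an illustration (helper name `sinm_taylor` is mine), here is sin of a matrix via its series, compared against SciPy's `sinm`:

```python
import numpy as np
from scipy.linalg import sinm

def sinm_taylor(M, terms=20):
    """sin(M) via the Taylor series M - M^3/3! + M^5/5! - ..."""
    result = np.zeros_like(M)
    term = M.copy()
    k = 1
    for _ in range(terms):
        result = result + term
        # next odd-power term: multiply by -M^2 / ((k+1)(k+2))
        term = -term @ M @ M / ((k + 1) * (k + 2))
        k += 2
    return result

M = np.array([[0.1, 0.5], [0.2, 0.3]])
print(np.allclose(sinm_taylor(M), sinm(M)))  # → True
```

In practice libraries use scaling-and-squaring or eigendecompositions rather than raw series, but the series is the cleanest way to see why these functions make sense for matrices at all.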
 