Integrate Matrix: Can Matrices Have Antiderivatives?

  • Context: Graduate
  • Thread starter: daudaudaudau
  • Tags: Integrate Matrix

Discussion Overview

The discussion revolves around the possibility of calculating the antiderivative of the expression \(\frac{1}{Ax+B}\) where \(A\) and \(B\) are matrices. Participants explore the implications of matrix operations on calculus concepts, particularly focusing on antiderivatives and related functions like exponentials and logarithms.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant questions whether it is possible to find the antiderivative of \(\frac{1}{Ax+B}\) when \(A\) and \(B\) are matrices, referencing the scalar case as a comparison.
  • Another participant suggests that if exponentials and trigonometric functions can be calculated for matrices, then it may also be possible to find antiderivatives, citing the Taylor series expansion as a basis.
  • A third participant emphasizes the need for caution when applying calculus to matrices, noting that the non-commutative nature of matrices affects differentiation and integration processes.
  • This participant also provides a specific formula related to the differentiation of matrix exponentials and mentions the complications introduced by multivaluedness in functions like logarithms when extended to matrices.
  • One participant acknowledges the previous contributions and expresses interest in the topic.
  • Another participant reiterates that functions like \(e^x\), \(\ln(x)\), and \(\sin(x)\) can be computed for any object that can be added and multiplied, again referencing Taylor series.

Areas of Agreement / Disagreement

Participants express a range of views on the topic, with some supporting the idea that antiderivatives can be computed for matrices while others highlight the complexities and potential pitfalls involved. No consensus is reached regarding the specific methods or results.

Contextual Notes

Participants note limitations such as the non-commutative property of matrices, the multivalued nature of logarithmic functions, and the need for careful handling of matrix operations in calculus.

daudaudaudau
Hi. Is it possible to calculate the antiderivative of
[tex] \frac{1}{Ax+B}[/tex]

if A and B are matrices? If they were scalars the result would be [itex]1/A \log(Ax+B)[/itex].
 
Well... I may be talking complete nonsense here, but... why not? After one of my professors blew my mind by showing that it is possible to calculate exponentials, sines, etc., of matrices, I believe anything is possible in math...

His calculations were like this:

[tex]e^x=1+x+\frac{1}{2!}x^2+\frac{1}{3!}x^3+\cdots[/tex]

So, if M is a matrix and I is the identity matrix:

[tex]e^M=I+M+\frac{1}{2!}M^2+\frac{1}{3!}M^3+\cdots[/tex]
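The series above can be sketched in a few lines of plain Python (the function names here are illustrative, not from any particular library). Truncating the Taylor series gives an approximation of [itex]e^M[/itex]; for a nilpotent matrix the series terminates, so the result is exact:

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm(M, terms=30):
    """Approximate e^M by the truncated Taylor series I + M + M^2/2! + ..."""
    n = len(M)
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # identity I
    term = [row[:] for row in result]                               # holds M^k / k!
    for k in range(1, terms):
        term = [[x / k for x in row] for row in matmul(term, M)]
        result = [[a + b for a, b in zip(ra, rb)]
                  for ra, rb in zip(result, term)]
    return result

# For the nilpotent M = [[0,1],[0,0]] we have M*M = 0, so the series
# terminates and e^M = I + M = [[1,1],[0,1]] exactly.
E = expm([[0.0, 1.0], [0.0, 0.0]])
```

In practice one would use a well-tested routine (truncated series can behave badly for matrices with large entries), but the sketch shows why "plug the matrix into the power series" is a meaningful definition.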

 
Nearly! As coelho says, a lot of things carry over from real and complex arithmetic and calculus to apply to matrices, BUT you have to be extremely careful.

One reason for care is that matrices do not necessarily commute with one another, so the order in which you write things down matters. For example, if we differentiate 1/f(x) for f a real or complex function of x, we get [itex]-f'(x)/f(x)^2[/itex], and the order of the factors is irrelevant. But if f is matrix-valued, only one order will do:

[tex]\frac{\mathrm{d}}{\mathrm{d}x}f(x)^{-1} = -f(x)^{-1} f'(x) f(x)^{-1}[/tex]

(You can check this by differentiating [itex]f(x)f(x)^{-1}=I[/itex].) Similar care applies when differentiating things like exponentials, for which you get one of my favourite formulae; very nontrivial and yet so easy to remember:

[tex]\frac{\mathrm{d}}{\mathrm{d}x}\exp(f(x)) = \int_0^1 \exp(t f(x)) f'(x) \exp((1-t) f(x)) \mathrm{d}t[/tex]
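This integral formula can be checked numerically. The sketch below (pure Python, all names illustrative) compares a central finite difference of [itex]\exp(f(x))[/itex] against a trapezoid-rule evaluation of the integral, for an f(x) chosen so that it does not commute with f'(x):

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def madd(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mscale(c, A):
    return [[c * x for x in row] for row in A]

def expm(M, terms=40):
    """Truncated Taylor series for e^M."""
    n = len(M)
    result = [[float(i == j) for j in range(n)] for i in range(n)]
    term = [row[:] for row in result]
    for k in range(1, terms):
        term = mscale(1.0 / k, matmul(term, M))
        result = madd(result, term)
    return result

# A matrix-valued f(x) that does NOT commute with its derivative:
def f(x):
    return [[0.0, x], [x * x, 0.0]]

def fprime(x):
    return [[0.0, 1.0], [2.0 * x, 0.0]]

x0, h = 0.7, 1e-5

# Left side: central finite difference of exp(f(x)) at x0.
lhs = mscale(1.0 / (2.0 * h),
             madd(expm(f(x0 + h)), mscale(-1.0, expm(f(x0 - h)))))

# Right side: trapezoid rule for the integral over t in [0, 1].
N = 2000
rhs = [[0.0, 0.0], [0.0, 0.0]]
for i in range(N + 1):
    t = i / N
    w = (0.5 if i in (0, N) else 1.0) / N
    piece = matmul(matmul(expm(mscale(t, f(x0))), fprime(x0)),
                   expm(mscale(1.0 - t, f(x0))))
    rhs = madd(rhs, mscale(w, piece))

err = max(abs(lhs[i][j] - rhs[i][j]) for i in range(2) for j in range(2))
```

Note that the naive guess [itex]\exp(f(x))f'(x)[/itex] would be wrong here precisely because f and f' do not commute; the integral averages f'(x) over all orderings between the two exponential factors.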

The other problem you have to contend with is multivaluedness. log, defined as an inverse to exp, is already multivalued for real and complex arguments, but the problem gets worse for matrices. The wiki article on matrix logarithm is pretty good, so look there for details.

Now I've got these caveats out of the way, I'll give you an answer. I only did a quick calculation in which I assumed [itex]A^{-1}B[/itex] exists and is diagonal, but with more care I'm sure it can be proven in more generality. I got an antiderivative

[tex]\log(x+A^{-1}B)\,A^{-1}=A^{-1}\log(x+BA^{-1}).[/tex]

This is certainly a form we recognise from ordinary real/complex numbers. Bear in mind that identities like [itex]\log(ab)=\log(a)+\log(b)[/itex] don't necessarily carry over even in the complex case, let alone for matrices.
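This claimed antiderivative can also be sanity-checked numerically. The sketch below (pure Python; the matrices A, C and all names are illustrative choices, not from the thread) computes log via the series [itex]\log(I+X)=X-\tfrac{1}{2}X^2+\cdots[/itex], which only converges when the argument is close to the identity, so A and B are deliberately chosen to make [itex]xI+A^{-1}B[/itex] land in that region. A central finite difference of the candidate antiderivative is then compared against [itex](Ax+B)^{-1}[/itex]:

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def logm_near_I(M, terms=80):
    """log(M) via log(I+X) = X - X^2/2 + ..., valid only for M close to I."""
    n = len(M)
    X = [[M[i][j] - float(i == j) for j in range(n)] for i in range(n)]
    power = [row[:] for row in X]
    result = [[0.0] * n for _ in range(n)]
    for k in range(1, terms):
        sign = 1.0 if k % 2 == 1 else -1.0
        result = [[r + sign * p / k for r, p in zip(rr, rp)]
                  for rr, rp in zip(result, power)]
        power = matmul(power, X)
    return result

def inv2(M):
    """Inverse of a 2x2 matrix by the cofactor formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2.0, 0.3], [0.1, 1.5]]
C = [[0.1, 0.05], [0.02, 0.08]]   # C plays the role of A^{-1}B, kept small
B = matmul(A, C)                   # so that x*I + C stays near I
Ainv = inv2(A)

def antideriv(x):
    """Candidate antiderivative: log(x I + A^{-1}B) A^{-1}."""
    M = [[x + C[0][0], C[0][1]], [C[1][0], x + C[1][1]]]
    return matmul(logm_near_I(M), Ainv)

x0, h = 0.5, 1e-5
deriv = [[(p - m) / (2 * h) for p, m in zip(rp, rm)]
         for rp, rm in zip(antideriv(x0 + h), antideriv(x0 - h))]

# The derivative should reproduce (A x0 + B)^{-1}.
target = inv2(madd := [[A[i][j] * x0 + B[i][j] for j in range(2)]
                       for i in range(2)])
err = max(abs(deriv[i][j] - target[i][j]) for i in range(2) for j in range(2))
```

The check works in this form because [itex]xI+A^{-1}B[/itex] commutes with its own derivative (the identity), which is exactly what makes the scalar-style antiderivative go through.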
 
That's very interesting. Thank you both.
 
Yes, it is possible to calculate [itex]e^x[/itex], [itex]\ln(x)[/itex], [itex]\sin(x)[/itex], etc. for x any object that you can add and multiply.

Just use the Taylor series for the function.
 
