About invertible and diagonalizable matrices

  • Thread starter: jostpuur
  • Tags: Matrices
SUMMARY

This discussion centers on the proof of the inverse function theorem as presented in a book on differential manifolds. The key point of contention is whether the assumption that all invertible matrices are diagonalizable is valid, which is clarified as incorrect. The conversation also delves into the application of the mean value theorem in the context of estimating the norm of a matrix function, specifically addressing the implications of the operator norm and the presence of the term ||g(0)|| in the proof. The participants emphasize the need for careful justification when applying the mean value theorem, particularly regarding differentiability.

PREREQUISITES
  • Understanding of the inverse function theorem
  • Familiarity with Jacobian matrices and their properties
  • Knowledge of the mean value theorem in calculus
  • Concept of operator norms in linear algebra
NEXT STEPS
  • Study the properties of invertible matrices and their diagonalizability
  • Learn about the inverse function theorem in detail
  • Explore the application of the mean value theorem in higher dimensions
  • Investigate operator norms and their significance in matrix analysis
USEFUL FOR

Mathematicians, students of advanced calculus, and anyone studying differential geometry or linear algebra who seeks a deeper understanding of matrix properties and theorems related to differentiability.

jostpuur
Hello, I'm reading this book http://freescience.info/go.php?pagename=books&id=1041 about differential manifolds. In the appendix the book gives a proof of the inverse function theorem. It assumes that the Jacobian matrix Df_a is invertible (where a is the point at which it is evaluated), and then says: "By an affine transformation x\mapsto Ax+b we can assume that a=0 and Df_a=1." Isn't this the same as assuming that all invertible matrices are diagonalizable? And isn't that assumption wrong?
 
No, it is not the same thing. It is simply taking A to be the inverse of Df_a.
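Spelling this reduction out (a sketch; the book leaves the step implicit), one passes from f to an auxiliary function:

```latex
% Normalize f near a by an affine change of variables:
h(x) := (Df_a)^{-1}\bigl(f(a + x) - f(a)\bigr)
% Then h(0) = 0, and by the chain rule
Dh_0 = (Df_a)^{-1}\,Df_a = I,
% so only the invertibility of Df_a is used -- no diagonalization.
```

If the theorem holds for h, it holds for f by undoing the affine map, so nothing about eigenvalues or diagonalizability ever enters.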
 
All right :blushing:
 
about use of the mean value theorem

I haven't made much progress with this proof. The problem is no longer about diagonalizability (well, it wasn't in the beginning either...), but since I started discussing this proof here, I might as well continue here.

First it says that there exists an r such that \|Dg_x\| < 1/2 holds for \|x\| < 2r. A small question: when the norm of a matrix is written without explanation, does it usually mean the operator norm \|A\| := \sup_{\|x\| < 1} \|Ax\|? Anyway, it then says that "It follows from the mean value theorem that \|g(x)\| < \|x\|/2". I ran into some problems at this step.
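On the notation question, a quick numerical illustration (my own sketch with NumPy, not from the book, and the matrix is made up): for the Euclidean norm, the operator norm \sup_{\|x\|\le 1}\|Ax\| equals the largest singular value of A.

```python
import numpy as np

# Hypothetical 2x2 example: for the Euclidean norm, the operator norm
# sup_{||x|| <= 1} ||Ax|| equals the largest singular value of A.
A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

# Largest singular value (np.linalg.svd returns them in descending order).
op_norm = np.linalg.svd(A, compute_uv=False)[0]  # sqrt(45), since A^T A has eigenvalues 45 and 5

# Monte-Carlo check: ||Ax|| over random unit vectors never exceeds it.
rng = np.random.default_rng(0)
xs = rng.normal(size=(10_000, 2))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)   # project samples onto the unit circle
sampled_max = np.linalg.norm(xs @ A.T, axis=1).max()

print(op_norm, sampled_max)
```

The sampled maximum approaches the operator norm from below as more unit vectors are tried, which matches the supremum definition.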

Doesn't the mean value theorem in this case say that there exists some 0\leq\lambda\leq 1 such that
\|g(x)\| = \Big(\frac{d}{d\lambda'}\|g(\lambda' x)\|\Big)\Big|_{\lambda'=\lambda} + \|g(0)\|\,?
I computed
\frac{d}{d\lambda'}\|g(\lambda' x)\| = \sum_{i=1}^n \frac{g_i(\lambda' x)\,\bigl(x\cdot\nabla g_i(\lambda' x)\bigr)}{\|g(\lambda' x)\|} = \frac{g^T(\lambda' x)\,(Dg_{\lambda' x})\,x}{\|g(\lambda' x)\|}
after which I could estimate
\|g(x)\| \leq \|Dg_{\lambda x}\|\,\|x\| + \|g(0)\| \leq \frac{1}{2}\|x\| + \|g(0)\|
A big difference is that the proof in the book didn't have the \|g(0)\| term. Perhaps that is not a big problem; we can get rid of it by redefining the original function with a translation of the image. Still, it is strange that this was not mentioned in the proof, so is there something I'm already getting wrong here?

Another matter is that the mapping \lambda\mapsto \|g(\lambda x)\| is not necessarily differentiable if g reaches zero at some lambda, and I cannot see how to justify that g remains nonzero here. So the use of the mean value theorem does not seem fully justified.
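One standard way around this (my own attempt at a fix, not the book's argument) is to avoid differentiating the norm altogether and use the integral form of the mean value inequality:

```latex
% Write the increment of g with the fundamental theorem of calculus,
% applied componentwise along the segment from 0 to x:
g(x) - g(0) = \int_0^1 Dg_{\lambda x}\, x \, d\lambda
% Taking norms and using \|Dg_{\lambda x}\| < 1/2 for \|x\| < 2r:
\|g(x) - g(0)\| \le \int_0^1 \|Dg_{\lambda x}\|\,\|x\|\, d\lambda \le \tfrac{1}{2}\|x\|
% This needs only continuity of Dg along the segment and never
% differentiates \lambda \mapsto \|g(\lambda x)\|, which may be
% non-smooth where g vanishes.
```

This also explains why no \|g(0)\| term appears when g(0) = 0 after the normalization.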
 
I think I got this matter handled.

At least my problem was not that I couldn't think in complicated enough terms.
 
