Suppose we pick a matrix M ∈ M_n(ℝ) such that all its eigenvalues are strictly bigger than 1.

In the question here, the user said that M induces some norm |||⋅||| which "expands" vectors, in the sense that there exists a constant c > 1 such that |||Mx||| ≥ c|||x||| for all x ∈ ℝⁿ.
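For intuition, here is a numerical sketch of the diagonalizable case (my own illustration with a matrix I chose, not taken from the thread): if M = VDV⁻¹ with all eigenvalues real and bigger than 1, the norm |||x||| = ‖V⁻¹x‖₂ works, with expansion constant c equal to the smallest eigenvalue.

```python
import numpy as np

# Hypothetical example matrix (mine, not from the thread): upper triangular,
# eigenvalues 2 and 3, diagonalizable because the eigenvalues are distinct.
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])
evals, V = np.linalg.eig(M)
Vinv = np.linalg.inv(V)

def triple_norm(x):
    # |||x||| = Euclidean length of x written in the eigenbasis of M.
    return np.linalg.norm(Vinv @ x)

# Since V^{-1} M = D V^{-1}, we get |||Mx||| = ||D (V^{-1}x)||_2, which is
# at least c * |||x||| with c = smallest eigenvalue (= 2 here).
c = evals.real.min()
rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.standard_normal(2)
    assert triple_norm(M @ x) >= c * triple_norm(x) - 1e-9
print("expansion constant c =", c)
```

This only covers real, distinct eigenvalues; complex eigenvalues of modulus > 1 need the same idea applied on real 2×2 blocks.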

I still cannot understand why this is correct. How can one construct such a norm explicitly? The comments suggested splitting into the diagonalizable case and the Jordan normal form, but in neither case can I see how to define the norm.
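For the non-diagonalizable case, here is a sketch with a hypothetical Jordan block of my own choosing: rescaling the Jordan basis by powers of a small ε shrinks the off-diagonal 1's to ε, which gives the expansion constant c = λ − ε (still bigger than 1 if ε < λ − 1).

```python
import numpy as np

# A single Jordan block: eigenvalue 2, not diagonalizable.
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])

# Rescale the basis by S = diag(1, eps): S^{-1} J S = [[2, eps], [0, 2]].
# In the norm |||x||| = ||S^{-1} x||_2 the off-diagonal entry is eps, so
# |||Jx||| >= 2|||x||| - eps|||x||| = (2 - eps)|||x|||.
eps = 0.5
S_inv = np.diag([1.0, 1.0 / eps])

def triple_norm(x):
    return np.linalg.norm(S_inv @ x)

c = 2.0 - eps  # expansion constant, > 1 because eps < lambda - 1
rng = np.random.default_rng(1)
for _ in range(1000):
    x = rng.standard_normal(2)
    assert triple_norm(J @ x) >= c * triple_norm(x) - 1e-9
print("expansion constant c =", c)
```

The same scaling trick handles a general Jordan form block by block, using S = diag(1, ε, ε², …) on each block.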

**Physics Forums | Science Articles, Homework Help, Discussion**

The friendliest, high quality science and math community on the planet! Everyone who loves science is here!

# Norm induced by a matrix with eigenvalues bigger than 1


