Exploring Invariant Matrix Norms: Understanding the Frobenius Norm and Beyond

The question is whether there are matrix norms besides the Frobenius norm that are invariant under a change of basis. The answer is yes: as long as the change of basis is between orthonormal bases, any matrix norm that is compatible with the vector norm in use stays the same. One example is the spectral norm, which is induced by the Euclidean norm and measures the maximum "stretch" a matrix can apply to a vector. In short, the Frobenius norm is not the only invariant matrix norm; the spectral norm, among others, can also be used in a change of basis scenario.
  • #1
Heimdall
Hi,

I don't get which of the many matrix norms are invariant under a change of basis. I get that the Frobenius norm is, because it can be expressed as a function of the eigenvalues only. Are there other invariant norms of that kind?

Thanks
 
  • #2
By change of basis, I assume you mean changing from one orthonormal basis to another orthonormal basis. If so, then I think any matrix norm which is compatible with the norm you're using for vectors will be the same in either basis.

An important example is the spectral norm. This norm is induced by the Euclidean norm, which is just the usual way of defining "magnitude" for a real 2D or 3D vector. Roughly speaking, the spectral norm is the maximum amount that a matrix can "stretch" a vector.
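
A quick numerical check makes this concrete. The sketch below (NumPy; the matrix size, random seed, and the extra nuclear-norm line are illustrative choices, not from the thread) conjugates a matrix by a random orthogonal matrix Q and compares norms before and after the change of basis: the Frobenius, spectral, and nuclear norms are unchanged up to floating-point error, while the maximum absolute row sum norm generally is not.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random 4x4 matrix and a random orthogonal matrix Q (via QR factorization).
A = rng.standard_normal((4, 4))
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# B represents the same linear map written in the new orthonormal basis.
B = Q.T @ A @ Q

# Unitarily invariant norms: identical up to rounding.
print(np.linalg.norm(A, 'fro'), np.linalg.norm(B, 'fro'))    # Frobenius norm
print(np.linalg.norm(A, 2),     np.linalg.norm(B, 2))        # spectral norm
print(np.linalg.norm(A, 'nuc'), np.linalg.norm(B, 'nuc'))    # nuclear norm

# Not unitarily invariant: the maximum absolute row sum norm usually changes.
print(np.linalg.norm(A, np.inf), np.linalg.norm(B, np.inf))
```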
 

What is an invariant matrix norm?

An invariant matrix norm is a way of measuring the size or magnitude of a matrix: a function that assigns a non-negative value to every matrix and remains unchanged under certain transformations of the matrix, such as an orthogonal (unitary) change of basis.

What are some common examples of invariant matrix norms?

Common examples include the Frobenius norm, the spectral norm, and the nuclear (trace) norm, all of which depend only on the singular values of the matrix. The maximum absolute row sum norm, by contrast, is not invariant under a change of basis in general.

How is the Frobenius norm calculated?

The Frobenius norm of a matrix A is the square root of the sum of the squared absolute values of its entries. It is denoted $\|A\|_F$ and can be written as $\|A\|_F = \left( \sum_i \sum_j |a_{ij}|^2 \right)^{1/2}$.
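
As a quick sanity check (a minimal NumPy sketch; the 2x2 example matrix is arbitrary), the definition above can be evaluated by hand and compared with the library routine:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Square root of the sum of the squared entries, straight from the definition...
by_hand = np.sqrt(np.sum(np.abs(A) ** 2))

# ...matches NumPy's built-in Frobenius norm: both give sqrt(30) ≈ 5.477.
print(by_hand, np.linalg.norm(A, 'fro'))
```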

What is the significance of invariant matrix norms?

Invariant matrix norms are important in linear algebra and matrix theory as they provide a way to measure the size or magnitude of a matrix. They also have various applications in fields such as data analysis, signal processing, and optimization.

How are invariant matrix norms used in machine learning?

Invariant matrix norms are commonly used in machine learning to regularize, that is, penalize, the size of weight or parameter matrices. This helps prevent overfitting and improves the model's ability to generalize, as in the sketch below.
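
As an illustration (a hypothetical sketch in NumPy; the function name ridge_loss, the toy data shapes, and the penalty weight lam are assumptions for the example, not something from the thread), a squared Frobenius norm penalty on a weight matrix is the matrix form of ordinary L2 weight decay:

```python
import numpy as np

def ridge_loss(W, X, Y, lam=0.1):
    """Squared-error loss of a linear model plus a Frobenius-norm penalty on W."""
    residual = X @ W - Y
    data_term = np.sum(residual ** 2)
    # lam * ||W||_F^2 shrinks the weights, discouraging overfitting.
    penalty = lam * np.linalg.norm(W, 'fro') ** 2
    return data_term + penalty

# Toy data: 5 samples, 3 features, 2 outputs (shapes chosen only for illustration).
rng = np.random.default_rng(1)
X = rng.standard_normal((5, 3))
Y = rng.standard_normal((5, 2))
W = rng.standard_normal((3, 2))
print(ridge_loss(W, X, Y))
```

The Frobenius penalty is the most common choice; a nuclear-norm penalty is used instead when a low-rank weight matrix is desired.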
