GridironCPJ said:
What conditions must be true for these two norms to be equal? Or are they always equal?
GridironCPJ said: What conditions must be true for these two norms to be equal? Or are they always equal?
Hawkeye18 said:The Frobenius and 2-norm of a matrix coincide if and only if the matrix has rank 1 (i.e. if and only if the matrix can be represented as A=c r, where r is a row and c is a column).
You can see that from the fact that Frobenius norm is [itex]\left( \sum_k s_k^2\right)^{1/2}[/itex] and the 2-norm is [itex]\max s_k[/itex], where [itex]s_k[/itex] are singular values. So equality happens if and only if there is only one non-zero singular value, which is equivalent to the fact that the rank is 1.
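As a quick numerical sketch of this rank-1 equality (the sizes and random seed below are arbitrary choices, not from the thread), one could compare NumPy's two matrix norms:

```python
import numpy as np

# Build a rank-1 matrix A = c r from a random column c and row r.
rng = np.random.default_rng(0)
c = rng.standard_normal((4, 1))
r = rng.standard_normal((1, 3))
A = c @ r

fro = np.linalg.norm(A, 'fro')  # sqrt of the sum of squared entries
two = np.linalg.norm(A, 2)      # largest singular value

print(np.isclose(fro, two))     # True: only one nonzero singular value
```

For a matrix of rank greater than 1, the same comparison would print False, since the Frobenius norm then picks up contributions from more than one singular value.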
Hawkeye18 said:The Frobenius and 2-norm of a matrix coincide if and only if the matrix has rank 1
AlephZero said:More generally, ##||A||_2 \le ||A||_F \le \sqrt{r}||A||_2## where r is the rank of A.
tomz said: May you shed some light on this? Or quote any possible reference? Thanks
Assuming you accept Hawkeye18's formulas, namely ##||A||_F = \left( \sum_k s_k^2 \right)^{1/2}## and ##||A||_2 = \max_k s_k##, the inequalities follow directly: a matrix of rank r has at most r nonzero singular values, so ##\max_k s_k \le \left( \sum_k s_k^2 \right)^{1/2} \le \left( r \max_k s_k^2 \right)^{1/2} = \sqrt{r}\, \max_k s_k##.
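A minimal numerical check of the two-sided inequality ##||A||_2 \le ||A||_F \le \sqrt{r}\,||A||_2## (matrix sizes and seed below are arbitrary choices):

```python
import numpy as np

# Random rank-deficient matrix: product of 6x2 and 2x5 has rank <= 2.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 5))

r = np.linalg.matrix_rank(A)
two = np.linalg.norm(A, 2)      # largest singular value
fro = np.linalg.norm(A, 'fro')  # root sum of squared singular values

print(two <= fro <= np.sqrt(r) * two)  # True
```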
The Frobenius norm of a matrix is a measure of its magnitude, calculated by taking the square root of the sum of the squares of all its elements. The 2-norm, also known as the spectral norm, is the largest singular value of the matrix, i.e. the maximum factor by which the matrix can stretch a vector in the Euclidean norm.
The Frobenius norm is equal to the 2-norm of a matrix if and only if the matrix has rank 1, i.e. if it can be written as the product of a column vector and a row vector. In that case the matrix has exactly one nonzero singular value, so the root sum of squares of the singular values coincides with their maximum.
No, the Frobenius norm is always greater than or equal to the 2-norm of a matrix. This is because the 2-norm is the maximum singular value of the matrix, while the Frobenius norm is the root sum of squares of all singular values, which can never be smaller than the largest one.
The Frobenius norm is equivalent to the Euclidean norm when applied to vectors, as both are calculated by taking the square root of the sum of the squares of the components. For a matrix, the Frobenius norm is simply the Euclidean norm of its entries treated as one long vector, whereas the matrix 2-norm (the spectral norm) is a different quantity that depends on the singular values.
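This "Frobenius norm equals the vector norm of the flattened entries" identity can be sketched directly (the small example matrix is an arbitrary choice):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

fro = np.linalg.norm(A, 'fro')   # matrix Frobenius norm
vec = np.linalg.norm(A.ravel())  # Euclidean norm of vec(A)

print(np.isclose(fro, vec))      # True: both equal sqrt(1 + 4 + 9 + 16)
```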
The Frobenius norm is commonly used in machine learning and statistics because it is a convenient way to measure the difference between two matrices. It is also a useful metric for regularization, as it penalizes matrices with large entries more heavily, encouraging simpler and more generalizable models.