2-norm Pseudoinverse Upper Bound

In summary, the thread discusses an upper bound on the 2-norm of the Moore-Penrose pseudoinverse of a matrix product and how to prove it. The key observation is that full-rank tall matrices are left invertible and their pseudoinverse is the minimal-norm left inverse; this minimality property yields the bound for matrices of the stated dimensions.
  • #1
baggiano
Hello

I'm trying to show that the following upper bound on the matrix 2-norm is true:

[itex]\left\|(AB)^+\right\|_2\leq\left\|A^+\right\|_2 \left\|B^+\right\|_2[/itex]

where + is the matrix pseudoinverse and [itex]A\in\Re^{n\times m}[/itex] and [itex]B\in\Re^{m\times p}[/itex] are full-rank matrices with [itex]n\geq m\geq p[/itex].

Any hint how I can show it?

Thanks in advance!

Bag
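For what it's worth, the inequality is easy to sanity-check numerically before trying to prove it. A minimal sketch with NumPy (dimensions and seed are arbitrary choices; Gaussian matrices are full rank with probability 1):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, p = 6, 4, 3  # n >= m >= p, as in the question

def bound_holds(A, B, tol=1e-8):
    # check ||(AB)^+||_2 <= ||A^+||_2 * ||B^+||_2
    lhs = np.linalg.norm(np.linalg.pinv(A @ B), 2)
    rhs = np.linalg.norm(np.linalg.pinv(A), 2) * np.linalg.norm(np.linalg.pinv(B), 2)
    return lhs <= rhs + tol

trials = [bound_holds(rng.standard_normal((n, m)), rng.standard_normal((m, p)))
          for _ in range(200)]
print(all(trials))
```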
 
  • #2
Are you sure that [itex]m\ge p[/itex] and not [itex]p\ge m[/itex]?
Also I assume that by pseudoinverse you mean the Moore-Penrose pseudoinverse.

If [itex]n\ge m[/itex] and [itex]p\ge m[/itex] (so [itex]A[/itex] has full column rank and [itex]B[/itex] has full row rank), then the identity [itex](AB)^+ =B^+ A^+[/itex] holds, and it gives your estimate directly.

If [itex]n\ge m\ge p[/itex], then [itex]A[/itex], [itex]B[/itex] and so [itex]AB [/itex] are left invertible. For a left invertible matrix [itex]A [/itex], the Moore-Penrose pseudoinverse [itex]A^+ [/itex] is the minimal left inverse, [itex]A^{min}_L = (A^*A)^{-1}A^* [/itex].

The minimal left inverse [itex] A^{min}_L [/itex] has the property that for any other left inverse [itex]A_L [/itex] of [itex]A [/itex] we have [itex] A^{min}_L = P_{Ran (A)} A_L[/itex], where [itex] P_{Ran (A)} [/itex] is the orthogonal projection onto the range (column space) of [itex]A[/itex]. In particular, this implies that the norm of the minimal left inverse is the minimal possible norm of a left inverse.

Now let us gather all this information together: both [itex] A [/itex] and [itex]B [/itex] are left invertible, so [itex] A^+ [/itex] and [itex] B^+ [/itex] are the minimal left inverses of [itex] A [/itex] and [itex] B [/itex] respectively. Therefore [itex] B^+A^+ [/itex] is a left inverse of [itex] AB [/itex] (indeed [itex] B^+A^+AB = B^+B = I [/itex]); generally it is not the minimal left inverse, but the norm-minimality of the minimal left inverse implies

[itex]
\left\|(AB)^+\right\|_2 \le \left\|B^+A^+\right\|_2 \le \left\|B^+\right\|_2 \left\|A^+\right\|_2.
[/itex]
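The two facts used here, the explicit formula for the minimal left inverse and its norm minimality, can be illustrated numerically. A sketch assuming NumPy; the parametrization [itex]A^+ + C(I - AA^+)[/itex] of the other left inverses is one standard way to generate them:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 6, 4
A = rng.standard_normal((n, m))           # full column rank with probability 1

pinv = np.linalg.pinv(A)                  # Moore-Penrose pseudoinverse
explicit = np.linalg.solve(A.T @ A, A.T)  # (A^T A)^{-1} A^T, the minimal left inverse

# any other left inverse can be written A^+ + C (I - A A^+) for some C
C = rng.standard_normal((m, n))
other = pinv + C @ (np.eye(n) - A @ pinv)

print(np.allclose(pinv, explicit))        # pseudoinverse equals the explicit formula
print(np.allclose(other @ A, np.eye(m)))  # 'other' really is a left inverse
print(np.linalg.norm(pinv, 2) <= np.linalg.norm(other, 2))  # norm minimality
```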
 
  • #3
Hello

Thanks a lot for your detailed explanation.

Yes, [itex]m\geq p[/itex] was actually correct.

In the fourth paragraph of your reply you say that [itex]A_L^{min}=P_{Ran(A)}A_L[/itex]. Shouldn't it be [itex]A_L^{min}=A_LP_{Ran(A)}[/itex] instead?

Thanks again!

Bag
 
  • #4
Yes, it should be [itex]A_L^{min} = A_L P_{Ran(A)}[/itex].
 
  • #5
Hello Bag,

Thank you for your question. It is a nice problem, and it can also be settled directly with singular values, using the fact that for a full column rank matrix the 2-norm of the pseudoinverse is the reciprocal of the smallest singular value.

First, for a full column rank matrix [itex]A[/itex] the pseudoinverse is [itex]A^+ = (A^TA)^{-1}A^T[/itex], and the 2-norm of any matrix [itex]M[/itex] is [itex]\|M\|_2 = \sqrt{\lambda_{max}(M^TM)} = \sigma_{max}(M)[/itex], the largest singular value. The nonzero singular values of [itex]A^+[/itex] are the reciprocals of the singular values of [itex]A[/itex], so [itex]\|A^+\|_2 = 1/\sigma_{min}(A)[/itex], and likewise [itex]\|B^+\|_2 = 1/\sigma_{min}(B)[/itex] and [itex]\|(AB)^+\|_2 = 1/\sigma_{min}(AB)[/itex]. (Here [itex]\sigma_{min}[/itex] denotes the smallest of the [itex]m[/itex], respectively [itex]p[/itex], singular values; full column rank guarantees it is positive. Note that [itex]AB[/itex] also has full column rank, being a composition of injective maps.)

Next, for any unit vector [itex]x\in\Re^p[/itex],

[itex]
\|ABx\|_2 \ge \sigma_{min}(A)\,\|Bx\|_2 \ge \sigma_{min}(A)\,\sigma_{min}(B),
[/itex]

and taking the minimum over unit vectors gives [itex]\sigma_{min}(AB) \ge \sigma_{min}(A)\,\sigma_{min}(B)[/itex].

Combining the two steps,

[itex]
\|(AB)^+\|_2 = \frac{1}{\sigma_{min}(AB)} \le \frac{1}{\sigma_{min}(A)\,\sigma_{min}(B)} = \|A^+\|_2\,\|B^+\|_2,
[/itex]

which is exactly the claimed bound.
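The singular-value facts behind this bound can be checked numerically. A sketch with NumPy (dimensions are arbitrary; Gaussian matrices are full rank with probability 1):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, p = 7, 5, 3
A = rng.standard_normal((n, m))
B = rng.standard_normal((m, p))

def smin(M):
    # smallest singular value (positive when M has full column rank)
    return np.linalg.svd(M, compute_uv=False)[-1]

# fact 1: ||M^+||_2 = 1 / sigma_min(M) for full column rank M
fact1 = np.isclose(np.linalg.norm(np.linalg.pinv(A), 2), 1 / smin(A))
# fact 2: sigma_min(AB) >= sigma_min(A) * sigma_min(B)
fact2 = smin(A @ B) >= smin(A) * smin(B) - 1e-12
print(fact1, fact2)
```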
 

1. What is the 2-norm Pseudoinverse Upper Bound?

The 2-norm Pseudoinverse Upper Bound discussed here is the inequality [itex]\|(AB)^+\|_2 \le \|A^+\|_2\,\|B^+\|_2[/itex]: an upper bound on the 2-norm of the Moore-Penrose pseudoinverse (a generalized inverse) of a matrix product in terms of the pseudoinverses of its factors.

2. How is the 2-norm Pseudoinverse Upper Bound calculated?

For a full column rank matrix, the 2-norm of the pseudoinverse equals the reciprocal of the smallest singular value: [itex]\|A^+\|_2 = 1/\sigma_{min}(A)[/itex]. The upper bound for a product then follows from the singular-value inequality [itex]\sigma_{min}(AB) \ge \sigma_{min}(A)\,\sigma_{min}(B)[/itex].

3. What is the significance of the 2-norm Pseudoinverse Upper Bound?

The 2-norm Pseudoinverse Upper Bound controls how sensitive the pseudoinverse of a product is to its factors. It is closely related to the 2-norm condition number [itex]\kappa_2(A) = \|A\|_2\,\|A^+\|_2 = \sigma_{max}(A)/\sigma_{min}(A)[/itex], which measures how well-conditioned or ill-conditioned a matrix is.
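The connection to the 2-norm condition number [itex]\kappa_2(A) = \sigma_{max}(A)/\sigma_{min}(A)[/itex] can be checked numerically (a sketch with NumPy; the matrix is an arbitrary random example):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 4))
s = np.linalg.svd(A, compute_uv=False)   # singular values, in descending order

# kappa_2(A) = ||A||_2 * ||A^+||_2 = sigma_max / sigma_min
kappa = np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.pinv(A), 2)
print(np.isclose(kappa, s[0] / s[-1]))
print(np.isclose(kappa, np.linalg.cond(A, 2)))
```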

4. How is the 2-norm Pseudoinverse Upper Bound used in practical applications?

The 2-norm Pseudoinverse Upper Bound can be used in a variety of practical applications, including data compression, signal processing, and image reconstruction. It can also be used in machine learning algorithms and optimization problems.

5. Are there any limitations to the 2-norm Pseudoinverse Upper Bound?

Yes. The arguments above assume that the matrices have full rank (linearly independent columns); for rank-deficient matrices the pseudoinverse is still defined, but the identity [itex](AB)^+ = B^+A^+[/itex] can fail and the bound need not follow in the same way. For complex matrices the same statements hold with the transpose replaced by the conjugate transpose.
