How Can We Find the Minimum 2-Norm of Ax-y Using an Orthogonal Matrix?

In summary: for $A=QR$ with $Q$ orthogonal, the minimum of $\|Ax-y\|_2$ over $x\in\mathbb{R}^n$ is $\|(Q^{T}y)_{n+1}^m\|_2$, attained at the $x$ that solves $Ux=(Q^{T}y)_1^n$, where $U$ is the top $n\times n$ block of $R$.
  • #1
mathmari
Hey! :eek:

Let $A = QR$, where $Q$ is an orthogonal ($m\times m$)-matrix and $R$ is an upper triangular ($m\times n$)-matrix of rank $n$ ($m>n$).

I want to show that $$\min_{x\in \mathbb{R}^n}\|Ax-y\|_2=\|(Q^Ty)_{n+1}^m\|_2, \ \forall y\in \mathbb{R}^m$$

Here $(a)_k^l=(a_k, \ldots , a_l)^T$ denotes the subvector consisting of components $k$ through $l$ of a vector $a=(a_1, \ldots , a_m)^T\in \mathbb{R}^m$.
I have done the following:

\begin{align*}\min_{x\in \mathbb{R}^n}\|Ax-y\|_2&=\min_{x\in \mathbb{R}^n}\|QRx-y\|_2=\min_{x\in \mathbb{R}^n}\|QRx-QQ^{-1}y\|_2 \\ & =\min_{x\in \mathbb{R}^n}\|Q(Rx-Q^{T}y)\|_2=\min_{x\in \mathbb{R}^n}\|Rx-Q^{T}y\|_2\end{align*}

(Here we used that $Q^{-1}=Q^{T}$ and that multiplication by an orthogonal matrix leaves the 2-norm unchanged.)

How could we continue? How can we find that minimum? (Wondering)
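
For a quick numerical sanity check of this reduction, here is a minimal NumPy sketch (the sizes $m=5$, $n=3$ and the random $A$, $x$, $y$ are arbitrary illustrative choices, not from the thread):

```python
import numpy as np

# Arbitrary illustrative sizes and data (m > n).
m, n = 5, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((m, n))
x = rng.standard_normal(n)
y = rng.standard_normal(m)

# Full QR factorization: Q is m x m orthogonal, R is m x n upper triangular.
Q, R = np.linalg.qr(A, mode="complete")

# Multiplication by an orthogonal matrix preserves the 2-norm,
# so the two residual norms agree up to rounding error.
print(np.linalg.norm(A @ x - y))
print(np.linalg.norm(R @ x - Q.T @ y))
```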
 
  • #2
I like Serena
Hey mathmari!

We can write $Rx$ as $\binom U0x$ where $U$ is an upper triangular $n\times n$ matrix of rank $n$, and $0$ is a zero matrix, can't we? (Wondering)
It means that:
$$\|Rx-Q^{T}y\|_2^2
= \|Ux-(Q^{T}y)_1^n\|_2^2 + \|0-(Q^{T}y)_{n+1}^m\|_2^2
$$
Can we solve it now? (Wondering)
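
In code, the claim is that the bottom $m-n$ rows of $R$ vanish, so only the top $n\times n$ block acts on $x$; a minimal NumPy sketch with arbitrary random data:

```python
import numpy as np

# Arbitrary illustrative data; R comes from a full QR of a random tall matrix.
m, n = 5, 3
rng = np.random.default_rng(1)
A = rng.standard_normal((m, n))

Q, R = np.linalg.qr(A, mode="complete")
U = R[:n, :]   # upper triangular n x n block of full rank
Z = R[n:, :]   # remaining (m - n) x n block

print(np.allclose(Z, 0))           # True: R has the block form [U; 0]
print(np.allclose(np.triu(U), U))  # True: U is upper triangular
```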
 
  • #3
I like Serena said:
Hey mathmari!

We can write $Rx$ as $\binom U0x$ where $U$ is an upper triangular $n\times n$ matrix of rank $n$, and $0$ is a zero matrix, can't we? (Wondering)
It means that:
$$\|Rx-Q^{T}y\|_2^2
= \|Ux-(Q^{T}y)_1^n\|_2^2 + \|0-(Q^{T}y)_{n+1}^m\|_2^2
$$
Can we solve it now? (Wondering)

Ah ok! We have that \begin{align*}\|Rx-Q^{T}y\|_2^2&=\left \|\begin{pmatrix}U \\ 0\end{pmatrix}x-\begin{pmatrix}(Q^Ty)_1^n \\ (Q^Ty)_{n+1}^m\end{pmatrix}\right \|_2^2 =\left \|\begin{pmatrix}Ux-(Q^Ty)_1^n \\ 0-(Q^Ty)_{n+1}^m\end{pmatrix}\right \|_2^2 \\ & =\|Ux-(Q^{T}y)_1^n\|_2^2 + \|(Q^{T}y)_{n+1}^m\|_2^2\end{align*}

Therefore, we get $$\min_{x\in\mathbb{R}^n}\|Rx-Q^{T}y\|_2^2=\min_{x\in\mathbb{R}^n}\left\{\|Ux-(Q^{T}y)_1^n\|_2^2 + \|(Q^{T}y)_{n+1}^m\|_2^2\right\}$$

The second term does not depend on $x$, so the minimum is attained at the $x$ satisfying $Ux=(Q^{T}y)_1^n$, which makes the first term zero. Hence we get $$\min_{x\in \mathbb{R}^n}\|Ax-y\|_2^2=\|(Q^{T}y)_{n+1}^m\|_2^2\Rightarrow \min_{x\in \mathbb{R}^n}\|Ax-y\|_2=\|(Q^{T}y)_{n+1}^m\|_2$$ right? (Wondering)
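
The same conclusion can be checked numerically; a minimal NumPy sketch, again with arbitrary random data, where `R[:n, :]` plays the role of $U$:

```python
import numpy as np

# Arbitrary illustrative data (a random tall A has full column rank a.s.).
m, n = 6, 3
rng = np.random.default_rng(2)
A = rng.standard_normal((m, n))
y = rng.standard_normal(m)

Q, R = np.linalg.qr(A, mode="complete")
U = R[:n, :]          # invertible upper triangular n x n block
c = Q.T @ y

# Minimizer: the x that solves U x = (Q^T y)_1^n.
x_star = np.linalg.solve(U, c[:n])

print(np.linalg.norm(A @ x_star - y))  # attained minimum of ||Ax - y||_2
print(np.linalg.norm(c[n:]))           # ||(Q^T y)_{n+1}^m||_2 -- the same value
```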
 
  • #4
I like Serena
Yep.
And that is possible because $R$ is of rank $n$, and therefore $U$ is as well, making it an invertible matrix. (Nod)
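
Since $U$ is upper triangular and invertible, the system $Ux=(Q^{T}y)_1^n$ can be solved cheaply by back substitution; a minimal sketch using SciPy's `solve_triangular` (random illustrative data):

```python
import numpy as np
from scipy.linalg import solve_triangular

# Arbitrary illustrative data.
m, n = 6, 3
rng = np.random.default_rng(3)
A = rng.standard_normal((m, n))
y = rng.standard_normal(m)

Q, R = np.linalg.qr(A, mode="complete")
U, c = R[:n, :], Q.T @ y

# Because U is upper triangular and invertible (rank n), the system
# U x = (Q^T y)_1^n is solved by back substitution -- no inverse needed.
x_star = solve_triangular(U, c[:n], lower=False)

print(np.allclose(U @ x_star, c[:n]))  # True: the top block is matched exactly
```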
 
  • #5
I like Serena said:
And that is possible because $R$ is of rank $n$, and therefore $U$ is as well, making it an invertible matrix. (Nod)

Ok! Thanks a lot! (Yes)
 

1. What is the 2-Norm of Ax-y?

The 2-Norm of Ax-y is a measure of the difference between the vector Ax and the vector y. It is calculated by taking the square root of the sum of the squared differences between each element in Ax and y.
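
In symbols, for $A\in\mathbb{R}^{m\times n}$, $x\in\mathbb{R}^n$ and $y\in\mathbb{R}^m$:
$$\|Ax-y\|_2=\sqrt{\sum_{i=1}^m\big((Ax)_i-y_i\big)^2}$$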

2. Why is minimizing the 2-Norm of Ax-y important?

Minimizing the 2-Norm of Ax-y is important because it allows us to find the best possible approximation of the vector y using the matrix A. This can be useful in a variety of applications, such as data analysis and signal processing.

3. How is the 2-Norm of Ax-y minimized?

The 2-Norm of Ax-y is minimized by adjusting the vector x (with A and y fixed) so that Ax comes as close as possible to y. For a linear model this minimizer can be computed directly, for example via the normal equations or a QR decomposition as in the thread above; iterative methods such as gradient descent can also be used.
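
For the linear case discussed in the thread above, a minimal NumPy sketch (arbitrary random data) shows the standard least-squares routine returning the same minimum value as the QR-based formula:

```python
import numpy as np

# Arbitrary illustrative data; np.linalg.lstsq solves min_x ||A x - y||_2 directly.
m, n = 8, 4
rng = np.random.default_rng(4)
A = rng.standard_normal((m, n))
y = rng.standard_normal(m)

x_star, *_ = np.linalg.lstsq(A, y, rcond=None)

# The attained minimum agrees with the QR-based formula from the thread.
Q, R = np.linalg.qr(A, mode="complete")
print(np.linalg.norm(A @ x_star - y))
print(np.linalg.norm((Q.T @ y)[n:]))
```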

4. What are the applications of minimizing the 2-Norm of Ax-y?

Minimizing the 2-Norm of Ax-y has a wide range of applications in different fields, including statistics, machine learning, and engineering. It can be used for solving linear systems of equations, data fitting, and noise reduction, among others.

5. Are there any limitations to minimizing the 2-Norm of Ax-y?

While minimizing the 2-Norm of Ax-y is a powerful tool, it has some limitations. The closed-form approach above applies only to linear models, and the solution can be very sensitive to perturbations when the matrix A is ill-conditioned. For iterative methods, the results may also depend on the choice of algorithm and starting values.
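
As a rough illustration of the ill-conditioning caveat, the condition number of A quantifies this sensitivity; a minimal NumPy sketch with an artificially near-singular matrix:

```python
import numpy as np

rng = np.random.default_rng(5)
a = rng.standard_normal(6)
# Two nearly collinear columns make A ill-conditioned on purpose.
A = np.column_stack([a, a + 1e-8 * rng.standard_normal(6)])

# A huge condition number warns that small perturbations of y can change
# the least-squares solution x drastically, even though a minimum exists.
print(np.linalg.cond(A))
```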
