# Calculating norm of a matrix

1. Sep 20, 2008

### Pacopag

1. The problem statement, all variables and given/known data
I am trying to show that
(1) $$||A||_1 = \max_j \sum_i |a_{ij}|$$
(2) $$||A||_2 = \sqrt{\lambda_{\max}(A^* A)}$$ where $A^*$ is the conjugate transpose
(3) $$||A||_\infty = \max_i \sum_j |a_{ij}|$$
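As a numerical sanity check, NumPy's `np.linalg.norm` computes exactly these three matrix norms, so the closed forms can be compared directly against it (the matrix below is just an arbitrary example):

```python
import numpy as np

# Arbitrary test matrix, chosen only for illustration
A = np.array([[1.0, -2.0,  3.0],
              [4.0,  5.0, -6.0],
              [-7.0, 8.0,  9.0]])

col_sum = np.abs(A).sum(axis=0).max()                    # max_j sum_i |a_ij|
row_sum = np.abs(A).sum(axis=1).max()                    # max_i sum_j |a_ij|
spec    = np.sqrt(np.linalg.eigvalsh(A.conj().T @ A).max())  # sqrt of largest eigenvalue of A*A

print(np.isclose(col_sum, np.linalg.norm(A, 1)))        # 1-norm: max column sum
print(np.isclose(spec,    np.linalg.norm(A, 2)))        # 2-norm: largest singular value
print(np.isclose(row_sum, np.linalg.norm(A, np.inf)))   # inf-norm: max row sum
```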

2. Relevant equations
In general,
$$||A||_p = \max_{x\neq 0} {{||Ax||_p}\over{||x||_p}}$$
where for a vector we have
$$||x||_p = \left( \sum_i |x_i|^p \right)^{1\over p}$$

3. The attempt at a solution
First of all, write
$$(Ax)_i = \sum_j a_{ij} x_j$$.
(1) Here we have
$$||A||_1 = \max_{x\neq 0} {{\sum_i \left|\sum_j a_{ij} x_j \right|}\over{\sum_j |x_j|}}$$.
I'm pretty sure we could switch the summation order in the numerator too. But I don't know how to proceed from here.
(2) Here we have
$$||A||_2^2 = \max_{x\neq 0} {{x^* A^* A x}\over{x^* x}}$$ using $$||x||_2^2 = x^* x$$
Now, I read somewhere that the x that maximizes this is an eigenvector of $$A^* A$$. So we get
$$||A||_2^2 = \lambda_{\max}$$, the largest eigenvalue.
My only problem with this is: How do we know that an eigenvector will maximize it?
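One way to build intuition for why an eigenvector maximizes the Rayleigh quotient is to check it numerically: random vectors never exceed the value attained by the top eigenvector of $A^* A$ (random matrix below, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
M = A.T @ A                               # real symmetric (Hermitian) matrix
lam, V = np.linalg.eigh(M)                # eigenvalues ascending, orthonormal eigenvectors
lam_max = lam[-1]

# Rayleigh quotient x^T M x / x^T x over many random vectors x
xs = rng.standard_normal((1000, 4))
rq = np.einsum('ij,jk,ik->i', xs, M, xs) / np.einsum('ij,ij->i', xs, xs)
print(rq.max() <= lam_max + 1e-9)         # no random x beats lambda_max

# ... while the top eigenvector attains lambda_max exactly
v = V[:, -1]
print(np.isclose(v @ M @ v / (v @ v), lam_max))
```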
(3) Here we have
$$||A||_\infty = \max_{x\neq 0} {{\max_i \left| \sum_j a_{ij} x_j \right|}\over{\max_j |x_j|}}$$
using
$$||x||_\infty = \max_j |x_j|$$.
Again, I don't know how to proceed from here. It seems that in order to arrive at the answer (which I like to pretend I don't know),
I would need to choose an x whose components are all the same. Then the $x_j$'s would cancel and I'd get the right answer.
But I don't see why such an x is guaranteed to maximize $||A||_\infty$.
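(A constant vector only works when every entry of the maximizing row has the same sign; the standard choice is $x_j = \operatorname{sign}(a_{i^*j})$ for the row $i^*$ with the largest absolute sum, which makes each term in that row's sum positive. A quick numerical check, with a random matrix purely for illustration:)

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))

i_star = np.abs(A).sum(axis=1).argmax()   # row with the largest absolute row sum
x = np.sign(A[i_star])                    # |x_j| = 1 for all j, so ||x||_inf = 1

# (Ax)_{i*} = sum_j a_{i*,j} sign(a_{i*,j}) = sum_j |a_{i*,j}|, the max row sum
ratio = np.abs(A @ x).max() / np.abs(x).max()
assert np.isclose(ratio, np.abs(A).sum(axis=1).max())
```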

Last edited: Sep 20, 2008
2. Sep 21, 2008

### morphism

For (1), sum over i and switch the order of summation:

$$\sum_i \left| \sum_j a_{ij} x_j \right| \leq \sum_i \sum_j |a_{ij}| |x_j| = \sum_j |x_j| \sum_i |a_{ij}| \leq \left( \max_j \sum_i |a_{ij}| \right) \sum_j |x_j|.$$

Use this to conclude that $\|A\|_1 \leq \max_j \sum_i |a_{ij}|$. Can you find unit vectors to help guarantee that equality is achieved?

A similar technique can be applied to (3).

As for (2), note that A*A is Hermitian, so we can unitarily diagonalize it. Let B={v_1, ..., v_n} be an orthonormal basis for C^n consisting of eigenvectors of A*A. Writing $x=k_1 v_1+\cdots+k_n v_n$, we see that (with respect to the basis B)

$$x^*A^*Ax = (k_1^*, \ldots, k_n^*) \left(\begin{array}{cccc} \lambda_1 \\ & \lambda_2 \\ & & \ddots \\ & & & \lambda_n \end{array}\right) \left(\begin{array}{c} k_1 \\ \vdots \\ k_n \end{array}\right) = \sum_j \lambda_j |k_j|^2.$$

Can you take it from here?
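This identity is easy to verify numerically: expand a vector in the orthonormal eigenbasis of $A^* A$ and compare the two sides (the complex matrix below is a random example):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
M = A.conj().T @ A                        # Hermitian, so eigh applies

lam, V = np.linalg.eigh(M)                # columns of V: orthonormal eigenvectors
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
k = V.conj().T @ x                        # coordinates k_j of x in the eigenbasis

lhs = (x.conj() @ M @ x).real             # x* A* A x (real, since M is Hermitian)
rhs = (lam * np.abs(k)**2).sum()          # sum_j lambda_j |k_j|^2
print(np.isclose(lhs, rhs))
```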

3. Sep 21, 2008

### Pacopag

Yes! That is a tremendous help. Thank you very much.
Is the answer to your question "Can you find unit vectors to help guarantee that equality is achieved?" a vector
$$x=\left(\begin{array}{c} 0 \\ 0 \\ \vdots \\ 0 \\ 1 \\ 0 \\ \vdots \\ 0 \end{array}\right)$$
where the 1 is in the j-th row, with j the index attaining the maximum?

4. Sep 21, 2008

### morphism

Yup, that'll get you that $\sum_i |a_{ij}| \leq \|A\|_1$ for each j.
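Numerically, the standard basis vector for the maximizing column attains the bound exactly, confirming that the supremum is achieved (random test matrix, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 5))

j_star = np.abs(A).sum(axis=0).argmax()   # column with the largest absolute sum
e = np.zeros(5)
e[j_star] = 1.0                           # standard basis vector, ||e||_1 = 1

# ||A e||_1 = sum_i |a_{i,j*}|, which equals the 1-norm of A
assert np.isclose(np.abs(A @ e).sum(), np.linalg.norm(A, 1))
```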