Is the Induced Weighted Matrix Norm Equal to ||WAW^-1||?

In summary: substitute ##y = Wx##; since ##W## is invertible, ##y## ranges over all nonzero vectors exactly when ##x## does, so ##||A||_W = \sup_{x\neq 0}\frac{||WAx||}{||Wx||} = \sup_{y\neq 0}\frac{||WAW^{-1}y||}{||y||} = ||WAW^{-1}||## by the definition of the induced norm.
  • #1
pyroknife

Homework Statement


The weighted vector norm is defined as
##||x||_W = ||Wx||##.
W is an invertible matrix.

The induced weighted matrix norm is induced by the above vector norm and is written as:
##||A||_W = \sup_{x\neq 0} \frac{||Ax||_W}{||x||_W}##
A is a matrix.

Need to show ##||A||_W = ||WAW^{-1}||##

Homework Equations

The Attempt at a Solution


##||A||_W = \sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = \sup_{x\neq 0} \frac{||WAx||}{||Wx||} = \sup_{x\neq 0} \frac{||WAW^{-1}Wx||}{||Wx||}##

For an induced norm we know that:
##\frac{||WAW^{-1}Wx||}{||Wx||} \leq \frac{||WAW^{-1}||\,||Wx||}{||Wx||} = ||WAW^{-1}||##

Here is where I am lost. I have already gotten the expression into the desired form, but I do not know how to make the connection between ##\sup_{x\neq 0}## and the ##\leq##. My thought is that since we are taking the supremum of ##\frac{||Ax||_W}{||x||_W}##, its maximum possible value is ##||WAW^{-1}||##,
and thus
##||A||_W=||WAW^{-1}||##
Is this the right logic?
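(As a quick numerical sanity check of the target identity, outside the proof itself: assuming the underlying norm is the Euclidean 2-norm, one can estimate the supremum by sampling and compare it to ##||WAW^{-1}||_2##. The random matrices below are arbitrary examples.)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
W = rng.standard_normal((n, n))   # a random W is invertible with probability 1
W_inv = np.linalg.inv(W)

# Estimate ||A||_W = sup_{x != 0} ||W A x|| / ||W x|| by random sampling.
xs = rng.standard_normal((10000, n))
ratios = np.linalg.norm(xs @ (W @ A).T, axis=1) / np.linalg.norm(xs @ W.T, axis=1)
estimate = ratios.max()

# For the induced 2-norm, ||W A W^{-1}|| is the largest singular value.
exact = np.linalg.norm(W @ A @ W_inv, 2)
print(estimate, exact)   # the sampled sup is <= exact and close to it
```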
 
  • #2
The inequality holds for all ##x \neq 0## and therefore for the supremum, too.
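In symbols, since this is the step in question: for any function ##f## and constant ##C##,
$$f(x) \leq C \ \text{ for all } x \neq 0 \quad\Longrightarrow\quad \sup_{x\neq 0} f(x) \leq C,$$
here with ##f(x) = \frac{||WAW^{-1}Wx||}{||Wx||}## and ##C = ||WAW^{-1}||##.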
 
  • #3
fresh_42 said:
The inequality holds for all ##x \neq 0## and therefore for the supremum, too.
Yes, I understand this, but I don't think it answers my question. Or maybe I do not actually understand?
 
  • #4
pyroknife said:
##||A||_W = \sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = \sup_{x\neq 0} \frac{||WAx||}{||Wx||} = \sup_{x\neq 0} \frac{||WAW^{-1}Wx||}{||Wx||}##

For an induced norm we know that:
##\frac{||WAW^{-1}Wx||}{||Wx||} \leq \frac{||WAW^{-1}||\,||Wx||}{||Wx||} = ||WAW^{-1}||##

Putting it together you already have:

##||A||_W = \sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = \sup_{x\neq 0} \frac{||WAx||}{||Wx||} = \sup_{x\neq 0} \frac{||WAW^{-1}Wx||}{||Wx||}##

## \leq \sup_{x\neq 0} \frac{||WAW^{-1}||\,||Wx||}{||Wx||} = \sup_{x\neq 0} ||WAW^{-1}|| = ||WAW^{-1}||##

At least I cannot see why this shouldn't hold.
 
  • #5
fresh_42 said:
Putting it together you already have:

##||A||_W = \sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = \sup_{x\neq 0} \frac{||WAx||}{||Wx||} = \sup_{x\neq 0} \frac{||WAW^{-1}Wx||}{||Wx||}##

## \leq \sup_{x\neq 0} \frac{||WAW^{-1}||\,||Wx||}{||Wx||} = \sup_{x\neq 0} ||WAW^{-1}|| = ||WAW^{-1}||##

At least I cannot see why this shouldn't hold.
Yes. I don't see anything wrong with those steps either, but I am not sure how to eventually get to the answer ##||A||_W = ||WAW^{-1}||##. Right now we only have it in the form ##||A||_W \leq ||WAW^{-1}||##.
 
  • #6
Right. I've forgotten the other direction. Give me some time to think about it.
 
  • #7
fresh_42 said:
Right. I've forgotten the other direction. Give me some time to think about it.
Thanks. I think it is some property of the "supremum" that I may be missing.
 
  • #8
If we have ##||A|| = \sup_{x\neq 0} \frac{||Ax||}{||x||}##, which is true for the matrix norm induced by the vector norm, then it's easy (I think).

##||A||_W = \sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = \sup_{x\neq 0} \frac{||WAx||}{||Wx||} = \sup_{y\neq 0} \frac{||WAW^{-1}y||}{||y||} = ||WAW^{-1}||##

(substituting ##y = Wx##, which ranges over all nonzero vectors exactly when ##x## does, since ##W## is invertible)
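To make the substitution step concrete, here is a small numerical illustration, assuming the underlying norm is the 2-norm: for any fixed ##x##, the ratio for ##A## at ##x## equals the ratio for ##WAW^{-1}## at ##y = Wx##, so the two suprema agree.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
W = rng.standard_normal((n, n))
B = W @ A @ np.linalg.inv(W)   # B = W A W^{-1}

x = rng.standard_normal(n)
y = W @ x                      # the substitution y = Wx (a bijection since W is invertible)

lhs = np.linalg.norm(W @ A @ x) / np.linalg.norm(W @ x)
rhs = np.linalg.norm(B @ y) / np.linalg.norm(y)
print(np.isclose(lhs, rhs))    # True: identical ratios, hence identical suprema
```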
 
  • #9
fresh_42 said:
If we have ##||A|| = \sup_{x\neq 0} \frac{||Ax||}{||x||}##, which is true for the matrix norm induced by the vector norm, then it's easy (I think).

##||A||_W = \sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = \sup_{x\neq 0} \frac{||WAx||}{||Wx||} = \sup_{y\neq 0} \frac{||WAW^{-1}y||}{||y||} = ||WAW^{-1}||##
Hmm, I do not understand the step
##\sup_{y\neq 0}\frac{||WAW^{-1}y||}{||y||} = ||WAW^{-1}||##

Could you explain the equality?
 
  • #10
pyroknife said:
Hmm, I do not understand the step
##\sup_{y\neq 0}\frac{||WAW^{-1}y||}{||y||} = ||WAW^{-1}||##

Could you explain the equality?
It is the same as the assumed identity ##||A|| = \sup_{x\neq 0}{\frac{||Ax||}{||x||}}##, only with ##||WAW^{-1}||## in place of ##||A||## and ##y## in place of ##x##: apply the definition of the induced norm to the matrix ##WAW^{-1}## itself.
The only question is whether this holds for the particular matrix norm given to you, but it usually does.
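For completeness, the whole argument in one chain (a compact restatement of posts #1 and #8, with the substitution made explicit):
$$||A||_W = \sup_{x\neq 0} \frac{||WAx||}{||Wx||} = \sup_{y\neq 0} \frac{||WAW^{-1}y||}{||y||} = ||WAW^{-1}||,$$
where the middle equality substitutes ##y = Wx##, a bijection of the nonzero vectors because ##W## is invertible, so both suprema range over exactly the same set of ratios; the last equality is the definition of the induced norm applied to the matrix ##WAW^{-1}##.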
 

What is an induced weighted matrix norm?

An induced weighted matrix norm is the matrix norm induced by a weighted vector norm ##||x||_W = ||Wx||##, where ##W## is an invertible weight matrix. It generalizes the ordinary induced matrix norm, which corresponds to the unweighted case ##W = I##.

How is an induced weighted matrix norm calculated?

An induced weighted matrix norm is calculated from the defining supremum, ##||A||_W = \sup_{x\neq 0} \frac{||WAx||}{||Wx||}##, which, as shown in the thread above, equals ##||WAW^{-1}||##. The weight matrix can be chosen according to different criteria, such as the relative importance of each component or the desired properties of the norm.
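For instance, with the Euclidean norm as the underlying vector norm, the weighted norm of a concrete matrix can be computed directly (a minimal sketch; the diagonal ##W## below is an arbitrary illustrative choice):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
W = np.diag([10.0, 1.0])   # weight the first component 10x more heavily

# ||A||_W = ||W A W^{-1}||; for the induced 2-norm this is the largest singular value.
weighted = np.linalg.norm(W @ A @ np.linalg.inv(W), 2)
plain = np.linalg.norm(A, 2)
print(weighted, plain)     # the weighting changes the induced norm
```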

Why is an induced weighted matrix norm useful?

An induced weighted matrix norm is useful because it allows a more tailored measurement of the size of a matrix. By weighting the components of a vector differently, it measures how strongly ##A## can amplify vectors when lengths are gauged in the weighted norm, which matters when some components (for example, of an error or a state) are more significant than others.

What are some applications of induced weighted matrix norms?

Induced weighted matrix norms have various applications in fields such as physics, engineering, and computer science. They are used in optimization problems, data analysis, and the study of linear transformations in vector spaces, among others.

Are there different types of induced weighted matrix norms?

Yes, in the sense that each underlying vector norm induces its own weighted matrix norm: the 1-, 2-, and ##\infty##-norms all give different induced norms, and in the 2-norm case ##||A||_W## is the largest singular value of ##WAW^{-1}## (the spectral norm). The Frobenius norm, by contrast, is not an induced norm. The choice of norm depends on the specific problem at hand.
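As an illustration of how the choice of underlying norm matters, the same weighted construction gives different values for the induced 1-, 2-, and ##\infty##-norms (the matrices are the same illustrative examples as above):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
W = np.diag([10.0, 1.0])
B = W @ A @ np.linalg.inv(W)   # the weighted norm of A is the plain induced norm of B

for p in (1, 2, np.inf):       # induced 1-, 2-, and infinity-norms
    print(p, np.linalg.norm(B, p))
```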

