Is the Induced Weighted Matrix Norm Equal to ||WAW^{-1}||?

SUMMARY

The discussion centers on proving that the induced weighted matrix norm ||A||_W equals ||WAW^{-1}||, where W is an invertible matrix. Using the definition of the weighted vector norm, the participants first derive the inequality ||A||_W ≤ ||WAW^{-1}||, and then obtain the equality via the substitution y = Wx, which is valid because W is invertible, confirming that ||A||_W = ||WAW^{-1}||.

PREREQUISITES
  • Understanding of weighted vector norms
  • Knowledge of induced matrix norms
  • Familiarity with properties of invertible matrices
  • Concept of supremum in mathematical analysis
NEXT STEPS
  • Study the properties of induced norms in linear algebra
  • Explore the implications of matrix invertibility on norm calculations
  • Learn about supremum and its applications in functional analysis
  • Investigate examples of weighted norms in practical applications
USEFUL FOR

Mathematicians, students studying linear algebra, and researchers in functional analysis who are interested in matrix norms and their properties.

pyroknife

Homework Statement


The weighted vector norm is defined as
##||x||_W = ||Wx||##.
W is an invertible matrix.

The induced weighted matrix norm is induced by the above vector norm and is written as:
##||A||_W = sup_{x\neq 0} \frac{||Ax||_W}{||x||_W}##
A is a matrix.

Need to show ##||A||_W = ||WAW^{-1}||##
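
As a quick sanity check of the claim (not a proof), here is a minimal numerical sketch. It assumes the underlying vector norm is the Euclidean 2-norm and uses random ##3\times 3## matrices; neither choice is part of the problem statement.

```python
import numpy as np

# Minimal numerical sketch of ||A||_W = ||W A W^{-1}||.
# Assumptions (not in the problem statement): the underlying vector norm is
# the Euclidean 2-norm, and W, A are random 3x3 real matrices.
rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
W = rng.standard_normal((n, n))   # a random matrix is invertible almost surely
W_inv = np.linalg.inv(W)

# Right-hand side: the induced 2-norm of W A W^{-1} (its largest singular value).
rhs = np.linalg.norm(W @ A @ W_inv, 2)

# Left-hand side: approximate sup_{x != 0} ||W A x|| / ||W x|| by sampling many
# random directions; sampling can only underestimate the supremum.
xs = rng.standard_normal((100_000, n))
ratios = np.linalg.norm(xs @ (W @ A).T, axis=1) / np.linalg.norm(xs @ W.T, axis=1)
lhs_estimate = ratios.max()

print(f"||W A W^-1||_2             = {rhs:.6f}")
print(f"sampled sup ||WAx||/||Wx|| = {lhs_estimate:.6f}")  # <= rhs and close to it
```

The sampled ratio should come out slightly below ##||WAW^{-1}||_2## and approach it as more directions are sampled, which is consistent with the identity to be shown.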

Homework Equations

The Attempt at a Solution


##||A||_W = sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = sup_{x\neq 0} \frac{||WAx||}{||Wx||} = sup_{x\neq 0} \frac{||WAW^{-1}Wx||}{||Wx||}##

For an induced norm we know that:
##\frac{||WAW^{-1}Wx||}{||Wx||} \leq \frac{||WAW^{-1}||||Wx||}{||Wx||} = ||WAW^{-1}||##

Here is where I am lost. I have already gotten the expression into the desired form, but I do not know how to make the connection between the ##sup_{x\neq0}## and the ##\leq##. My thought is that since we are taking the supremum of ##\frac{||Ax||_W}{||x||_W}##, its maximum possible value is ##||WAW^{-1}||##,
and thus
##||A||_W=||WAW^{-1}||##
Is this the right logic?
 
The inequality holds for all ##x \neq 0## and therefore for the supremum, too.
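To make the step from "for all ##x \neq 0##" to the supremum explicit: if ##f(x) \leq M## for every ##x \neq 0##, then ##M## is an upper bound of the set ##\{f(x) : x \neq 0\}##, and since the supremum is the least upper bound,

##sup_{x\neq 0} f(x) \leq M.##

Applied with ##f(x) = \frac{||WAW^{-1}Wx||}{||Wx||}## and ##M = ||WAW^{-1}||##, this gives ##||A||_W \leq ||WAW^{-1}||##.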
 
fresh_42 said:
The inequality holds for all ##x \neq 0## and therefore for the supremum, too.
Yes, I understand this, but I don't think this answers my question. Or maybe I do not actually understand?
 
pyroknife said:
##||A||_W = sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = sup_{x\neq 0} \frac{||WAx||}{||Wx||} = sup_{x\neq 0} \frac{||WAW^{-1}Wx||}{||Wx||}##

For an induced norm we know that:
##\frac{||WAW^{-1}Wx||}{||Wx||} \leq \frac{||WAW^{-1}||||Wx||}{||Wx||} = ||WAW^{-1}||##

Putting it together you already have:

##||A||_W = sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = sup_{x\neq 0} \frac{||WAx||}{||Wx||} = sup_{x\neq 0} \frac{||WAW^{-1}Wx||}{||Wx||}##

## \leq sup_{x\neq 0} \frac{||WAW^{-1}||||Wx||}{||Wx||} = sup_{x\neq 0} ||WAW^{-1}|| = ||WAW^{-1}||##

At least I cannot see why this shouldn't hold.
 
fresh_42 said:
Putting it together you already have:

##||A||_W = sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = sup_{x\neq 0} \frac{||WAx||}{||Wx||} = sup_{x\neq 0} \frac{||WAW^{-1}Wx||}{||Wx||}##

## \leq sup_{x\neq 0} \frac{||WAW^{-1}||||Wx||}{||Wx||} = sup_{x\neq 0} ||WAW^{-1}|| = ||WAW^{-1}||##

At least I cannot see why this shouldn't hold.
Yes. I don't see anything wrong with those steps either, but I am not sure how to eventually get to the answer
##||A||_W = ||WAW^{-1}||##. Right now we only have it in the form ##||A||_W \leq ||WAW^{-1}||##.
 
Right. I've forgotten the other direction. Give me some time to think about it.
 
fresh_42 said:
Right. I've forgotten the other direction. Give me some time to think about it.
Thanks. I think it is some property of the "supremum" that I may be missing.
 
If we have ##||A|| = sup_{x\neq 0} \frac{||Ax||}{||x||}## which is true for the matrix norm induced by the vector norm, then it's easy (I think).

##||A||_W = sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = sup_{x\neq 0} \frac{||WAx||}{||Wx||} = sup_{y\neq 0} \frac{||WAW^{-1}y||}{||y||} = ||WAW^{-1}||##
 
fresh_42 said:
If we have ##||A|| = sup_{x\neq 0} \frac{||Ax||}{||x||}## which is true for the matrix norm induced by the vector norm, then it's easy (I think).

##||A||_W = sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = sup_{x\neq 0} \frac{||WAx||}{||Wx||} = sup_{y\neq 0} \frac{||WAW^{-1}y||}{||y||} = ||WAW^{-1}||##
Hmm, I do not understand the step
##sup_{y\neq0}\frac{||WAW^{-1}y||}{||y||} = ||WAW^{-1}||##

Could you explain the equality?
 
pyroknife said:
Hmm, I do not understand the step
##sup_{y\neq0}\frac{||WAW^{-1}y||}{||y||} = ||WAW^{-1}||##

Could you explain the equality?
The same as in the assumption that ##||A|| = sup_{x\neq 0} \frac{||Ax||}{||x||}##, only with ##||WAW^{-1}||## in place of ##||A||## and ##y## in place of ##x##.
The only question is whether this holds for the particular matrix norm you were given, but it usually does.
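
Written out in full, the argument runs as follows (this only expands the substitution above). Since ##W## is invertible, the map ##x \mapsto Wx## sends the nonzero vectors bijectively onto the nonzero vectors, so setting ##y = Wx##:

##||A||_W = sup_{x\neq 0} \frac{||WAx||}{||Wx||} = sup_{x\neq 0} \frac{||WAW^{-1}(Wx)||}{||Wx||} = sup_{y\neq 0} \frac{||WAW^{-1}y||}{||y||} = ||WAW^{-1}||,##

where the last step is exactly the definition of the induced matrix norm, applied to ##WAW^{-1}##. This gives the equality directly, so together with the inequality from the earlier posts it settles ##||A||_W = ||WAW^{-1}||##.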
 
