Is the Induced Weighted Matrix Norm Equal to WAW^-1?

Homework Help Overview

The discussion revolves around the properties of induced weighted matrix norms, specifically the relationship between the induced weighted matrix norm of a matrix ##A## and the expression ##||WAW^{-1}||##, where ##W## is an invertible matrix. Participants work from the definitions of these norms.

Discussion Character

  • Conceptual clarification, mathematical reasoning, assumption checking

Approaches and Questions Raised

  • Participants attempt to derive the equality ##||A||_W = ||WAW^{-1}||## by manipulating the definitions of the norms. They discuss properties of the supremum, asking how the inequality ##||A||_W \leq ||WAW^{-1}||## is established and whether the reverse inequality holds. Some express uncertainty about specific steps in the reasoning.

Discussion Status

The discussion is ongoing, with participants actively reasoning about the properties of norms and the supremum. Some have provided insights into the relationship between the norms, while others are seeking clarification on specific steps in the derivation. There is a recognition of the need to establish both directions of the inequality.

Contextual Notes

Participants are working under the constraints of a homework assignment, which may limit the information available for discussion. There is an emphasis on understanding the definitions and properties of norms without arriving at a definitive conclusion.

pyroknife

Homework Statement


The weighted vector norm is defined as
##||x||_W = ||Wx||##,
where ##W## is an invertible matrix.

The induced weighted matrix norm is the matrix norm induced by the above vector norm:
##||A||_W = \sup_{x\neq 0} \frac{||Ax||_W}{||x||_W}##,
where ##A## is a matrix.

Need to show ##||A||_W = ||WAW^{-1}||##

Homework Equations

The Attempt at a Solution


##||A||_W = \sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = \sup_{x\neq 0} \frac{||WAx||}{||Wx||} = \sup_{x\neq 0} \frac{||WAW^{-1}Wx||}{||Wx||}##

For an induced norm we know that:
##\frac{||WAW^{-1}Wx||}{||Wx||} \leq \frac{||WAW^{-1}||\,||Wx||}{||Wx||} = ||WAW^{-1}||##

Here is where I am lost. I have already gotten the expression into the desired form, but I do not know how to make the connection between ##\sup_{x\neq0}## and the ##\leq##. My thought is that since we are taking the supremum of ##\frac{||Ax||_W}{||x||_W}##, its maximum possible value is ##||WAW^{-1}||##, and thus
##||A||_W = ||WAW^{-1}||.##
Is this the right logic?
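As a side note on the supremum step (a standard fact, not specific to this thread): a bound that holds for every ##x \neq 0## passes to the supremum, but by itself this gives only one direction of the equality:

```latex
% If f(x) <= M for every x in S, then sup_{x in S} f(x) <= M
% (M is an upper bound, and the sup is the least upper bound).
% Applied here:
\frac{\|WAW^{-1}Wx\|}{\|Wx\|} \le \|WAW^{-1}\| \quad \text{for all } x \neq 0
\;\Longrightarrow\;
\|A\|_W = \sup_{x \neq 0} \frac{\|WAW^{-1}Wx\|}{\|Wx\|} \le \|WAW^{-1}\|.
% Equality additionally requires the reverse inequality >=.
```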
 
The inequality holds for all ##x \neq 0## and therefore for the supremum, too.
 
fresh_42 said:
The inequality holds for all ##x \neq 0## and therefore for the supremum, too.
Yes, I understand this, but I don't think it answers my question. Or maybe I do not actually understand?
 
pyroknife said:
##||A||_W = \sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = \sup_{x\neq 0} \frac{||WAx||}{||Wx||} = \sup_{x\neq 0} \frac{||WAW^{-1}Wx||}{||Wx||}##

For an induced norm we know that:
##\frac{||WAW^{-1}Wx||}{||Wx||} \leq \frac{||WAW^{-1}||\,||Wx||}{||Wx||} = ||WAW^{-1}||##

Putting it together you already have:

##||A||_W = \sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = \sup_{x\neq 0} \frac{||WAx||}{||Wx||} = \sup_{x\neq 0} \frac{||WAW^{-1}Wx||}{||Wx||}##

## \leq \sup_{x\neq 0} \frac{||WAW^{-1}||\,||Wx||}{||Wx||} = \sup_{x\neq 0} ||WAW^{-1}|| = ||WAW^{-1}||##

At least I cannot see why this shouldn't hold.
 
fresh_42 said:
Putting it together you already have:

##||A||_W = \sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = \sup_{x\neq 0} \frac{||WAx||}{||Wx||} = \sup_{x\neq 0} \frac{||WAW^{-1}Wx||}{||Wx||}##

## \leq \sup_{x\neq 0} \frac{||WAW^{-1}||\,||Wx||}{||Wx||} = \sup_{x\neq 0} ||WAW^{-1}|| = ||WAW^{-1}||##

At least I cannot see why this shouldn't hold.
Yes. I don't see anything wrong with those steps either, but I am not sure how to eventually get to the answer
##||A||_W = ||WAW^{-1}||##. Right now we have it only in the form ##||A||_W \leq ||WAW^{-1}||##.
 
Right. I've forgotten the other direction. Give me some time to think about it.
 
fresh_42 said:
Right. I've forgotten the other direction. Give me some time to think about it.
Thanks. I think it is some property of the "supremum" that I may be missing.
 
If we have ##||A|| = \sup_{x\neq 0} \frac{||Ax||}{||x||}##, which is true for the matrix norm induced by the vector norm, then it's easy (I think).

##||A||_W = \sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = \sup_{x\neq 0} \frac{||WAx||}{||Wx||} = \sup_{y\neq 0} \frac{||WAW^{-1}y||}{||y||} = ||WAW^{-1}||,##
where the substitution ##y = Wx## ranges over all nonzero vectors because ##W## is invertible.
 
fresh_42 said:
If we have ##||A|| = \sup_{x\neq 0} \frac{||Ax||}{||x||}##, which is true for the matrix norm induced by the vector norm, then it's easy (I think).

##||A||_W = \sup_{x\neq 0} \frac{||Ax||_W}{||x||_W} = \sup_{x\neq 0} \frac{||WAx||}{||Wx||} = \sup_{y\neq 0} \frac{||WAW^{-1}y||}{||y||} = ||WAW^{-1}||##
Hmm, I do not understand the step
##\sup_{y\neq0}\frac{||WAW^{-1}y||}{||y||} = ||WAW^{-1}||##

Could you explain the equality?
 
pyroknife said:
Hmm, I do not understand the step
##\sup_{y\neq0}\frac{||WAW^{-1}y||}{||y||} = ||WAW^{-1}||##

Could you explain the equality?
The same as in the assumption that ##||A|| = \sup_{x\neq 0}\frac{||Ax||}{||x||}##, only with ##||WAW^{-1}||## in place of ##||A||## and ##y## in place of ##x##.
The only question is whether this holds for the particular matrix norm given to you, but it usually does.
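The two-sided argument above can be checked numerically. Here is a small sketch (not from the thread; the helper functions and the example matrices ##W## and ##A## are arbitrary illustrative choices) using the induced 1-norm, ##||M||_1 =## maximum absolute column sum, for which the supremum is attained at a standard basis vector in the ##y = Wx## coordinates:

```python
# Numerical sanity check of ||A||_W = ||W A W^{-1}|| using the induced 1-norm.
# All helper names and the example matrices W, A are illustrative, not from
# the thread.
import random

def mat_vec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

def mat_mul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

def inv2(M):
    # inverse of a 2x2 matrix
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def vec_norm1(x):
    return sum(abs(v) for v in x)

def mat_norm1(M):
    # induced 1-norm: maximum absolute column sum
    return max(sum(abs(row[j]) for row in M) for j in range(len(M[0])))

W = [[2.0, 1.0], [0.0, 3.0]]    # invertible weight matrix
A = [[1.0, -2.0], [4.0, 0.5]]
Winv = inv2(W)
target = mat_norm1(mat_mul(mat_mul(W, A), Winv))   # ||W A W^{-1}||_1

# Direction <=: for every x != 0, ||Ax||_W / ||x||_W <= ||W A W^{-1}||_1.
random.seed(0)
for _ in range(1000):
    x = [random.uniform(-1.0, 1.0), random.uniform(-1.0, 1.0)]
    ratio = vec_norm1(mat_vec(W, mat_vec(A, x))) / vec_norm1(mat_vec(W, x))
    assert ratio <= target + 1e-9

# Direction >=: the substitution y = Wx reaches the supremum; for the 1-norm
# it is attained at a standard basis vector y = e_j, i.e. x = W^{-1} e_j.
best = max(
    vec_norm1(mat_vec(W, mat_vec(A, mat_vec(Winv, e))))
    / vec_norm1(mat_vec(W, mat_vec(Winv, e)))
    for e in ([1.0, 0.0], [0.0, 1.0])
)
print(abs(best - target) < 1e-9)  # prints True
```

The 1-norm is used only because its supremum is attained at a basis vector, making the "reverse" direction easy to exhibit; the identity itself holds for any vector norm and its induced matrix norm.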
 