Nonsingular perturbed identity matrix

  • Thread starter: Codezion
  • Tags: Identity Matrix
SUMMARY

The discussion centers on the nonsingular perturbed identity matrix defined as A = I + uv*, where u and v are vectors of dimension m. Participants clarify that A is not necessarily orthogonal, despite being nonsingular, and that the inverse can be expressed as A⁻¹ = I + αuv*, where α = -1/(1 + β) and β = v*u. The conversation emphasizes the importance of correctly applying matrix properties and definitions, particularly in relation to orthogonality and the calculation of inverses.
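The inverse formula discussed in the thread is easy to sanity-check numerically. A minimal sketch with NumPy, using random real vectors (for real vectors, v* is just the transpose):

```python
import numpy as np

rng = np.random.default_rng(0)
m = 5
u = rng.standard_normal((m, 1))
v = rng.standard_normal((m, 1))

A = np.eye(m) + u @ v.T            # A = I + u v*
beta = (v.T @ u).item()            # beta = v* u, a scalar
assert abs(1.0 + beta) > 1e-12     # A is nonsingular iff 1 + beta != 0
alpha = -1.0 / (1.0 + beta)
A_inv = np.eye(m) + alpha * (u @ v.T)

print(np.allclose(A @ A_inv, np.eye(m)))  # True
```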

PREREQUISITES
  • Understanding of matrix operations, specifically matrix multiplication and addition.
  • Familiarity with concepts of orthogonal matrices and their properties.
  • Knowledge of matrix inverses and the conditions for nonsingularity.
  • Basic linear algebra, including vector spaces and inner products.
NEXT STEPS
  • Study the properties of orthogonal matrices and their implications in linear algebra.
  • Learn about the derivation and application of the Sherman-Morrison formula for matrix inverses.
  • Explore the concept of rank in matrices and its relationship to linear independence.
  • Investigate the implications of singular matrices and their null spaces in linear transformations.
USEFUL FOR

Students and professionals in mathematics, particularly those studying linear algebra, matrix theory, and related fields. This discussion is beneficial for anyone looking to deepen their understanding of matrix inverses and properties of nonsingular matrices.

Codezion

Homework Statement


If A = I + uv*, where u and v are m-vectors and A is known to be nonsingular, show that the inverse of A is I + αuv*, where α is a scalar value.


Homework Equations

The Attempt at a Solution


Since A is nonsingular, we know the rank of A is m.
Since both u and v are vectors of dimension m, we know that uv* is an m×m square matrix.

From the above statements, we conclude that A is an orthogonal matrix, which implies:

A⁻¹
= A*
= (I + uv*)*
= I + (uv*)*
= I + u*v
= I + α, since u*v is a scalar value

As you can see, I am arriving at the wrong conclusion. What am I missing? Thank you!
 
I note two problems:

(1) I have no idea how you conclude A should be orthogonal.
(2) (AB)* = B*A*
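Point (2) is cheap to check numerically. A quick sketch with random complex matrices in NumPy, using `.conj().T` for the conjugate transpose:

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (3, 3)
A = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)
B = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)

def ct(M):
    """Conjugate transpose M*."""
    return M.conj().T

print(np.allclose(ct(A @ B), ct(B) @ ct(A)))  # True: (AB)* = B* A*
print(np.allclose(ct(A @ B), ct(A) @ ct(B)))  # False for generic A, B
```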
 
(1) - I think one of the definitions of an orthogonal matrix is a square matrix whose columns are all linearly independent, so that they form a basis for an m-dimensional space. One of the properties of a nonsingular matrix is that it has maximum rank (in this case m, since it is a square matrix). These statements led me to conclude our matrix A is orthogonal!

(2) - That is a correct property...did I miss something?
 
Codezion said:
(1) - I think one of the definitions of an orthogonal matrix is a square matrix whose columns are all linearly independent, so that they form a basis for an m-dimensional space.
You forgot a key word in that characterization: they have to make an orthonormal basis.

(2) - That is a correct property...did I miss something?
(Lemme switch to S and T instead of A and B, to eliminate confusion)

Yes. The property is (ST)*=T*S*, but you used it in your calculation as (ST)*=S*T*.
 
Wow! Thank you I missed that! I was assuming (AB)* = A*B* all along - even after I read your post! Thank you.

If A is not an orthogonal matrix, then I guess I am back to square one :(
 
Simple is often best, even if it's just to get ideas. Rather than looking for a clever argument, why not test directly whether this alleged inverse of A really is one?
 
The matrices are inverses if (I+uv*)(I+alpha uv*)x=x for any vector x. Expand it out and try to figure out a value of alpha that will make that true.
 
Thank you! Figured it out :)

(I + uv*)(I + alpha uv*)x = x
(I + alpha uv* + uv* + alpha uv*uv*)x = x
v*u is a scalar; let it be beta
(I + alpha uv* + uv* + alpha beta uv*)x = x
[I + (alpha + 1 + alpha beta)uv*]x = x
This implies (alpha + 1 + alpha beta) = 0!
alpha = -1/(1 + beta)!
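The last step can also be checked symbolically. A small sketch with SymPy (assuming SymPy is available), solving the coefficient equation from the expansion above:

```python
import sympy as sp

alpha, beta = sp.symbols('alpha beta')

# Coefficient of u v* in the product (I + u v*)(I + alpha u v*)
# must vanish for the product to equal I:
coeff = alpha + 1 + alpha * beta

sol = sp.solve(sp.Eq(coeff, 0), alpha)
print(sol)  # the single solution -1/(1 + beta)
```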
 
Note that x was really just a placebo. You could have gone straight to the definition of inverse, and put an I on the right hand side, without any of the x's, and done exactly the same thing.
 
  • #10
I like the expression (placebo). Yup, thanks Hurkyl. Thanks so much.
 
  • #11
If beta = -1 (which would imply A is singular), what would the nullspace of A be? I got only as far as writing the equation as:

(I+uv*)x = 0
x + uv*x = 0...

Do you have any suggestions?
 
  • #12
Well, if it's going to be at all easy to figure out, it's probably going to be some expression involving u or v. (I+uv*)x = 0 is a linear equation, so I would start by plugging in a few things, and see if I can combine them to make 0.
 
  • #13
Codezion said:
If beta = -1 (which would imply A is singular), what would the nullspace of A be? I got only as far as writing the equation as:

(I+uv*)x = 0
x + uv*x = 0...

Do you have any suggestions?

Sure you don't want to take a crack at this without a hint? It's pretty easy. Remember beta=v*u=(-1).
 
  • #14
Thanks a lot for your help guys.
Dick, I think it could be the pressure:), but I still cannot see it. Here is what I did

v*x is some constant beta,
x + uv*x = x + beta u
Setting this to zero gives x = -beta u.
Therefore, I concluded x is a scalar multiple of u.

This is what I came up with, but it looks dodgy because I somewhat defined beta with x itself..?
 
  • #15
OH! I see it now :) if x = u, then

x + uv*x = u + u(v*u) = u + (-1)u = 0

You are right Dick, it was easy! :)

Thanks a bunch!
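The beta = -1 case is also quick to verify numerically. A sketch in NumPy, where v is rescaled so that v*u = -1 exactly (up to rounding):

```python
import numpy as np

rng = np.random.default_rng(2)
m = 4
u = rng.standard_normal((m, 1))
w = rng.standard_normal((m, 1))
v = -w / (w.T @ u).item()        # rescale so that beta = v* u = -1

A = np.eye(m) + u @ v.T          # now singular, since 1 + beta = 0

print(np.isclose((v.T @ u).item(), -1.0))  # True: beta = -1
print(np.allclose(A @ u, 0))               # True: u is a null vector of A
```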
 
  • #16
Codezion said:
This is what I came up with, but it looks dodgy because I somewhat defined beta with x itself..?
I think you confused yourself, because you now have two variables named beta! One of them was the inner product of u with v, and the other is the inner product of x with v.

But no matter; you can test directly if u is a right null-vector or not (I now see you've done that). Of course, you'll have to clean up your proof if you want to claim that the right nullspace is one-dimensional.

(p.s. as an alternative, there is a rank-counting argument that tells you that the rank of A cannot be less than m - 1. Can you see it? Hint: it might be easier to think instead of the equation I = A - uv*)
 
