Nonsingular perturbed identity matrix

In summary, we want to show that if A = I + uv* is nonsingular, where u and v are m-dimensional vectors, then its inverse can be written as I + αuv* for some scalar α. The original attempt noted that A, being nonsingular, has rank m, and that uv* is a square matrix, and from this incorrectly concluded that A is orthogonal; the calculation also misapplied the property (ST)* = T*S*. The correct approach is to test the equation (I + uv*)(I + αuv*)x = x directly for any vector x, which leads to the solution α = -1/(1 + v*u).
  • #1
Codezion

Homework Statement


If A = I + uv*, where u and v are m-vectors and A is known to be nonsingular, show that the inverse of A is I + αuv*, where α is a scalar value.


Homework Equations





The Attempt at a Solution


Since A is nonsingular, we know the rank of A is m.
Since both u and v are vectors of dimension m, we know that uv* is a square m × m matrix.

From the above statements, we conclude that A is an orthogonal matrix, which implies:

A^{-1} = A*
= (I + uv*)*
= I + (uv*)*
= I + u*v
= I + α, since u*v is a scalar value

As you can see, I am arriving at the wrong conclusion. What am I missing? Thank you!
 
  • #2
I note two problems:

(1) I have no idea how you conclude A should be orthogonal.
(2) (AB)* = B*A*
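
(To see point (2) concretely, here is a minimal NumPy check; the random complex 3 × 3 matrices are arbitrary choices for illustration, not part of the thread.)

```python
import numpy as np

rng = np.random.default_rng(0)
S = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
T = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

# The conjugate transpose reverses the order of a product.
lhs = (S @ T).conj().T
print(np.allclose(lhs, T.conj().T @ S.conj().T))  # True:  (ST)* = T*S*
print(np.allclose(lhs, S.conj().T @ T.conj().T))  # False in general: (ST)* != S*T*
```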
 
  • #3
(1) - I think one of the definitions of an orthogonal matrix is a square matrix whose columns are all linearly independent, so that they form a basis for an m-dimensional space. One of the properties of a nonsingular matrix is that it has maximum rank (in this case m, since it is a square matrix). These statements led me to conclude that our matrix A is orthogonal!

(2) - That is a correct property...did I miss something?
 
  • #4
Codezion said:
(1) - I think one of the definitions of an orthogonal matrix is a square matrix whose columns are all linearly independent, so that they form a basis for an m-dimensional space.
You forgot a key word in that characterization: they have to form an orthonormal basis.

(2) - That is a correct property...did I miss something?
(Lemme switch to S and T instead of A and B, to eliminate confusion)

Yes. The property is (ST)*=T*S*, but you used it in your calculation as (ST)*=S*T*.
 
  • #5
Wow! Thank you, I missed that! I was assuming (AB)* = A*B* all along, even after I read your post! Thank you.

If A is not an orthogonal matrix, then I guess I am back to square one :(
 
  • #6
Simple is often best, even if it's just to get ideas. Rather than looking for a clever argument, why not test directly whether or not this alleged inverse of A really is one?
 
  • #7
The matrices are inverses if (I + uv*)(I + alpha uv*)x = x for any vector x. Expand it out and try to figure out a value of alpha that will make that true.
 
  • #8
Thank you! Figured it out :)

(I + uv*)(I + alpha uv*)x = x
(I + alpha uv* + uv* + alpha uv*uv*)x = x
v*u is a scalar; call it beta, so uv*uv* = (v*u)uv* = beta uv*
(I + alpha uv* + uv* + alpha beta uv*)x = x
[I + (alpha + 1 + alpha beta)uv*]x = x
This implies alpha + 1 + alpha beta = 0,
so alpha = -1/(1 + beta).
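
(A quick numerical check of this result: a minimal NumPy sketch, where the dimension m = 5 and the random complex vectors are arbitrary choices for illustration.)

```python
import numpy as np

m = 5
rng = np.random.default_rng(1)
u = rng.normal(size=(m, 1)) + 1j * rng.normal(size=(m, 1))
v = rng.normal(size=(m, 1)) + 1j * rng.normal(size=(m, 1))

A = np.eye(m) + u @ v.conj().T        # A = I + uv*
beta = (v.conj().T @ u).item()        # beta = v*u, a scalar
alpha = -1.0 / (1.0 + beta)           # the value derived above

A_inv = np.eye(m) + alpha * (u @ v.conj().T)
print(np.allclose(A @ A_inv, np.eye(m)))  # True: I + alpha uv* inverts A
```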
 
  • #9
Note that x was really just a placebo. You could have gone straight to the definition of inverse, and put an I on the right hand side, without any of the x's, and done exactly the same thing.
 
  • #10
I like the expression (placebo). Yup, thanks Hurkyl. Thanks so much.
 
  • #11
If beta = -1 (which would imply A is singular), what would the nullspace of A be? I got only as far as writing the equation as:

(I+uv*)x = 0
x + uv*x = 0...

Do you have any suggestions?
 
  • #12
Well, if it's going to be at all easy to figure out, it's probably going to be some expression involving u or v. (I+uv*)x = 0 is a linear equation, so I would start by plugging in a few things, and see if I can combine them to make 0.
 
  • #13
Codezion said:
If beta = -1 (which would imply A is singular), what would the nullspace of A be? I got only as far as writing the equation as:

(I+uv*)x = 0
x + uv*x = 0...

Do you have any suggestions?

Sure you don't want to take a crack at this without a hint? It's pretty easy. Remember beta = v*u = -1.
 
  • #14
Thanks a lot for your help guys.
Dick, I think it could be the pressure:), but I still cannot see it. Here is what I did

v*x is some constant beta,
x + uv*x = x + beta u
=> x = -beta u
Therefore, I concluded x is a scalar multiple of u.

This is what I came up with, but it looks dodgy because I somewhat defined beta in terms of x itself...?
 
  • #15
OH! I see it now :) if x = u, then

x + uv*x = u + u(v*u) = u - u = 0

You are right Dick, it was easy! :)

Thanks a bunch!
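
(A numerical sanity check of that null vector: a sketch where I rescale a random v so that beta = v*u = -1. The construction is my own choice, not from the thread.)

```python
import numpy as np

m = 4
rng = np.random.default_rng(2)
u = rng.normal(size=(m, 1))
v = rng.normal(size=(m, 1))
v = -v / (v.T @ u).item()      # rescale so that beta = v*u = -1

A = np.eye(m) + u @ v.T        # A = I + uv* is now singular
print(np.allclose(A @ u, 0))   # True: (I + uv*)u = u + u(v*u) = u - u = 0
```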
 
  • #16
Codezion said:
This is what I came up with, but it looks dodgy because I somewhat defined beta with x itself..?
I think you confused yourself, because you now have two variables named beta! One of them was the inner product of u with v, and the other is the inner product of x with v.

But no matter; you can test directly if u is a right null-vector or not (I now see you've done that). Of course, you'll have to clean up your proof if you want to claim that the right nullspace is one-dimensional.

(p.s. as an alternative, there is a rank-counting argument that tells you that the rank of A cannot be less than m-1. Can you see it? Hint: it might be easier to think instead about the equation I = A - uv*)
 

What is a nonsingular perturbed identity matrix?

A nonsingular perturbed identity matrix is a square matrix with ones along the main diagonal and small nonzero values in the off-diagonal entries. It is used to represent a small variation or perturbation from an identity matrix.

Why is a nonsingular perturbed identity matrix important in scientific research?

Nonsingular perturbed identity matrices are commonly used in scientific research to model small changes or errors in data or measurements. They are also useful in numerical analysis and optimization algorithms.

How is a nonsingular perturbed identity matrix created?

A nonsingular perturbed identity matrix is created by taking an identity matrix and adding small nonzero values to the off-diagonal entries. These values can be chosen randomly or based on a specific pattern or distribution.
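
(As an illustration of that recipe, here is one way such a matrix might be built in NumPy; the dimension, perturbation scale, and distribution are arbitrary choices, not a standard construction.)

```python
import numpy as np

m = 4
eps = 1e-3                            # perturbation scale (arbitrary)
rng = np.random.default_rng(3)
E = eps * rng.normal(size=(m, m))     # small random entries
np.fill_diagonal(E, 0.0)              # keep exact ones on the main diagonal
A = np.eye(m) + E                     # nonsingular perturbed identity

# For small enough eps, A is strictly diagonally dominant, hence nonsingular.
print(np.linalg.matrix_rank(A) == m)  # True
```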

What is the difference between a nonsingular perturbed identity matrix and a singular perturbed identity matrix?

A nonsingular perturbed identity matrix has a nonzero determinant and is therefore invertible, while a singular perturbed identity matrix has a determinant of zero and is not invertible.

In what fields or applications are nonsingular perturbed identity matrices commonly used?

Nonsingular perturbed identity matrices are commonly used in fields such as statistics, engineering, computer science, and physics. They are particularly useful in applications where small variations or errors need to be considered, such as in data analysis, simulation, and optimization.
