Calculating the Square Root of a Self-Adjoint Positive Definite Matrix

In summary: Multiplying both sides of [tex]CS = SD[/tex] on the right by [tex]S^{-1}[/tex] gives the diagonalization [tex]C = SDS^{-1}[/tex]. It's a two-step process: first find the eigenvalues and eigenvectors, then use them to construct S and D. This works for any square matrix with a full set of linearly independent eigenvectors; if the eigenvectors are not independent, then S is not invertible.
  • #1
Redsummers

Homework Statement


The matrix [tex]C[/tex] is self-adjoint and positive definite on [tex]\mathbb{ C}^2[/tex]. Determine [tex]\sqrt{C} \in \mathbb{ C}^{2\times 2} [/tex].

[tex]C=\left(\begin{array}{cc}5&-4i\\4i&5\end{array}\right)[/tex]


The Attempt at a Solution



Characteristic polynomial:
[tex](5-\lambda)^2 - 16 = 0[/tex]

What I have done so far is solve for lambda, giving me 9 and 3 respectively. But then I don't really know what to do with those eigenvalues. Do I have to find the eigenvectors as well? In that case, what should I do with them?

I know that at the end I should get something in the following form to get the square root of the matrix:

[tex]\sqrt{C} = a\cdot\left(\begin{array}{cc}5&-4i\\4i&5\end{array}\right) + b\cdot \left(\begin{array}{cc}1&0\\0&1\end{array}\right)[/tex]

But I don't know how to figure out both a and b.

Any help will be appreciated. I have been looking for tutorials on Google, but I couldn't find any covering the procedure for the square root of a matrix.
 
  • #2
Oh, sorry: the eigenvalues that solve the characteristic polynomial are 9 and 1, respectively ^^
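
For reference, a quick numerical check (a minimal sketch assuming NumPy; numpy.linalg.eigvalsh is the eigenvalue routine for Hermitian matrices) agrees:

[code]
import numpy as np

# The given self-adjoint matrix C
C = np.array([[5, -4j],
              [4j, 5]])

# eigvalsh assumes a Hermitian input and returns the (real)
# eigenvalues in ascending order
print(np.linalg.eigvalsh(C))  # [1. 9.]
[/code]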
 
  • #3
If you can factor the matrix as

[tex]C = SDS^{-1}[/tex]

where D is diagonal, then do you know how to find [itex]\sqrt{C}[/itex]?
 
  • #4
Yeah, I do know how to get the diagonal matrix. I guess I'd have to get the 2 eigenvectors and I would be fine, but then again I still don't know what I would have to do with the diagonal matrix to find [tex]\sqrt{C}[/tex] :shrug:
 
Last edited:
  • #5
Well, surely you know how to find [itex]\sqrt{D}[/itex], assuming the entries of [itex]D[/itex] are positive. What happens if you square

[tex]S\sqrt{D}S^{-1}[/tex]?
 
  • #6
Oh, well, squaring it would just give:
[tex]S\sqrt{D}S^{-1}S\sqrt{D}S^{-1}[/tex]

Leading us to:

[tex]SDS^{-1}[/tex] again. Hmm.
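
(Written out in full, the middle factors collapse:

[tex]\left(S\sqrt{D}S^{-1}\right)^2 = S\sqrt{D}\left(S^{-1}S\right)\sqrt{D}S^{-1} = S\sqrt{D}\sqrt{D}S^{-1} = SDS^{-1}[/tex]

since [itex]S^{-1}S = I[/itex] and [itex]\sqrt{D}\sqrt{D} = D[/itex].)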
 
  • #7
Redsummers said:
Oh, well, squaring it would just give:
[tex]S\sqrt{D}S^{-1}S\sqrt{D}S^{-1}[/tex]

Leading us to:

[tex]SDS^{-1}[/tex] again. Hmm.

Right, so

[tex]S\sqrt{D}S^{-1}[/tex]

is a square root of

[tex]SDS^{-1}[/tex]

So if you can factor

[tex]C = SDS^{-1}[/tex]

then it's easy to find a square root. (By the way, is it "a" square root or "the" square root? i.e., is it unique?)

So how do you find S and D such that

[tex]C = SDS^{-1}[/tex]?

Hint: eigenvalues and eigenvectors.
 
  • #8
Hmm, I am afraid it may be 'the' square root, since I didn't see anything like this in the Seminar class.

Well, to find D, for instance, I would just plug the eigenvalues (9 and 1) into the diagonal, and for S, I'd find the eigenvectors and plug them in as columns.
But to be honest, in the lecture they gave this procedure (which I don't understand; even though it looks easy, I'd like to know why it's done this way...):

9a + b = 3
1a + b = 1

Any suggestions about this system of equations?
 
  • #9
OK, I worked out what that procedure is doing. This is interesting; I haven't seen it before.

Suppose [itex]D[/itex] is a diagonal matrix consisting of the eigenvalues (which are presumed positive) of [itex]C[/itex], and [itex]\sqrt{D}[/itex] is the square root of [itex]D[/itex].

Then the equation

[tex]aD + bI = \sqrt{D}[/tex]

is essentially two equations and two unknowns (since the off-diagonals of [itex]D[/itex], [itex]I[/itex], and [itex]\sqrt{D}[/itex] are zero). Suppose [itex]a,b[/itex] is a solution to this equation. Then multiply both sides of the equation, first on the left by S and then on the right by [itex]S^{-1}[/itex], where S is a matrix of linearly independent eigenvectors corresponding to the eigenvalues in D, assuming such a matrix S exists.

Then:

[tex]aSDS^{-1} + bSIS^{-1} = S\sqrt{D}S^{-1}[/tex]

or equivalently

[tex]aC + bI = \sqrt{C}[/tex]

Pretty slick, although it only works for 2 x 2 matrices. It's important to establish what, if any, conditions are needed to ensure that a solution [itex]\{a,b\}[/itex] exists. Does it work, for example, if the matrix has only one eigenvalue? Also, do there necessarily exist two independent eigenvectors in that case?
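
For concreteness, plugging in the eigenvalues 9 and 1 (and choosing the positive roots 3 and 1), the system in your post is exactly

[tex]9a + b = 3, \qquad a + b = 1[/tex]

which gives [itex]a = 1/4[/itex] and [itex]b = 3/4[/itex], so

[tex]\sqrt{C} = \frac{1}{4}C + \frac{3}{4}I = \left(\begin{array}{cc}2&-i\\i&2\end{array}\right)[/tex]

and you can check directly that this matrix squares to C.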
 
  • #10
Oh whoa, you haven't seen this before and you just figured it out! That's impressive. I see now why it works that way, so it could basically have been done the way you were telling me before, right? The diagonalization method is actually more general than this one, because, as you remarked, this one only works on 2x2 matrices.

Anyway, thanks really! that helped a lot.

jbunniii said:
Does it work, for example, if the matrix has only one eigenvalue? Also, do there necessarily exist two independent eigenvectors in that case?

In the case that we only had one eigenvalue, I don't think we would be able to use such a method; at least I don't see how we would.
As for the eigenvectors, I don't think they play any role in this procedure, so I think it would be alright if they were not independent at all. What are your thoughts on that?

Edit: So yeah, I guess it wouldn't work for a matrix that can only be brought to Jordan canonical form.
 
Last edited:
  • #11
Redsummers said:
Oh whoa, you haven't seen this before and you just figured it out! That's impressive. I see now why it works that way, so it could basically have been done the way you were telling me before, right? The diagonalization method is actually more general than this one, because, as you remarked, this one only works on 2x2 matrices.

Yes, the diagonalization method is more general. It works as long as
(1) all the eigenvalues are positive
(2) there exists a full linearly independent set of eigenvectors.

Condition (2) is guaranteed if all the eigenvalues are distinct, but that's not a necessary condition. E.g. the identity matrix has only one eigenvalue, 1, but it has a full set of eigenvectors. Your matrix is Hermitian, so it is guaranteed to have a full set of eigenvectors. (This is a consequence of the spectral theorem, for instance.)

The way it works is as follows. For each eigenvalue/eigenvector pair, we have

[tex]C v_i = \lambda_i v_i[/tex]

where [itex]\lambda_i[/itex] is the eigenvalue and [itex]v_i[/itex] is the eigenvector.

We can create a matrix S whose columns are the [itex]v_i[/itex]'s:

[tex]S = [v_1 v_2 \ldots v_n][/tex]

and we can express all the eigenvalue/eigenvector relations simultaneously as follows:

[tex]C S = S D[/tex]

where D is the diagonal matrix whose i'th diagonal element is [itex]\lambda_i[/itex]. (To see why the matrices are multiplied in the order shown, multiply them out elementwise.)

Now, S is invertible because its columns are linearly independent. Thus we can solve for C:

[tex]C = SDS^{-1}[/tex]

So that's how you obtain the diagonalized form. Then, as I explained earlier, the square root is simply

[tex]\sqrt{C} = S \sqrt{D} S^{-1}[/tex]

I hope that's clear!
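
If you want to verify this numerically, here is a minimal sketch assuming NumPy (numpy.linalg.eigh returns the real eigenvalues and an orthonormal eigenvector matrix of a Hermitian input, so [itex]S^{-1}[/itex] is just the conjugate transpose of [itex]S[/itex]):

[code]
import numpy as np

C = np.array([[5, -4j],
              [4j, 5]])

# Diagonalize: C = S D S^{-1}. For a Hermitian matrix, eigh gives real
# eigenvalues and orthonormal eigenvectors, so S^{-1} = S^H.
eigvals, S = np.linalg.eigh(C)

# Square root: sqrt(C) = S sqrt(D) S^{-1}
sqrtC = S @ np.diag(np.sqrt(eigvals)) @ S.conj().T

print(np.round(sqrtC, 10))            # [[2.+0.j 0.-1.j] [0.+1.j 2.+0.j]]
print(np.allclose(sqrtC @ sqrtC, C))  # True
[/code]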
 
  • #12
Yeah! That was just the perfect tutorial I was looking for.

So what was the distinction you were trying to draw before between 'a square root of a matrix' and 'the (unique) square root of a matrix'? Are we dealing with just 'a square root' here, then?
 
  • #13
Redsummers said:
Yeah! That was just the perfect tutorial I was looking for.

So what was the distinction you were trying to draw before between 'a square root of a matrix' and 'the (unique) square root of a matrix'? Are we dealing with just 'a square root' here, then?

What I was asking was a "food for thought" question: should I say I have found *a* square root or *the* square root? In other words, if we find a square root, is it unique? Can there be two different matrices, say M and N, such that

[tex]M^2 = C[/tex]

and

[tex]N^2 = C[/tex]?

The answer is that there are indeed multiple square roots, depending on what sign you choose for each of the elements of [itex]\sqrt{D}[/itex].

For example, in your case

[tex]D = \left[\begin{array}{cc} 9 & 0 \\ 0 & 1\end{array}\right][/tex]

and you can choose [itex]\sqrt{D}[/itex] to be any of the following four matrices:

[tex]\left[\begin{array}{cc} 3 & 0 \\ 0 & 1\end{array}\right][/tex]

[tex]\left[\begin{array}{cc} -3 & 0 \\ 0 & 1\end{array}\right][/tex]

[tex]\left[\begin{array}{cc} 3 & 0 \\ 0 & -1\end{array}\right][/tex]

[tex]\left[\begin{array}{cc} -3 & 0 \\ 0 & -1\end{array}\right][/tex]

It's easy to see that these four matrices result in four distinct square roots of C.

The natural question is, then, are there any other square roots of C? (Perhaps ones that can be obtained by some means other than diagonalization.) I wasn't sure when I asked the question. After a little research, it turns out that there are not any others, and I even have a proof in front of me in Horn and Johnson, "Matrix Analysis," but it uses Lagrange interpolating polynomials in the proof, and I'm going to have to do a little background reading to understand what the heck they're doing. I wouldn't worry about it if I were you, unless you want to, of course. :smile:
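
If you want to see all four, here is a quick enumeration, sketched with NumPy (reusing the eigh-based diagonalization; it just flips the sign of each diagonal entry of [itex]\sqrt{D}[/itex] independently):

[code]
import numpy as np
from itertools import product

C = np.array([[5, -4j],
              [4j, 5]])
eigvals, S = np.linalg.eigh(C)  # eigenvalues [1, 9], S unitary

# Each sign pattern on the diagonal of sqrt(D) yields a square root
for signs in product([1, -1], repeat=2):
    root = S @ np.diag(np.array(signs) * np.sqrt(eigvals)) @ S.conj().T
    assert np.allclose(root @ root, C)  # every choice squares back to C
    print(np.round(root, 10))
[/code]

All four printed matrices are distinct, matching the four sign choices above.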
 
  • #14
Haha, 'Lagrange interpolating polynomials', that sounds fancy; can't wait to do that in class. I see your point now about the uniqueness of the square root, or, in our case, the multiple square roots.

Thanks jbunniii!
 
  • #15
jbunniii said:
OK, I worked out what that procedure is doing. This is interesting; I haven't seen it before.

Suppose [itex]D[/itex] is a diagonal matrix consisting of the eigenvalues (which are presumed positive) of [itex]C[/itex], and [itex]\sqrt{D}[/itex] is the square root of [itex]D[/itex].

Then the equation

[tex]aD + bI = \sqrt{D}[/tex]
Can you enlighten me as to how you came up with this equation, and why it's applicable only to 2x2 matrices?
jbunniii said:
is essentially two equations and two unknowns (since the off-diagonals of [itex]D[/itex], [itex]I[/itex], and [itex]\sqrt{D}[/itex] are zero). Suppose [itex]a,b[/itex] is a solution to this equation. Then multiply both sides of the equation, first on the left by S and then on the right by [itex]S^{-1}[/itex], where S is a matrix of linearly independent eigenvectors corresponding to the eigenvalues in D, assuming such a matrix S exists.

Then:

[tex]aSDS^{-1} + bSIS^{-1} = S\sqrt{D}S^{-1}[/tex]

or equivalently

[tex]aC + bI = \sqrt{C}[/tex]

Pretty slick, although it only works for 2 x 2 matrices. It's important to establish what, if any, conditions are needed to ensure that a solution [itex]\{a,b\}[/itex] exists. Does it work, for example, if the matrix has only one eigenvalue? Also, do there necessarily exist two independent eigenvectors in that case?
 
  • #16
Mark44 said:
Can you enlighten me as to how you came up with this equation, and why it's applicable only to 2x2 matrices?

I didn't really come up with it, I reverse-engineered it from Redsummers' question.

Certainly I am free to write the equation as stated, and if it happens to have a solution [the solution doesn't have to be unique], then it gives me a way to express [itex]\sqrt{D}[/itex] as a linear combination of [itex]D[/itex] and [itex]I[/itex]. And as I showed, the same coefficients allow me to express the square root of the original matrix, [itex]\sqrt{C}[/itex], as a linear combination of [itex]C[/itex] and [itex]I[/itex].

There are two unknowns, [itex]a[/itex] and [itex]b[/itex], so in general if there is to be a solution I must have two (scalar) equations, which is why it only works for 2x2 matrices. If the matrices were larger, I would have an overdetermined system.

There might be a generalization to n x n matrices if you postulate a linear combination of D, I, and [itex]n-2[/itex] other specific diagonal matrices, but I don't know how useful it would be.

The main advantage in the 2x2 case would seem to be that although you still have to find the eigenvalues of the original matrix to form D, you don't need to find eigenvectors, put them into a matrix S, and invert S; instead you just solve a system of 2 equations in 2 unknowns. I guess it's slightly faster, but I'm not sure I really see the point, other than that it's a kind of neat trick.

P.S. I haven't thought very carefully about exactly what conditions must hold in order for this trick to work, but certainly it requires
(1) a square root must exist
and
(2) a solution [itex]a,b[/itex] to the diagonal matrix equation must exist
and
(3) the original matrix must be diagonalizable (in order to conclude that the same linear combination works for [itex]C[/itex] and [itex]\sqrt{C}[/itex]).

A sufficient condition would be distinct, positive eigenvalues, but it's not necessary (e.g., it works for C = I).
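
P.P.S. Presumably the n x n generalization is exactly where the Lagrange interpolating polynomials mentioned above come in (a sketch of the idea, not the Horn and Johnson proof): if [itex]p[/itex] is the degree-(n-1) polynomial with [itex]p(\lambda_i) = \sqrt{\lambda_i}[/itex] at each distinct eigenvalue, then [itex]p(D) = \sqrt{D}[/itex], and since [itex]C^k = SD^kS^{-1}[/itex] for every k,

[tex]p(C) = S\,p(D)\,S^{-1} = S\sqrt{D}S^{-1} = \sqrt{C}[/tex]

The 2x2 trick is just the special case [itex]p(x) = ax + b[/itex].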
 
Last edited:
  • #17
Hey, I just tried to calculate the eigenvectors, but I can't get a solution; the result is always 0.
??

Edit: Forget it. Using the approach with a and b, I was able to solve it correctly ^^ (without any eigenvectors)
 
Last edited:

1. What is the square root of a matrix?

The square root of a matrix is a matrix that, when multiplied by itself, results in the original matrix.

2. Is the square root of a matrix always unique?

No, the square root of a matrix is not always unique. In fact, a matrix can have multiple square roots.

3. How is the square root of a matrix calculated?

A common method is eigenvalue decomposition: diagonalize the matrix, take the square roots of its eigenvalues, and transform back. For Hermitian positive semidefinite matrices, this agrees with the approach based on the singular value decomposition.

4. Can a matrix have a negative square root?

Yes, in the sense that if M squared equals the original matrix C, then so does (-M) squared, so the negative of any square root is also a square root. A unique 'principal' square root exists only under extra conditions, such as requiring the root to be positive semidefinite.

5. What are some real-world applications of the square root of a matrix?

The square root of a matrix has various applications in fields such as engineering, physics, and computer science. Some examples include image processing, data compression, and system stability analysis.
