How to check if a matrix is Hilbert space and unitary?

  • #1
I have a matrix,

[a, ib; -1, 1]

where a and b are constants.

I have to represent and analyse this matrix in a Hilbert space:

I take the space C^2 of this matrix to be a Hilbert space. Is it sufficient to compute the inner product:

<x,y> = a*ib -1

and obtain the norm by:

\begin{equation}
||x|| = ia
\end{equation}

to confirm it's a unitary matrix in a Hilbert space?

However the criteria:
\begin{equation}
<x,y> = a*ib -1 = \overline{<x,y>} = -a*b -i
\end{equation}

is not met.


Instead, if I consider a Hilbert sequence space l^2, then I get:

<x, y> = -ab

where the norm is:

\begin{equation}
||x|| = <x,x>^{1/2} = a
\end{equation}

Is this correct, and if it is, can it be followed up by further analysis of the matrix in Hilbert space?

Thanks!
 

Answers and Replies

  • #2
StoneTemplePython
Science Advisor
Gold Member
1,260
597
There are a lot of problems here.

If your scalars are in ##\mathbb C## and you are in finite dimensions, you have a standard inner product and automatically are in a special type of Hilbert space. It's not really clear what you're asking though.

Now for your post:


A big issue:
and obtain the norm by:

\begin{equation}
||x|| = ia
\end{equation}

It is not remotely clear how this "norm" qualifies as positive definite.


With respect to preserving L2 norms and the standard inner product, it is worth pointing out that:
##
\begin{bmatrix}
a & bi \\
-1 & 1
\end{bmatrix}##

can never be called unitary. The squared Frobenius norm of an ##n \times n## unitary matrix is ##n##.

For your matrix, the squared Frobenius norm is ##|a|^2 + |b|^2 + 2##, which equals ##2## iff ##a = 0## and ##b = 0##; but then your matrix has a determinant of zero, and a unitary matrix always has a determinant with magnitude one.
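For concreteness, both criteria can be sketched numerically; the values of ##a## and ##b## below are hypothetical placeholders, not given in the thread:

```python
import numpy as np

# Hypothetical sample values for the constants a and b (not given in the thread).
a, b = 2.0, 3.0

A = np.array([[a, b * 1j],
              [-1, 1]], dtype=complex)

# U is unitary iff U U* = I (rows/columns orthonormal).
is_unitary = bool(np.allclose(A @ A.conj().T, np.eye(2)))

# The squared Frobenius norm of an n x n unitary matrix equals n;
# here it is |a|^2 + |b|^2 + 2, so it can equal 2 only when a = b = 0.
frob_sq = np.linalg.norm(A, 'fro') ** 2

print(is_unitary)   # False: the second row (-1, 1) has norm sqrt(2), never 1
print(frob_sq)      # 15.0 for these sample values
```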
 
  • #3
The matrix is indeed not unitary. I asked the wrong question in the subject.

What I wanted to ask is:

1. Given that the matrix IS Hilbert space, and its eigenvectors are linearly independent, and eigenvalues are complex conjugates, what more can I say about it in a Hilbert space analysis?

It is really difficult to "visualize" or interpret a Hilbert space geometrically, and therefore it is equally difficult to determine the properties of the matrix that give some notion of its behaviour, either in matrix classification or, for instance, when regarding its general solution and the representation of that solution in a Hilbert space.

In other words, I only know that I have to analyse this matrix, however I am not sure what to analyse more than the above given properties and eigenvalues.
 
  • #4
A big issue:


It is not remotely clear how this "norm" qualifies as positive definite.


Answer:

it does NOT qualify as positive definite (it gives a zero value when checking for positive definiteness).
 
  • #5
StoneTemplePython
This posting still seems to be all over the place.

One suggestion is: if you know your matrix is diagonalizable, that's good. It would be nice to check whether the matrix is normal (i.e. unitarily diagonalizable): check whether ##AA^* = A^*A##.
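This check can be sketched in a few lines; the values of ##a## and ##b## below are hypothetical placeholders:

```python
import numpy as np

# Hypothetical sample values for a and b (not given in the thread).
a, b = 2.0, 3.0
A = np.array([[a, b * 1j],
              [-1, 1]], dtype=complex)

# A matrix is normal (unitarily diagonalizable) iff A A* = A* A.
is_normal = bool(np.allclose(A @ A.conj().T, A.conj().T @ A))
print(is_normal)   # False for these sample values
```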

That's about all I have to say here. Good luck.
 
  • #6
One suggestion is: if you know your matrix is diagonalizable, that's good. It would be nice to check whether the matrix is normal (i.e. unitarily diagonalizable): check whether ##AA^* = A^*A##.

That's about all I have to say here. Good luck.


Thanks, that is a concrete measure!


Thanks!
 
  • #7
Stone Temple, the matrix I have does not diagonalize, however its eigenvectors are linearly independent. Can I ALWAYS conclude that a matrix is Hilbert space C^n, in this case C^2? I am unsure, because it is not unitary, it is not diagonalizable, however it has linearly independent eigenvectors.

Can I check if the matrix is complete in some way and by that define it as Hilbert space?
 
  • #8
StoneTemplePython
Stone Temple, the matrix I have does not diagonalize, however its eigenvectors are linearly independent.

This is a contradiction. An ##n \times n## matrix with scalars in ##\mathbb C## has ##n## eigenvalues (counted with multiplicity). If you have n linearly independent eigenvectors, then your matrix is diagonalisable.
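This implication is easy to verify numerically; the values of ##a## and ##b## below are hypothetical placeholders for the thread's matrix:

```python
import numpy as np

# Hypothetical sample values for a and b (not given in the thread).
a, b = 2.0, 3.0
A = np.array([[a, b * 1j],
              [-1, 1]], dtype=complex)

# Columns of P are the eigenvectors returned by numpy.
eigvals, P = np.linalg.eig(A)

# n linearly independent eigenvectors <=> P is invertible (nonzero determinant),
# and then P^{-1} A P is the diagonal matrix of eigenvalues.
assert abs(np.linalg.det(P)) > 1e-12
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag(eigvals))
```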
 
  • #9
This is a contradiction. An ##n \times n## matrix with scalars in ##\mathbb C## has ##n## eigenvalues (counted with multiplicity). If you have n linearly independent eigenvectors, then your matrix is diagonalisable.


They are not independent in C, but are so in the subspace R.

This is, I think, because both eigenvectors have complex conjugates, thus a real and an imaginary part.

I checked this with Mathematica Algebra tools.
 
  • #10
WWGD
Science Advisor
Gold Member
6,336
8,389
Stone Temple, the matrix I have does not diagonalize, however its eigenvectors are linearly independent. Can I ALWAYS conclude that a matrix is Hilbert space C^n, in this case C^2? I am unsure, because it is not unitary, it is not diagonalizable, however it has linearly independent eigenvectors.

Can I check if the matrix is complete in some way and by that define it as Hilbert space?
I think you're being imprecise here. Don't you mean the matrix is _an element_ of a Hilbert space? Specifically here, as Python said, of ##\mathbb C^2##? (I think, pretty sure, there is only one inner product that will make ##\mathbb C^2## into a Hilbert space. I suggest tightening things to make a more precise statement.)
 
  • #11
StoneTemplePython
They are not independent in C, but are so in the subspace R.

This is , I think, because both eigenvectors have complex conjugates, thus a real and an imaginary part.

This also sounds wrong. Give me a single example of 2 n-vectors that exist with scalars in ##\mathbb R## and are linearly independent there, but not when you allow scalars in ##\mathbb C##.
- - - -
Furthermore, to be crystal clear, you originally introduced a matrix with complex scalars (specifically with non-zero imaginary components). It is a huge abuse of language to say it has linearly independent eigenvectors and then "clarify" that you're talking about a subspace in reals in which the original matrix itself doesn't exist.
 
  • #12
This also sounds wrong. Give me a single example of 2 n-vectors that exist with scalars in ##\mathbb R## and are linearly independent there, but not when you allow scalars in ##\mathbb C##.
- - - -
Furthermore, to be crystal clear, you originally introduced a matrix with complex scalars (specifically with non-zero imaginary components). It is a huge abuse of language to say it has linearly independent eigenvectors and then "clarify" that you're talking about a subspace in reals in which the original matrix itself doesn't exist.


This is all new to me, please try it out on Wolfram Alpha Algebra online. I can give you the vector coordinates here:

e_1 = (1, -1.3394×10^-66 + 1.1272×10^-61 i)
e_2 = (1, 0.5000 + 1.3229 i)

and the matrix is:

| 1.0545718×10^-68,  (5.344285879×10^-28)/2 × 1.0545718×10^-34 i |
| -1,  1 |
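As a side note, reading `^-66` and the like as powers of ten (an assumption about the quoted output), a determinant test checks independence over ##\mathbb C## directly:

```python
import numpy as np

# Assumption: "1.3394^-66" in the quoted output means 1.3394e-66, etc.
e1 = np.array([1, -1.3394e-66 + 1.1272e-61j], dtype=complex)
e2 = np.array([1, 0.5000 + 1.3229j], dtype=complex)

# Two vectors in C^2 are linearly dependent over C iff the 2x2 matrix
# with them as rows has zero determinant.
det = np.linalg.det(np.array([e1, e2]))
print(abs(det) > 1e-12)   # True: these vectors are independent over C as well
```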
 
  • #13
This also sounds wrong. Give me a single example of 2 n-vectors that exist with scalars in ##\mathbb R## and are linearly independent there, but not when you allow scalars in ##\mathbb C##.
- - - -

No, I checked today, and this is an element of L^2[a,b] and is complete in the complex space. Thanks for the clarification, it is indeed an element of a Hilbert space, but I have also read that a matrix can BE a Hilbert space.
 
  • #14
WWGD

I doubt the matrix itself is a Hilbert space; no matter how you define the (finitely-many) elements of the space, the scalar product of elements will not fall within the space, i.e., the matrix is not closed under scaling, so not a vector space, unless you concoct some esoteric action of scalars on the elements. And you don't have the origin in your set either. Specifically, this is not a vector space, a needed condition to have a Hilbert space.
 
  • #15
I doubt the matrix itself is a Hilbert space; no matter how you define the (finitely-many) elements of the space, the scalar product of elements will not fall within the space, i.e., the matrix is not closed under scaling, so not a vector space, unless you concoct some esoteric action of scalars on the elements. And you don't have the origin in your set either.

Still it can be classified as a matrix in L^2 [a,b] complex space.

The matrix has non-orthogonal elements and is a complex matrix that follows the condition:

\begin{equation}
\langle x,y \rangle = \int_a^bx(t)\overline{y(t)}dt,
\end{equation}


which is complete in the complex vector space $L^2[a,b]$, which satisfies the minimal condition:

\begin{equation*}
x(t)\overline{x(t)}=|x(t)|^2
\end{equation*}

$L^2[a,b]$ is therefore a Hilbert space with a set of complex eigenvalues and eigenvectors.
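The inner product above can be approximated numerically; the interval and the functions below are illustrative assumptions, not anything from the thread:

```python
import numpy as np

# Sketch of the L^2[a, b] inner product <x, y> = integral of x(t) conj(y(t)) dt,
# approximated by a Riemann sum. The interval [0, 1] and the functions
# x, y are illustrative assumptions.
t = np.linspace(0.0, 1.0, 10001)
dt = t[1] - t[0]

x = np.exp(1j * t)       # an example element of L^2[0, 1]
y = t.astype(complex)    # another example element

inner = np.sum(x * np.conj(y)) * dt
# ||x|| = <x, x>^{1/2}, using x(t) conj(x(t)) = |x(t)|^2
norm_x = np.sqrt(np.real(np.sum(np.abs(x) ** 2) * dt))

print(norm_x)   # close to 1, since |e^{it}| = 1 on [0, 1]
```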
 
  • #16
WWGD
Still it can be classified as a matrix in L^2 [a,b] complex space.

The matrix has non-orthogonal elements and is a complex matrix that follows the condition:

\begin{equation}
\langle x,y \rangle = \int_a^bx(t)\overline{y(t)}dt,
\end{equation}


which is complete in the complex vector space $L^2[a,b]$, which satisfies the minimal condition:

\begin{equation*}
x(t)\overline{x(t)}=|x(t)|^2
\end{equation*}

$L^2[a,b]$ is therefore a Hilbert space with a set of complex eigenvalues and eigenvectors.
Yes, I was just going over the statement you made about the matrix itself being a Hilbert space. But, yes, the space of pairs ##(a,b): a,b \in \mathbb C## with the inner product you mentioned is a Hilbert space. As you said, ##\mathbb C^2##, and in general ##\mathbb C^n##, are Hilbert spaces.
 
  • #17
Thanks WWGD for confirming!
 
