# Linear Algebra: Positive Matrix

1. Jun 30, 2008

### LPB

1. The problem statement, all variables and given/known data

Let T be a positive operator on a Hilbert space H. Pick an orthonormal basis {e1, e2, ..., en} for H. Let A = [aij] be the n×n matrix representation of T with respect to the basis {e1, e2, ..., en}, so that $$Te_j = \sum_{i=1}^{n} a_{ij} e_i$$, j = 1, 2, ..., n.
Show that A is a positive matrix; i.e., $$x^*Ax \geq 0$$ for all $$x \in \mathbb{C}^n$$.

3. The attempt at a solution

If T is a positive operator, then T must be self-adjoint (so T* = T), and $$\langle Tx, x \rangle \geq 0$$ for all $$x \in H$$.

I'm still new to linear algebra, so the solution may be obvious ... I just don't see what to do. If someone could explain this, using simple language, I would much appreciate it. Thank you!

2. Jun 30, 2008

### Dick

You've got me confused. Isn't the definition of a positive operator T that $$\langle x, Tx \rangle \geq 0$$ for all x? If A is a matrix representation of T, then isn't $$\langle x, Ax \rangle = \langle x, Tx \rangle$$, where the x on the left side is the abstract vector in the Hilbert space and the one on the right side is the column vector in the {e_i} basis? Are you really new to linear algebra? And I don't see why a positive operator needs to be self-adjoint. Am I in over my head here?

3. Jun 30, 2008

### LPB

The definition given in my textbook for a positive operator is:
Let $$T: H \rightarrow H$$ be a linear operator. Then T is called positive if T is self-adjoint and $$\langle Tx, x \rangle \geq 0$$ for all $$x \in H$$. We write $$T \geq 0$$ for any positive operator.

I'm taking a course that suggests two semesters of linear algebra as a prerequisite, and this is one of the problems in the "review of linear algebra." However, I haven't actually taken linear algebra, so I'm trying to teach myself as I go. So sometimes I miss things that are very obvious, since I'm not yet used to the terminology, etc. For instance, I'm still confused about the role of * ... I know that it denotes the adjoint of a matrix, i.e. its conjugate transpose, but to be honest, I'm not quite sure what x*Ax means.

I hope this helps... Thanks for your help!

4. Jun 30, 2008

### Dick

Ok, if it's defined to be self-adjoint, then I guess it's self-adjoint. x*Ax (and I'd write that as (x*)Ax) means the conjugate transpose of x (a column vector) times the matrix A times x, so x* is a row vector. In inner product notation this is $$\langle x, Ax \rangle$$. Since A is the matrix representation of T, that means $$\langle x, Ax \rangle = \langle x, Tx \rangle$$ (with a subtly different meaning for x, as I said). If T is self-adjoint, $$\langle x, Tx \rangle = \langle Tx, x \rangle$$. That T is self-adjoint also means A = A*. This is a nasty question if you are coming in without the background you need, because it's heavy on definitions and light on substance. If you understand all of the definitions, there's not much to prove.
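If it helps to see the notation concretely, here is a quick numerical sketch (not part of the proof) in NumPy. The matrix M is just a randomly chosen example; A = M + M* is a stand-in for the matrix of a self-adjoint T, and the check shows that (x*)Ax is the same number as the inner product <x, Ax>:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: for any complex matrix M, A = M + M* is
# self-adjoint (A = A*), like the matrix of a self-adjoint operator T.
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = M + M.conj().T
x = rng.standard_normal(3) + 1j * rng.standard_normal(3)

# (x*)Ax: the conjugate transpose of the column vector x, times A, times x
quad = x.conj() @ A @ x

# The same quantity written as the inner product <x, Ax>
inner = np.vdot(x, A @ x)  # np.vdot conjugates its first argument

print(np.isclose(quad, inner))  # True
```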

5. Jul 1, 2008

### LPB

Thank you - I think that makes sense! The explanation of (x*)Ax was especially helpful. I've rewritten the steps so I think they make sense to me. The basic "flow" that I ended up with is:

$$(x^*)Ax = \langle x, Ax \rangle = \langle x, Tx \rangle = \langle Tx, x \rangle \geq 0$$

Does this seem right?

Again, thank you - I really appreciate your help!

6. Jul 1, 2008

### Dick

That's how I understand it.
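As a sanity check on the whole chain (again, not part of the proof), here is a small NumPy experiment. The choice A = B*B is just a convenient hypothetical way to manufacture a positive matrix; for such an A, the quadratic form (x*)Ax comes out real and nonnegative for every x:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical example: A = B*B is always a positive matrix
# (x*Ax = ||Bx||^2 >= 0), standing in for the matrix of a positive T.
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B.conj().T @ B  # self-adjoint: A = A*

for _ in range(5):
    x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    q = x.conj() @ A @ x  # (x*)Ax = <x, Ax>
    # For a positive matrix the quadratic form is real (up to roundoff)
    # and nonnegative
    print(abs(q.imag) < 1e-10 and q.real >= 0)  # True each time
```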