MHB What Am I Getting Wrong About Matrices and Operators?

  • Thread starter: ognik
  • Tags: Matrices

Summary
The discussion centers on confusion about the relationship between matrices and operators in the context of a course. The thread starter asks for clarification of terminology, noting discrepancies in the use of terms like "symmetric" and "self-adjoint". A reply suggests focusing on the complex case, since the real case is just a special case of it, and explains that in infinite dimensions the inner product is used to define the adjoint. The thread starter then asks for specific feedback on their compiled table, which their professor said was wrong.
ognik
I thought I had this clear; then I met operators and, at least to me, the new material overlapped with and potentially changed that understanding. Research on the web didn't help, as there seem to be different usages and opinions ...

What I am trying to do is NOT summarise what these things are, but simply record what applies to what, in terms of the course I am doing. So it would be incredibly helpful to me at this time if you could look at the attached PDF and tell me what is wrong with it, with the explanation below in mind.

Note that I have tried to stick to my book's notation, which uses $ * $ for the complex conjugate and $ \dagger $ for the Hermitian conjugate (adjoint).
Again, this is not meant as a summary of what these objects are or do; I just want to be sure what applies to what.

For example, my book talked earlier about symmetric matrices, but then used 'self-adjoint' for the equivalent (real) operators. I understand that many operators can be represented by matrices, but this use of terminology made sense to me because, again for example, it didn't seem right to call an operator that isn't a matrix 'symmetric' (even though it can be treated as such - I hope I am making myself clear).
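For reference, here is a minimal sketch of the standard definitions as I understand them (my own notes, not taken from the attached table):

$$ A \text{ symmetric} \iff A = A^{T}, \qquad A \text{ Hermitian (self-adjoint)} \iff A = A^{\dagger} = (A^{*})^{T} $$

For real entries $ A^{*} = A $, so $ A^{\dagger} = A^{T} $ and 'symmetric' and 'self-adjoint' pick out the same matrices; the distinction only matters once the entries (or the scalars) are complex.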
 

A couple of quick thoughts:

1. Focus on the complex case. The real counterparts are just "special cases" of the complex case, since real numbers are self-conjugate as complex numbers.

2. In the infinite-dimensional (operator) case, we no longer have a compact numerical representation of our vectors. Here is where inner products come to our rescue: we take PROPERTIES of the adjoint in the finite-dimensional case, and USE these to DEFINE the adjoint in the infinite-dimensional case. Now we don't need a finite basis.
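
To make point 2 concrete, here is a minimal sketch of the defining property, in the notation of the original post ($ \dagger $ for the adjoint):

$$ \langle A x, y \rangle = \langle x, A^{\dagger} y \rangle \quad \text{for all } x, y $$

In finite dimensions this is a theorem about the conjugate-transpose matrix $ A^{\dagger} = (A^{*})^{T} $; in the infinite-dimensional case it is taken as the definition of $ A^{\dagger} $, and $ A $ is called self-adjoint when $ A^{\dagger} = A $ (setting domain issues aside).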
 
Thanks, appreciated. We are only just getting to the infinite-dimensional stuff ...

The thing is, I showed this table to my professor, who said it was completely wrong, but I thought I had been quite careful in compiling it. I would really appreciate knowing specifically what is wrong with it.
 
