1. The problem statement, all variables and given/known data

If A has eigenvalues 0 and 1, corresponding to the eigenvectors (1, 2) and (2, -1), how can one tell in advance that A is self-adjoint and real?

2. Relevant equations

Av = λv; A is self-adjoint iff A = A^†, i.e. ⟨Av, w⟩ = ⟨v, Aw⟩ for all v, w.

3. The attempt at a solution

I can show that A is real: it has real, orthogonal eigenvectors and real eigenvalues, and the eigenvectors form a basis of R^2. Expanding any real vector in this basis, A scales each component by a real eigenvalue, so A maps real vectors to real vectors in R^2. It follows that all of the entries of A are real.

Showing that the matrix is self-adjoint, however, is trickier for me. I know that eigenvectors corresponding to distinct eigenvalues of a self-adjoint matrix are orthogonal, and these eigenvectors are clearly orthogonal: (1, 2) · (2, -1) = 2 - 2 = 0. However, I don't think that is a sufficient condition. If a matrix has two orthogonal eigenvectors, surely that doesn't mean it is self-adjoint, right? So how can you know at a glance (that is, without reconstructing A from the unitary matrix formed by taking its normalized eigenvectors as columns) that A is self-adjoint?
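
As a sanity check on the reconstruction route I mention above, here is a minimal NumPy sketch (my own check, not part of the assignment) that rebuilds A as Q D Q^{-1} from the given eigenpairs and then tests whether A equals its transpose:

```python
import numpy as np

# Columns of Q are the given eigenvectors (1, 2) and (2, -1)
Q = np.array([[1.0, 2.0],
              [2.0, -1.0]])
# Corresponding eigenvalues 0 and 1 on the diagonal
D = np.diag([0.0, 1.0])

# Reconstruct A from its eigendecomposition: A = Q D Q^{-1}
A = Q @ D @ np.linalg.inv(Q)

print(A)                     # the reconstructed matrix
print(np.allclose(A, A.T))   # True: A is symmetric, hence self-adjoint
```

Because the eigenvectors here are orthogonal, normalizing them makes Q orthogonal (Q^{-1} = Q^T), so A = Q D Q^T with D real diagonal, and then A^T = Q D^T Q^T = A automatically. That seems to be exactly the "at a glance" fact I am asking about.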