1. The problem statement, all variables and given/known data

Let A and B be symmetric matrices and X a vector in the generalized eigenvalue problem AX - λBX = 0.

a) Show that the eigenvectors are orthogonal relative to A and B.
b) If the eigenvectors are orthonormal relative to B, determine C such that (C - λI)X = 0, where C is a diagonal matrix.

2. Relevant equations

Orthogonal eigenvectors (for i ≠ j): transpose(Xi)*A*Xj = 0 and transpose(Xi)*B*Xj = 0
Orthonormal eigenvectors: transpose(Xi)*B*Xj = 1 if i = j and 0 otherwise (each product is a scalar, so "= I" only makes sense for the full matrix of eigenvectors: transpose(P)*B*P = I with P = [X1 ... Xn])

3. The attempt at a solution

I am able to extract eigenvalues and eigenvectors from matrices when solving systems of equations, but I don't know how to use that to prove the theorem above. Should I try a random pair of 2x2 numerical matrices and compute values? An explanation of the problem and a basic starting step for completing it would be helpful. Thanks!
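A numerical 2x2 check like the one suggested above won't prove the theorem, but it can build intuition. Here is a minimal NumPy sketch (the matrices A and B are made-up examples, with B positive definite so its Cholesky factor exists): it reduces AX = λBX to a standard symmetric eigenproblem via B = L·Lᵀ, then verifies that the eigenvectors come out B-orthonormal and A-orthogonal, with transpose(P)*A*P giving the diagonal matrix of eigenvalues (the C from part b).

```python
import numpy as np

# Hypothetical symmetric test matrices; B must be positive definite.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([[4.0, 1.0],
              [1.0, 2.0]])

# Reduce A x = lam B x to a standard symmetric problem:
# with B = L L^T, set y = L^T x, giving (L^-1 A L^-T) y = lam y.
L = np.linalg.cholesky(B)
Linv = np.linalg.inv(L)
M = Linv @ A @ Linv.T            # symmetric, so eigh applies
lams, Y = np.linalg.eigh(M)      # Y has orthonormal columns
P = Linv.T @ Y                   # columns of P are B-orthonormal eigenvectors

# Orthonormal relative to B: P^T B P should be the identity.
print(np.round(P.T @ B @ P, 10))
# Orthogonal relative to A: P^T A P should be diagonal,
# with the eigenvalues on the diagonal (this diagonal matrix is C).
print(np.round(P.T @ A @ P, 10))
```

The Cholesky reduction is one standard way to handle the generalized symmetric problem; scipy.linalg.eigh(A, B) does the same job in one call if SciPy is available.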