[I know this is 14 years old, but it's sitting on the unanswered questions list, so I felt it worth responding to. The OP has not, of course, shown any justification for their answers, some of which are incorrect (and mutually inconsistent).]
underacheiver said:
Homework Statement
1. If A is a real symmetric matrix, then there is a diagonal matrix D and an orthogonal matrix P so that D = P^T AP.
a. True
b. False
OP's answer: False
This is in fact true. The eigenvalues of a real symmetric matrix are real, and eigenspaces for distinct eigenvalues are orthogonal with respect to the Euclidean inner product (see #2 below). It follows that we can find an orthonormal basis for each eigenspace, and putting these together gives an orthonormal basis for the entire space with respect to which the matrix representation of the linear map is diagonal. Furthermore, the change of basis matrix from the standard basis to the new basis is orthogonal.
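For anyone who wants to see this concretely, here is a quick NumPy sketch (the matrix A below is arbitrary, chosen purely for illustration); np.linalg.eigh is NumPy's eigensolver for symmetric matrices and returns an orthogonal P:

```python
import numpy as np

# Arbitrary real symmetric matrix, for illustration only.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh returns the eigenvalues and an orthogonal matrix P whose
# columns are orthonormal eigenvectors of A.
eigvals, P = np.linalg.eigh(A)

print(np.allclose(P.T @ P, np.eye(3)))             # True: P is orthogonal
print(np.allclose(P.T @ A @ P, np.diag(eigvals)))  # True: D = P^T A P is diagonal
```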
2. Given that λi and λj are distinct eigenvalues of the real symmetric matrix A and that v1 and v2 are the respective eigenvectors associated with these values, then v1 and v2 are orthogonal.
a. True
b. False
OP's answer: True
Not sure why the indices on the eigenvalues are i and j while those on the eigenvectors are 1 and 2; I will use the latter throughout.
This is true. As A is symmetric,
\lambda_1 \langle v_1, v_2 \rangle = \langle Av_1, v_2 \rangle = \langle v_1, Av_2 \rangle = \lambda_2 \langle v_1, v_2 \rangle
and hence
(\lambda_1 - \lambda_2)\langle v_1, v_2 \rangle = 0.
Since \lambda_1 \neq \lambda_2, it follows that \langle v_1, v_2 \rangle = 0, i.e. v_1 and v_2 are orthogonal.
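A numerical illustration (arbitrary symmetric matrix; note that the generic solver np.linalg.eig does not enforce orthogonality, so the orthogonality below really is a consequence of symmetry and distinct eigenvalues):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])   # symmetric, with two distinct eigenvalues

eigvals, vecs = np.linalg.eig(A)   # generic solver, no orthogonality enforced
v1, v2 = vecs[:, 0], vecs[:, 1]

print(eigvals)                   # two distinct eigenvalues
print(np.isclose(v1 @ v2, 0.0))  # True: <v1, v2> = 0
```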
3. If T(θ) is a rotation of the Euclidean plane \mathbb{R}^2 counterclockwise through an angle θ, then T can be represented by an orthogonal matrix P whose eigenvalues are λ1 = 1 and λ2 = -1.
a. True
b. False
OP's answer: True
This is false. The eigenvalues of T(\theta) = \begin{pmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{pmatrix} are e^{\pm i\theta}, which are real only when \theta is a multiple of \pi, and in no case are they 1 and -1 simultaneously.
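You can check this numerically for any particular angle (theta = 0.7 below is arbitrary):

```python
import numpy as np

theta = 0.7   # arbitrary angle, for illustration
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.linalg.eigvals(T))                     # a complex-conjugate pair
print(np.exp(1j * theta), np.exp(-1j * theta))  # the same pair, e^{±iθ}
```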
4. If A and B represent the same linear operator T: U → U, then they have the same eigenvalues.
a. True
b. False
OP's answer: True
This is true: a linear map is a zero of its minimal polynomial (as a function on the space of linear maps); the scalar roots of the minimal polynomial (as a function on the field of scalars) are the eigenvalues. Different matrix representations of the same linear map are similar, and therefore have the same minimal polynomial.
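A quick NumPy sketch of this (A and the change-of-basis matrix P below are arbitrary; any invertible P gives a matrix B representing the same operator in another basis):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 2.0],
              [1.0, 1.0]])   # arbitrary invertible change-of-basis matrix

B = np.linalg.inv(P) @ A @ P  # represents the same operator as A

print(np.sort(np.linalg.eigvals(A)))  # [2. 3.]
print(np.sort(np.linalg.eigvals(B)))  # [2. 3.]  (same, up to round-off)
```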
5. If A and B represent the same linear operator T: U → U, then they have the same eigenvectors.
a. True
b. False
OP's answer: False
This is a little confused. Both matrices represent the same linear map, which has one set of eigenvectors. However, if they represent that map with respect to different bases, then the components of the eigenvectors with respect to the different bases will not necessarily be identical; see the sketch below.
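Continuing the sketch from #4: if v is the coordinate column of an eigenvector with respect to the first basis, then P^{-1}v is the coordinate column of the same eigenvector with respect to the second, and the two columns generally differ:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 2.0],
              [1.0, 1.0]])
B = np.linalg.inv(P) @ A @ P

v = np.array([1.0, 0.0])           # eigenvector of A for eigenvalue 2
print(np.allclose(A @ v, 2 * v))   # True

w = np.linalg.inv(P) @ v           # the SAME eigenvector, in the new basis
print(np.allclose(B @ w, 2 * w))   # True: still an eigenvector for 2
print(np.allclose(v, w))           # False: the coordinate columns differ
```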
6. If A and B have the same eigenvalues, then they are similar matrices.
a. True
b. False
OP's answer: False
This is indeed false: consider \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} and \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, both of which have 1 as their only eigenvalue with algebraic multiplicity 2, but with geometric multiplicities 2 and 1 respectively; hence they are not similar.
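Since similar matrices satisfy B - I = P^{-1}(A - I)P, they must give the same rank for A - I; the counterexample fails that test, as a quick NumPy check shows:

```python
import numpy as np

I2 = np.eye(2)
J  = np.array([[1.0, 1.0],
               [0.0, 1.0]])

print(np.linalg.eigvals(I2), np.linalg.eigvals(J))  # both: eigenvalue 1 twice

# Geometric multiplicity of eigenvalue 1 is dim ker(A - I) = 2 - rank(A - I).
print(2 - np.linalg.matrix_rank(I2 - np.eye(2)))  # 2
print(2 - np.linalg.matrix_rank(J  - np.eye(2)))  # 1, so J is not similar to I2
```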
7. Which of the following statements is not true?
a. Similar matrices A and B have exactly the same determinant.
b. Similar matrices A and B have exactly the same eigenvalues.
c. Similar matrices A and B have the same characteristic polynomial.
d. Similar matrices A and B have exactly the same eigenvectors.
e. none of the above
OP's answer: d.
(d) is indeed false, so it is the correct choice; (e) is therefore also false, since it asserts that every statement in (a) to (d) is true.
8. Let the n × n matrix A have eigenvalues λ1, λ2 ... λn (not necessarily distinct). Then det(A) = λ1λ2 ... λn.
a. True
b. False
OP's answer: True
This is true: A is similar to its Jordan normal form, similarity preserves the determinant, and the determinant of a Jordan normal form is the product of its diagonal entries, which are the eigenvalues.
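A numerical check with an arbitrary (non-symmetric) matrix; the eigenvalues come back as complex numbers in general, but their product still matches the determinant:

```python
import numpy as np

A = np.array([[0.0, -2.0, 1.0],
              [1.0,  3.0, 0.0],
              [2.0,  1.0, 1.0]])   # arbitrary example

eigvals = np.linalg.eigvals(A)
print(np.prod(eigvals))    # product of eigenvalues (possibly complex dtype)
print(np.linalg.det(A))    # the same value, up to round-off
```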
9. Every real matrix A with eigenvalues as in problem 8 is similar to the diagonal matrix D = diag [λ1, λ2, ... λn].
a. True
b. False
OP's answer: False
This is false; the second matrix in #6 is a counterexample: its eigenvalues are \lambda_1 = \lambda_2 = 1, but the only matrix similar to the identity is the identity itself, so it is not similar to \mathrm{diag}[1, 1].
10. Eigenvectors corresponding to distinct eigenvalues for any n × n matrix A are always linearly independent.
a. True
b. False
OP's answer: True
This is true. For two eigenvectors: if v_1 were a scalar multiple of v_2, then v_1 would lie in both eigenspaces, so Av_1 = \lambda_1 v_1 = \lambda_2 v_1 and hence (\lambda_1 - \lambda_2)v_1 = 0; since \lambda_1 \neq \lambda_2, this forces v_1 = 0, contradicting the fact that eigenvectors are nonzero. The general case follows by induction on the number of distinct eigenvalues.
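As a sanity check (the matrix below is arbitrary, chosen to have three distinct eigenvalues): stacking the eigenvectors as columns should give a matrix of full rank, i.e. a linearly independent set:

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 4.0]])   # triangular, so eigenvalues 1, 2, 4 are distinct

eigvals, vecs = np.linalg.eig(A)
print(eigvals)                      # 1, 2, 4: all distinct
print(np.linalg.matrix_rank(vecs))  # 3: the eigenvectors are linearly independent
```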