For matrix A, A*A = 0 implies A = 0

SUMMARY

In the discussion, it is established that for a square matrix A, if A^\dagger A = 0, then A must equal 0. The proof utilizes the properties of Hermitian conjugates and orthonormal bases, demonstrating that the matrix elements A_{ij} must be zero. An alternative proof method is suggested, which avoids direct computation of matrix elements by deriving a contradiction from the assumption that Av is non-zero.


jahlex
Homework Statement

Given that [itex]A[/itex] is a square matrix and [itex]A^\dagger[/itex] is its Hermitian conjugate, and that [itex]A^\dagger A = 0[/itex], show that [itex]A = 0[/itex].

The Attempt at a Solution

Let [itex]\{|i\rangle\}[/itex] be an orthonormal basis. Find the matrix elements of [itex]A[/itex] by noting that [itex]\langle j|A^\dagger|n\rangle = \langle n|A|j\rangle^*[/itex], so [itex]0 = \langle j| A^\dagger A |j \rangle = \displaystyle\sum_n \langle j|A^\dagger |n \rangle \langle n|A|j \rangle = \displaystyle\sum_n | \langle n|A|j \rangle |^2 \ge | \langle i|A|j \rangle |^2 = |A_{ij}|^2[/itex]. Hence [itex]A_{ij} = 0[/itex] for all [itex]i, j[/itex], and [itex]A = 0[/itex].
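The key identity in the attempt is that the diagonal entry [itex]\langle j|A^\dagger A|j\rangle[/itex] is the sum of squared magnitudes of the [itex]j[/itex]-th column of [itex]A[/itex], which is why it can only vanish if that whole column vanishes. A minimal NumPy check of this identity, using an arbitrary example matrix of my own choosing:

```python
import numpy as np

# An arbitrary complex matrix chosen purely for illustration.
A = np.array([[1 + 2j, 0, 3j],
              [0, -1, 2],
              [4, 1j, 0]])

# (A^dagger A)_{jj} = sum_n |A_{nj}|^2, the squared norm of column j.
AdA = A.conj().T @ A
col_norms_sq = np.sum(np.abs(A) ** 2, axis=0)

print(np.allclose(np.diag(AdA).real, col_norms_sq))  # True
```

Since each diagonal entry of [itex]A^\dagger A[/itex] is a sum of non-negative terms, [itex]A^\dagger A = 0[/itex] forces every column norm, and hence every entry of [itex]A[/itex], to zero.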

Question

Is there a way to prove the result that does not rely on finding matrix elements?
 
Yes, the proof is similar but does not require introducing a basis or computing matrix elements. Suppose [itex]Av[/itex] is non-zero for some vector [itex]v[/itex]. Then by observing
[tex]\left<v| A^{\dagger} A| v \right> = \left<Av| Av \right>[/tex]
you should be able to show that this number is both zero and non-zero, giving a contradiction.
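The identity underlying the contradiction argument, [itex]\langle v|A^\dagger A|v\rangle = \langle Av|Av\rangle = \|Av\|^2[/itex], can be checked numerically. A short sketch with a randomly generated matrix and vector (both hypothetical, just to exercise the identity):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
v = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# <v| A^dagger A |v> computed two ways:
lhs = v.conj() @ (A.conj().T @ A) @ v
rhs = np.vdot(A @ v, A @ v)   # vdot conjugates its first argument: <Av|Av>

print(np.allclose(lhs, rhs))  # True
```

If [itex]A^\dagger A = 0[/itex], the left side is zero for every [itex]v[/itex], yet the right side is [itex]\|Av\|^2 > 0[/itex] whenever [itex]Av \neq 0[/itex]; so [itex]Av = 0[/itex] for all [itex]v[/itex], i.e. [itex]A = 0[/itex].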
 
