Full Rank Matrix: Determinant Condition | Rank-Nullity Theorem


Homework Help Overview

The discussion concerns the conditions under which a given 2x2 matrix has full rank, focusing on the relationship between the determinant and the linear independence of the matrix's column vectors.

Discussion Character

  • Conceptual clarification, Assumption checking

Approaches and Questions Raised

  • Participants explore the implications of the rank-nullity theorem and the conditions for a matrix to be nonsingular. Questions arise regarding the relationship between the determinant being zero and the linear dependence of column vectors.

Discussion Status

Some participants have provided insights into the relationship between the determinant and linear independence, while others have questioned the necessity of using determinants in their reasoning. Multiple interpretations of the problem are being explored.

Contextual Notes

There is a correction regarding the matrix elements, which may affect the discussion. Participants are also considering how to approach the problem without relying on determinants.

squenshl

Homework Statement


Show that the matrix ##A## is of full rank if and only if ##ad-bc \neq 0## where $$A = \begin{bmatrix}
a & b \\
b & c
\end{bmatrix}$$

Homework Equations

The Attempt at a Solution


Suppose that the matrix ##A## is of full rank, that is, rank ##2##. Then by the rank-nullity theorem the dimension of the kernel is ##0##, so ##A## is invertible; an inverse ##A^{-1}## exists only if ##ad-bc \neq 0##, since otherwise ##A## is singular. Conversely, suppose ##ad-bc \neq 0##. Then ##A## is nonsingular, so an inverse ##A^{-1}## exists, and this occurs only when the dimension of the kernel is ##0##, that is, when the rank is ##n = 2##.
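Not part of the proof, but the claimed equivalence can be sanity-checked numerically. The following is a hypothetical Python sketch (the function name `rank_2x2` is mine, not from the thread) that computes the rank of a 2x2 matrix by Gaussian elimination, deliberately avoiding determinants, and then verifies that rank ##2## holds exactly when ##ad - bc \neq 0## over a range of small integer matrices:

```python
from fractions import Fraction
import itertools

def rank_2x2(a, b, c, d):
    """Rank of [[a, b], [c, d]] via Gaussian elimination (no determinants)."""
    rows = [[Fraction(a), Fraction(b)], [Fraction(c), Fraction(d)]]
    rank = 0
    for col in range(2):
        # Find a pivot (nonzero entry) in this column at or below the current row.
        pivot = next((r for r in range(rank, 2) if rows[r][col] != 0), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        # Eliminate entries below the pivot.
        for r in range(rank + 1, 2):
            factor = rows[r][col] / rows[rank][col]
            rows[r] = [x - factor * y for x, y in zip(rows[r], rows[rank])]
        rank += 1
    return rank

# Check the equivalence: full rank (rank 2) iff ad - bc != 0,
# over all 2x2 matrices with entries in {-2, ..., 2}.
for a, b, c, d in itertools.product(range(-2, 3), repeat=4):
    assert (rank_2x2(a, b, c, d) == 2) == (a * d - b * c != 0)
```

This is only an exhaustive check on small cases, not a proof, but it catches sign errors in the ##ad - bc## formula quickly.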
 
Your matrix has rank 2 iff the column vectors are linearly independent.
Can you show that the determinant of the matrix is zero iff the column vectors are linearly dependent?
 
Sorry $$A=\begin{bmatrix}
a & b \\
c & d
\end{bmatrix}$$ if that even matters!

How could I do this without using anything on determinants?
 
Show that ##ad - bc = 0## iff ##\vec u = (a,c)## and ##\vec v = (b,d)## are linearly dependent.
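One way the hinted equivalence can be argued directly, with no determinant machinery (a sketch; the scalar ##\lambda## is my notation): if ##\vec v = \lambda \vec u##, i.e. ##b = \lambda a## and ##d = \lambda c##, then $$ad - bc = a(\lambda c) - (\lambda a)c = 0.$$ Conversely, suppose ##ad - bc = 0## and, say, ##a \neq 0##. Setting ##\lambda = b/a## gives ##b = \lambda a## and ##d = bc/a = \lambda c##, so ##\vec v = \lambda \vec u##. The case ##c \neq 0## is symmetric, and if ##\vec u = \vec 0## the vectors are trivially dependent and ##ad - bc = 0## anyway.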
 
Cheers!
 
