Invertible Matrix Theorem: Multiple Solutions for Ax=b?

In summary, the Invertible Matrix Theorem states that for an n×n matrix A, the following conditions are equivalent: A is invertible, A has n pivot positions, det A ≠ 0, rank A = n, and the columns of A form a basis of Rn. In that case the equation Ax = b has exactly one solution for each b in Rn, and the linear transformation mapping x to Ax is a bijection from Rn to Rn. If A is not invertible, Ax = b may have no solution or infinitely many. A textbook may state this as "there exists at least one solution for each b" or as "the solution is unique," depending on which property it wants to emphasize; both follow from "exactly one."
  • #1
fk378
General question regarding the Inv. Matrix Thm:

One part of the theorem states that if an n×n matrix A is invertible, then there exists at least one solution of Ax=b for each b in Rn. Why wouldn't it be "there exists at MOST one solution for each b," since every column/row has a pivot? How could there be more than one solution for a given b if the columns span Rn?
 
  • #2
Are you sure that's what the theorem says? My book doesn't say "at least one". It says "exactly one". Here's what Wikipedia says:

http://en.wikipedia.org/wiki/Invertible_matrix_theorem said:
Let A be a square n by n matrix over a field K (for example the field R of real numbers). Then the following statements are equivalent:

A is invertible.
A is row-equivalent to the n-by-n identity matrix In.
A is column-equivalent to the n-by-n identity matrix In.
A has n pivot positions.
det A ≠ 0.
rank A = n.
The equation Ax = 0 has only the trivial solution x = 0 (i.e., Null A = {0})
The equation Ax = b has exactly one solution for each b in Rn.
The columns of A are linearly independent.
The columns of A span Rn (i.e. Col A = Rn).
The columns of A form a basis of Rn.
The linear transformation mapping x to Ax is a bijection from Rn to Rn.
There is an n by n matrix B such that AB = In.
The transpose AT is an invertible matrix.
The transpose times the matrix, AT × A, is an invertible matrix.
The number 0 is not an eigenvalue of A.
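As a quick numerical illustration (not from the original posts; the matrix here is just a made-up invertible example), several of these equivalent conditions can be checked at once with NumPy:

```python
import numpy as np

# An invertible 2x2 example: for such A, the equivalent conditions
# of the theorem hold simultaneously.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

det_A = np.linalg.det(A)            # nonzero determinant
rank_A = np.linalg.matrix_rank(A)   # rank n (here n = 2)

b = np.array([1.0, 4.0])
x = np.linalg.solve(A, b)           # the unique solution of Ax = b

print(det_A)                  # 5.0 (nonzero)
print(rank_A)                 # 2
print(np.allclose(A @ x, b))  # True: x really solves Ax = b
```

Changing b changes x, but for an invertible A there is always exactly one such x.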
 
  • #3
Oh, weird. Yeah my book does say "at least one solution". Thanks for showing me the wiki entry though.
 
  • #4
Well, the statement is still true: if a matrix A is invertible, then the equation Ax = b has exactly one solution, so it is certainly true that there is at least one solution.

If A is not invertible, then Ax = b may have no solutions or an infinite number of solutions.

Your book may have some reason for emphasizing "the solution exists" right now rather than "the solution is unique"; both are true when A is invertible.
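To see the non-invertible case concretely, here is a hypothetical singular matrix (second row twice the first) checked with NumPy; `lstsq` is used only because `solve` rejects singular matrices:

```python
import numpy as np

# Singular example: row 2 = 2 * row 1, so det A = 0 and A is not invertible.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# b in the column space: Ax = b has infinitely many solutions
# (any x with x1 + 2*x2 = 3 works).
b_consistent = np.array([3.0, 6.0])
x, *_ = np.linalg.lstsq(A, b_consistent, rcond=None)
print(np.allclose(A @ x, b_consistent))   # True: one of infinitely many solutions

# b not in the column space: Ax = b has no solution at all;
# lstsq can only minimize the residual, not eliminate it.
b_inconsistent = np.array([3.0, 0.0])
x2, *_ = np.linalg.lstsq(A, b_inconsistent, rcond=None)
print(np.allclose(A @ x2, b_inconsistent))  # False: no exact solution exists
```

So for a singular A, the count of solutions depends on b, exactly as described above.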
 
  • #5
Pyae said:
Does anyone know how to prove the following theorems:

1) Ax = b is consistent for every n x 1 matrix b

2) Ax = b has exactly one solution for every n x 1 matrix b

3) Ax = b has exactly one solution for at least one n x 1 matrix b

Pls... I need help. It's urgent!
First, don't "hijack" someone else's thread for your own question- that's rude. Use the "new topic" button to start your own thread.

Second, go back and reread the question. You can't prove any of those as stated; they are all false without some hypothesis on A. For example, if A is the zero matrix, Ax = b has NO solution for b nonzero and has an infinite number of solutions if b is 0.
 

FAQ: Invertible Matrix Theorem: Multiple Solutions for Ax=b?

What is the Invertible Matrix Theorem?

The Invertible Matrix Theorem, also known as the Fundamental Theorem of Invertible Matrices, is a theorem that collects many conditions on a square matrix, each of which is equivalent to the matrix being invertible: for example, a non-zero determinant, full rank, or columns that form a basis of Rn. Any one of these conditions can therefore be used to determine whether a matrix has an inverse.

How is the Invertible Matrix Theorem used in linear algebra?

The Invertible Matrix Theorem is used in linear algebra to determine if a matrix has an inverse, which is crucial in solving systems of linear equations. It is also used to find the inverse of a matrix, which is necessary for various applications such as finding the solution to a system of linear equations and solving differential equations.
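As a practical note (an illustration, not part of the original FAQ): even when A is invertible, numerical libraries usually solve Ax = b by factoring A rather than by forming the explicit inverse, which is both faster and more numerically stable. Both routes give the same x:

```python
import numpy as np

# Invertible example system: 3x + y = 9, x + 2y = 8.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x_solve = np.linalg.solve(A, b)   # factor-and-solve: preferred in practice
x_inv = np.linalg.inv(A) @ b      # mathematically the same x = A^{-1} b

print(np.allclose(x_solve, x_inv))  # True
print(np.allclose(A @ x_solve, b))  # True
```

Here `np.linalg.solve` would raise an error for a singular A, which is itself a use of the theorem: solvability for every b is equivalent to invertibility.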

What is the importance of the Invertible Matrix Theorem?

The Invertible Matrix Theorem is important in mathematics and other fields such as physics, engineering, and computer science. It allows us to determine if a matrix has an inverse and helps us solve systems of linear equations, which are used in various real-world applications. Additionally, it provides a deeper understanding of matrix operations and properties.

What are the conditions for a matrix to be invertible according to the Invertible Matrix Theorem?

The theorem gives many equivalent conditions. A square n×n matrix A is invertible if and only if, for example, det A ≠ 0, or rank A = n, or the columns of A are linearly independent, or Ax = 0 has only the trivial solution. Note that the matrix must be square, meaning the number of rows equals the number of columns, for any of these conditions to apply.

How can the Invertible Matrix Theorem be proven?

The Invertible Matrix Theorem is usually proven by establishing a cycle of implications among the equivalent statements, so that each condition implies the next and the last implies the first. For example, one shows that a non-zero determinant implies the matrix row-reduces to the identity, which yields an explicit inverse; conversely, if an inverse exists, then det A · det A⁻¹ = det I = 1, so det A ≠ 0. Chaining all such implications together proves the theorem as a whole.
