# Linear Algebra problem (linear equations)

## Homework Statement

Given augmented matrix

$$\left(\begin{array}{ccc}a&1&-1\\2&1&b\end{array}\right)$$

list the conditions on a and b under which the system has:

i) no solution
ii) infinitely many solutions and
iii) a unique solution

## The Attempt at a Solution

I row-reduced the matrix to

$$\left(\begin{array}{ccc}1&0&\frac{b+1}{2-a}\\0&1&\frac{-2-ab}{2-a}\end{array}\right)$$
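As a sanity check on the algebra, here is a quick symbolic verification (a sketch using SymPy; note that `solve` returns the generic solution, which implicitly assumes $$a \neq 2$$):

```python
from sympy import symbols, solve, simplify

a, b, x, y = symbols('a b x y')

# The augmented matrix encodes the system  a*x + y = -1,  2*x + y = b.
sol = solve([a*x + y + 1, 2*x + y - b], [x, y], dict=True)[0]

# Compare against the entries obtained by hand (valid only when a != 2).
assert simplify(sol[x] - (b + 1)/(2 - a)) == 0
assert simplify(sol[y] - (-2 - a*b)/(2 - a)) == 0
```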

and ended up with

i) For no solution, I got a = 0 or a = 2 (since one of the row-reduction steps produced a $$\frac{1}{a}$$ in one of the entries). My textbook, however, says the conditions for no solution are a = 2 and $$b \neq -1$$. Why $$b \neq -1$$, and why not a = 0?

I also didn't get separate conditions for a unique solution versus infinitely many solutions.

I only got that $$x_1 = \frac{b+1}{2-a}$$ and $$x_2 = \frac{-2 - ab}{2-a},$$
which is the correct unique solution, except that the book places restrictions on a and b, namely a = 2 and b = -1. Where do these restrictions come from, and wouldn't a = 2 give no solution? Also, how would one get infinitely many solutions in this case? Thanks.
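For what it's worth, setting a = 2 makes both left-hand sides identical (2x + y), so consistency then hinges entirely on b. A quick SymPy check of the two sub-cases (a sketch, with b = 0 standing in for an arbitrary $$b \neq -1$$):

```python
from sympy import linsolve, symbols, S

x, y = symbols('x y')

# With a = 2 the system is:  2x + y = -1  and  2x + y = b.

# Representative b != -1 (here b = 0): contradictory right-hand sides.
no_sol = linsolve([2*x + y + 1, 2*x + y - 0], [x, y])
assert no_sol == S.EmptySet  # no solution

# b = -1: the two rows coincide, leaving one equation in two unknowns.
inf_sol = linsolve([2*x + y + 1, 2*x + y + 1], [x, y])
assert inf_sol != S.EmptySet  # a one-parameter family of solutions
```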
