In my homework problem, I am supposed to determine whether a given vector belongs to the row space of a matrix. What I did was find a basis for the row space; its dimension turned out to be one less than the number of rows of the original matrix. So, in place of the linearly dependent row I put the candidate vector, and if the resulting system turns out to be inconsistent, I conclude the vector does not belong to the row space. Is that a correct approach?

Also, if I instead set up the dependency equation for the basis rows together with the new vector, would that give the same result? When I tried it, I got no solutions at all, and I'm not sure what that means, since for linear independence there has to be exactly one solution: the trivial one. Thanks in advance.

P.S. I decided to post the problem itself after all. The matrix is

2 1 3 1
1 1 3 0
0 1 2 1
3 3 8 2

and I need to determine whether X = [4, 1, 2, 5] and Y = [1, 2, 3, 4] belong to its row space. The answer is that X does, but Y does not.
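As a quick sanity check on the stated answer (this is my own sketch, not part of the original question): a vector v lies in the row space of A exactly when appending v as an extra row does not increase the rank. Using NumPy's `matrix_rank`:

```python
import numpy as np

# The matrix from the problem
A = np.array([[2, 1, 3, 1],
              [1, 1, 3, 0],
              [0, 1, 2, 1],
              [3, 3, 8, 2]])

X = np.array([4, 1, 2, 5])
Y = np.array([1, 2, 3, 4])

def in_row_space(A, v):
    # v is in the row space of A iff stacking v onto A leaves the rank unchanged
    return np.linalg.matrix_rank(np.vstack([A, v])) == np.linalg.matrix_rank(A)

print(in_row_space(A, X))  # True:  X is in the row space
print(in_row_space(A, Y))  # False: Y is not
```

This agrees with the given answer: here rank(A) = 3 (the last row is the sum of the first three), adding X keeps the rank at 3, and adding Y raises it to 4.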