Just perform Gauss-Jordan elimination on the matrix; the answer can be read off immediately from the resulting matrix.
Rank is the number of linearly independent vectors of a matrix.
Dimension is the number of linearly independent vectors of a vector space.
So the rank r should be the dimension of the solution space, shouldn't it?
Ok, the rows of an m × n matrix A can be considered as vectors in R^n and the columns as vectors in R^m. The row vectors then span a subspace of R^n called the row space of A, and the column vectors span a subspace of R^m called the column space of A.
It turns out (and we will actually show) that the row space and the column space of A have the same dimension, r, and we define the rank of A to be this common dimension, that is: rank A = r. Clearly, rank A ≤ min(m,n).
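As a quick numerical check of this claim, here is a small sketch using sympy (assumed available; the matrix is an arbitrary toy example whose third row is the sum of the first two):

```python
import sympy as sp

# A 3x4 matrix whose third row is the sum of the first two,
# so the rank is 2 rather than the maximum possible 3.
A = sp.Matrix([
    [1, 2, 0, 1],
    [2, 4, 1, 3],
    [3, 6, 1, 4],
])
r = A.rank()

# The row space and the column space have the same dimension r,
# and r cannot exceed min(m, n).
assert len(A.rowspace()) == len(A.columnspace()) == r
assert r <= min(A.shape)
print(r)
```

Here rowspace() and columnspace() each return a list of basis vectors, so their lengths are the two dimensions being compared.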
A solution x of the homogeneous system Ax = 0, where 0 lies in R^m, is then a vector in R^n. The set of these solution vectors x is in fact a subspace of R^n, called the null space of A.
Now, if we perform elementary row operations on A, obtaining a new matrix A', the null space is unchanged (which is the entire point of performing such operations). Another way to express this is to say that the linear relations between the columns are unchanged by such operations: if some column in A is a linear combination of some of the others, then the same is true for the corresponding columns in A', with the same coefficients in the linear combination, and if some columns in A are linearly independent, so are the corresponding columns in A'. In particular, if we select a basis for the column space among the columns of A, the corresponding columns of A' form a basis for its column space. So, although the column space itself changes under elementary row operations, the dimension of the column space does not, that is: rank A = rank A'.
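The invariance of column relations can be seen concretely. In this sketch (sympy assumed available), the third column is built as c1 + 2·c2, and the very same relation holds among the columns of the row-reduced matrix:

```python
import sympy as sp

# By construction, column 3 = column 1 + 2 * column 2 (a toy example).
A = sp.Matrix([
    [1, 0, 1],
    [0, 1, 2],
    [1, 1, 3],
])

# Row-reduce; rref() returns the reduced matrix and the pivot column indices.
R, pivots = A.rref()

# The same linear relation among columns holds after row reduction,
# with the same coefficients 1 and 2.
assert R[:, 2] == R[:, 0] + 2 * R[:, 1]
print(pivots)
```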
It is also easy to see that the row space does not change at all with elementary row operations: every new row is a linear combination of the old rows, and vice versa.
Now, if we perform Gauss-Jordan elimination on A so that A' has reduced row echelon form, then rank A' = rank A = r (unchanged) and the null space of A' and its dimension are the same as for A. It is easy to see that the pivot rows of A' constitute a basis for the row space of A', and that the pivot columns of A' constitute a basis for the column space of A'. Since the pivot rows and the pivot columns of A' are equally many (one of each for each pivot element), this gives a proof that the row space and the column space of A have the same dimension (since this holds for A'), namely, rank A = rank A' = r.
Now, from A', we can immediately write down the solutions of A'x = 0, which are the same as the solutions of Ax = 0: the variables corresponding to the non-pivot columns of A' are chosen freely as parameters, or equivalently as coefficients in linear combinations of basis vectors for the null space of A' (these basis vectors are read off from the solutions; textbooks usually contain examples of this), and hence of A. Since there are r pivot columns, there are n - r non-pivot columns, and therefore n - r basis vectors for the null space, so the dimension of the common null space of A' and A is n - r.
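The count n - r can be verified directly: sympy's nullspace() produces one basis vector per free (non-pivot) column (a sketch on an assumed matrix already in reduced row echelon form, with pivots in columns 1 and 3):

```python
import sympy as sp

# A matrix already in reduced row echelon form:
# pivot columns 0 and 2, free columns 1 and 3.
A = sp.Matrix([
    [1, 2, 0, 3],
    [0, 0, 1, -2],
])
m, n = A.shape
r = A.rank()

# One basis vector of the null space per free column: dimension n - r.
basis = A.nullspace()
assert len(basis) == n - r

# Each basis vector really does solve Ax = 0.
for v in basis:
    assert A * v == sp.zeros(m, 1)
print(n - r)
```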