Screwdriver said:
I can conclude that the columns must be linearly dependent. I know that the rank is the number of linearly independent columns, so if I've got k of those but n columns in total, with k < n, the leftover column(s) will have to be non-trivial linear combinations of the others. That then implies that Ax = 0 has a non-trivial solution x as well, right?
Yes, essentially.
You mentioned in the OP something about a "rank-nullity" theorem.
I'm not sure exactly what you meant about the "rank-nullity" theorem (it states that rank(A) + dim ker(A) = n, the number of columns, so rank(A) < n already forces a non-trivial kernel). But yes: the rank of a matrix is the number of linearly independent columns (or rows), so if the number of rows is k, then you must have rank(A) <= k. And if you know k < n (strictly less than), where n is the number of columns, then you can conclude that the columns of the matrix must be linearly dependent.
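Here's a minimal numerical sketch of exactly this (assuming NumPy is available; the matrix A below is just a made-up 2x3 example): a matrix with k = 2 rows and n = 3 columns has rank at most 2 < 3, so Ax = 0 must have a non-trivial solution, which we can read off from the SVD.

```python
import numpy as np

# Made-up example: k = 2 rows, n = 3 columns, so rank(A) <= k < n.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

print("rank(A) =", np.linalg.matrix_rank(A))  # 2, strictly less than n = 3

# Right-singular vectors belonging to zero singular values span the
# null space; since rank(A) < n, at least one such vector exists.
_, s, Vt = np.linalg.svd(A)
x = Vt[-1]                # a null-space direction, i.e. a non-trivial x
print("x =", x)
print("A @ x =", A @ x)   # numerically ~ 0: a non-trivial solution of Ax = 0
```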
You can also think of it as follows: suppose you are working in an n-dimensional vector space, and you have a basis (x_1, ..., x_n) consisting of n vectors. That means that if we have:
a_1x_1 + ... + a_nx_n = 0
then necessarily
a_1 = ... = a_n = 0
Now take another vector x_0 from this same vector space, and consider the linear combination:
a_0x_0 + a_1x_1 + ... + a_nx_n = 0
Suppose a_0 = 0. Then necessarily a_1 = ... = a_n = 0, since the basis is linearly independent, so the relation is the trivial one.
Suppose now that a_0 \neq 0. Then we can rewrite this equation as:
x_0 = -\left(\frac{a_1}{a_0}x_1 + \dots + \frac{a_n}{a_0}x_n\right)
So whenever a_0 \neq 0, x_0 is a linear combination of the basis vectors. And since the basis spans the whole space, such an expression for x_0 always exists, which gives a relation with a_0 \neq 0, i.e. a non-trivial one.
In other words, the set consisting of x_0 together with the basis vectors is linearly dependent: you can never have more than n linearly independent vectors in an n-dimensional space.
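The same argument can be checked numerically (again just a sketch assuming NumPy; the basis and x_0 below are arbitrary choices): express x_0 in the basis and rearrange into a non-trivial dependence relation.

```python
import numpy as np

# Columns of B are an arbitrarily chosen basis (x_1, x_2, x_3) of R^3:
# x_1 = (1, 0, 1), x_2 = (1, 1, 0), x_3 = (0, 1, 1); det(B) = 2 != 0.
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

x_0 = np.array([2.0, 3.0, 4.0])   # any fourth vector in R^3

# Solve B c = x_0, i.e. x_0 = c_1 x_1 + c_2 x_2 + c_3 x_3; a solution
# always exists because the basis spans R^3.
c = np.linalg.solve(B, x_0)
print("coefficients:", c)

# Rearranged: 1 * x_0 - c_1 x_1 - c_2 x_2 - c_3 x_3 = 0.  The coefficient
# of x_0 is 1, so the relation is non-trivial: the four vectors are
# linearly dependent.
print("residual:", x_0 - B @ c)   # numerically ~ 0
```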