Thanks for reaching out for help with this problem! First, let's define the column spaces of A, B, and C. The column space of a matrix is the span of its column vectors, i.e., the set of all linear combinations of its columns; the same definition applies to each of A, B, and C.
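Written out symbolically (assuming A is an m × n matrix with columns a_1, ..., a_n, which is just notation I'm introducing for illustration):

$$
\operatorname{col}(A) = \{\, Av : v \in \mathbb{R}^n \,\} = \operatorname{span}\{a_1, \dots, a_n\}.
$$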
To show that the column space of C is a subspace of the column space of A, it suffices to show that every vector in the column space of C also lies in the column space of A (the column space of C is automatically a subspace, being a span). So let y be a vector in the column space of C. This means there exists a vector x such that y = Cx. Since C = AB, we have y = (AB)x = A(Bx), so y is the image under A of the vector Bx. This shows that y is in the column space of A. Since y was arbitrary, the column space of C is a subspace of the column space of A.
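As a quick sanity check (an illustrative example of mine, not part of the proof), note that the containment can be proper:

$$
A = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \qquad
B = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \qquad
C = AB = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}.
$$

Here $\operatorname{col}(C) = \operatorname{span}\{(1,0)^{\mathsf T}\}$ is a one-dimensional subspace of $\operatorname{col}(A) = \mathbb{R}^2$.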
For (ii), we can use the fact that the rank of a matrix equals the dimension of its column space, so rank(C) = dim(col(C)) and rank(A) = dim(col(A)). By part (i), col(C) is a subspace of col(A), and a subspace can never have larger dimension than the space containing it, so rank(C) ≤ rank(A). For the other bound, observe that C^T = B^T A^T, so applying part (i) to this product gives col(C^T) ⊆ col(B^T); since rank(M) = rank(M^T) for any matrix M, this yields rank(C) ≤ rank(B). Combining the two bounds, rank(C) ≤ min{rank(A), rank(B)}, as desired.
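If you'd like to convince yourself numerically, here is a minimal sketch using NumPy (the matrix sizes and random seed are arbitrary choices of mine, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random 4x3 and 3x5 matrices, so C = AB is 4x5.
A = rng.standard_normal((4, 3))
B = rng.standard_normal((3, 5))
C = A @ B

rank_A = np.linalg.matrix_rank(A)
rank_B = np.linalg.matrix_rank(B)
rank_C = np.linalg.matrix_rank(C)

# Part (ii): rank(C) <= min(rank(A), rank(B)).
print(rank_A, rank_B, rank_C)
assert rank_C <= min(rank_A, rank_B)
```

Running this with other shapes or seeds will never violate the assertion, which is exactly what part (ii) guarantees.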
I hope this helps clarify the proof for you. Keep practicing and don't hesitate to reach out for help if you need it in the future!