Office_Shredder said:
The first thing that should be pointed out, I think, is that you aren't changing the description of a vector space in general, just giving a different way of describing vector subspaces of R^n. Obviously you can't define a vector space in general as the kernel of a linear transformation, because a linear transformation doesn't make sense if you don't know what a vector space is already.
Anyway, back to the task at hand: if my matrix B has rows b1, ..., bm, then Bv = 0 means that v is perpendicular to each row of B. So a matrix for which B vj = 0 for each j has rows which are perpendicular to every vj. Then, to make sure the kernel is not a larger subspace than you want, you need to make sure B has enough rows in it. I'll let you think a bit about what procedure you can use to generate B algorithmically (it only requires a standard linear algebra result).
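To make the perpendicularity remark explicit (this is just an expansion of the quoted claim, writing b_1, ..., b_m for the rows of B):

$$Bv = \begin{pmatrix} b_1 v \\ \vdots \\ b_m v \end{pmatrix} = \begin{pmatrix} b_1 \cdot v \\ \vdots \\ b_m \cdot v \end{pmatrix},$$

so Bv = 0 holds exactly when v is orthogonal to every row b_i.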
Is this proof correct? I have found a matrix whose kernel contains the spanning set, but I do not yet see how to prove that it does not contain any other vectors.
I assume that the vectors v1, ..., vk are linearly independent.
If b is a row vector in B, then we must have

b*v1 = 0
b*v2 = 0
...
b*vk = 0,

where * is matrix multiplication (not the dot product). We can transpose each equation to get

(v1)'*b' = 0
...
(vk)'*b' = 0.
And we can put it all together in a matrix: let V be the k x n matrix whose rows are (v1)', (v2)', ..., (vk)'. Then the system above becomes

V * b' = 0 (the zero vector in R^k).
When we solve this we get n-k linearly independent solutions b', because the rank of V is k (its rows (v1)', ..., (vk)' are linearly independent), and the rank-nullity theorem gives dim Null(V) = n-k.
So we can put these row vectors b together to form B, which is then (n-k) x n. And we have B*x = 0 whenever x is in span{v1,...,vk}.
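As a numerical sanity check on this construction, here is a minimal sketch (assuming NumPy and SciPy are available; scipy.linalg.null_space returns an orthonormal basis of the null space as columns, and the example vectors are made up just for illustration):

```python
import numpy as np
from scipy.linalg import null_space

# Two linearly independent spanning vectors in R^4 (k = 2, n = 4),
# chosen only for illustration.
v1 = np.array([1.0, 2.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0, 3.0])

# V has the (transposed) v_j as its rows, so V b' = 0 is the system above.
V = np.vstack([v1, v2])          # shape (k, n) = (2, 4)

# The columns of null_space(V) are n - k = 2 independent solutions b';
# transposing stacks them as the rows of B, an (n-k) x n matrix.
B = null_space(V).T              # shape (2, 4)

# Every v_j should lie in the kernel of B (up to floating-point error).
print(np.allclose(B @ V.T, 0))   # True
```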
But we also have to show that if a vector x is not in span{v1,...,vk}, then Bx is not 0.
To show this I can use a contrapositive proof, that is: if Bx = 0, then x is in span{v1,...,vk}.
This last part is where I get stuck; do you see how to proceed?
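One idea (I am not sure if it works): since the n-k rows of B found above are linearly independent,

$$\operatorname{rank}(B) = n-k \quad\Longrightarrow\quad \dim\ker B = n - (n-k) = k,$$

and span{v1,...,vk} is a k-dimensional subspace contained in ker B, so the two subspaces would have to be equal, which would force x to be in span{v1,...,vk} whenever Bx = 0.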