GridironCPJ
I don't understand how you're supposed to prove this:
Let A = uv^T (v^T = v transpose), where u is in R^m and v is in R^n. Prove that the 2-norm of A equals the 2-norm of v times the 2-norm of u.
I'm not sure whether I'm supposed to treat u and v as plain vectors or as something else. If they are just vectors, this doesn't make sense to me. I'm assuming the only way this is even possible is if v is a collection of m different vectors, each of length n, which would just make v a matrix. Am I missing something here?
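In case it helps, here is a sketch of one standard way to read the statement, with u and v as ordinary column vectors (so A = uv^T is an m-by-n rank-one matrix) and the 2-norm of A taken as the induced operator norm; that interpretation of the norms is an assumption, not something spelled out in the problem:

\[
\|Ax\|_2 \;=\; \|u\,(v^{T}x)\|_2 \;=\; |v^{T}x|\,\|u\|_2 \;\le\; \|u\|_2\,\|v\|_2\,\|x\|_2
\quad\text{for every } x \in \mathbb{R}^n \ \text{(Cauchy--Schwarz)},
\]

so \(\|A\|_2 \le \|u\|_2\,\|v\|_2\). Taking \(x = v\) (assuming \(v \ne 0\); the case \(v = 0\) is trivial) gives \(\|Av\|_2 = \|v\|_2^{2}\,\|u\|_2 = \big(\|u\|_2\,\|v\|_2\big)\|v\|_2\), so the bound is attained and \(\|A\|_2 = \|u\|_2\,\|v\|_2\).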