**1. The problem statement, all variables and given/known data**

Prove that two vectors are linearly dependent if and only if one is a scalar multiple of the other.

**2. Relevant equations**

**3. The attempt at a solution**

This seems at first glance to be a fairly easy proof:

Part I: Assume that vectors u and v are linearly dependent.

Then c1*u + c2*v = 0, where c1 and c2 are not both 0.

Then u = -(c2/c1)*v

and v = -(c1/c2)*u. But this doesn't make sense to me, because what if one of c1 or c2 does equal zero?
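One way that worry could be handled (this case split is my own sketch, not part of the attempt above) is to divide only by whichever coefficient is known to be nonzero:

```latex
% Sketch: assume c_1 u + c_2 v = 0 with (c_1, c_2) \neq (0, 0),
% and split on which coefficient is nonzero.
\begin{align*}
\text{If } c_1 \neq 0:&\quad u = -\tfrac{c_2}{c_1}\, v, \\
\text{If } c_1 = 0 \text{ (so } c_2 \neq 0\text{)}:&\quad v = -\tfrac{c_1}{c_2}\, u = 0 \cdot u.
\end{align*}
```

Either way, one vector ends up written as a scalar multiple of the other.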

Part II: Assume that u = av and v = bu, where a and b are constants.

Then u - av = 0, where the coefficient of u is 1, and v - bu = 0, where the coefficient of v is 1.

Therefore u and v are linearly dependent.
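To convince myself the claim holds numerically, here is a small Python sketch (the helper name and tolerance are my own choices, not from the problem) that checks whether one of two vectors is a scalar multiple of the other:

```python
def is_scalar_multiple(u, v):
    """Return True if u = a*v or v = b*u for some scalar a or b."""
    # Try both directions: x as a scalar multiple of y.
    for x, y in ((u, v), (v, u)):
        if all(c == 0 for c in y):
            continue  # can't read a ratio off the zero vector
        # Read the candidate scalar from the first nonzero entry of y.
        i = next(k for k, c in enumerate(y) if c != 0)
        a = x[i] / y[i]
        if all(abs(x[k] - a * y[k]) < 1e-9 for k in range(len(x))):
            return True
    # Both vectors zero: each is 0 times the other.
    return all(c == 0 for c in u) and all(c == 0 for c in v)

print(is_scalar_multiple([2, 4], [1, 2]))   # dependent pair
print(is_scalar_multiple([1, 0], [0, 1]))   # independent pair
print(is_scalar_multiple([0, 0], [3, 5]))   # zero vector case
```

The zero-vector branch matters for the same reason as the c1 = 0 worry in Part I: the zero vector is always a scalar multiple (0 times) of any vector.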

I'm struggling a bit with linear algebra proofs, so any critique or suggestions that anyone could offer would be greatly appreciated.