1. The problem statement, all variables and given/known data

If c = |a|b + |b|a, where a, b, and c are all nonzero vectors, show that c bisects the angle between a and b.

2. Relevant equations

Angle between a and b: cos⁻¹((a·b)/(|a||b|))
Angle between a and c: cos⁻¹((a·c)/(|a||c|))
If c bisects the angle, the angle between a and c is half of the angle between a and b.

3. The attempt at a solution

Given c = |a|b + |b|a, I substitute this into the equation for the angle between a and c. I eventually get (|a||b|)(|a|b + |b|a)² = (2|a||c|)(a·b). Is this right?

I would also like to confirm:
- whether a·a is always equal to |a|²
- how to tell absolute value and magnitude apart, since they use the same symbol
- when I get (2|a||c|)(a·b), do I do regular multiplication or use the dot product?
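As a quick way to check my intermediate results, here is a small numerical sketch (using NumPy, with arbitrary example vectors I picked myself) that tests whether a·a = |a|² and whether c = |a|b + |b|a really makes equal angles with a and b:

```python
import numpy as np

# Arbitrary example vectors (any nonzero a, b should do)
a = np.array([3.0, 1.0, -2.0])
b = np.array([1.0, 4.0, 2.0])

# Check that a . a equals |a|^2  (both should print 14.0 here)
print(np.dot(a, a), np.linalg.norm(a) ** 2)

# Build c = |a| b + |b| a
c = np.linalg.norm(a) * b + np.linalg.norm(b) * a

def angle(u, v):
    """Angle between vectors u and v, via cos^-1((u . v) / (|u| |v|))."""
    return np.arccos(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# If c bisects the angle between a and b, the first two values are equal,
# and each equals half of the angle between a and b.
print(angle(a, c), angle(b, c), angle(a, b) / 2)
```

Note that in this check |a|, |b|, |c| (from np.linalg.norm) are ordinary scalars, so products like |a||b| or 2|a||c| are regular multiplication; the dot product only appears between the vectors themselves.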