
Calpalned


## Homework Statement

If c = |a|b + |b|a, where a, b, and c are all nonzero vectors, show that c bisects the angle between a and b.

## Homework Equations

Angle between a & b is cos^{-1}[(a dot b)/(|a||b|)]

Angle between a & c is cos^{-1}[(a dot c)/(|a||c|)]

The angle between a and c is half the angle between a and b.
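As a sanity check on these formulas, here is a minimal numerical sketch (the example vectors are arbitrary, chosen only for illustration) that computes the angles with cos^{-1}[(u dot v)/(|u||v|)] and confirms that c = |a|b + |b|a makes equal angles with a and b:

```python
import math

def dot(u, v):
    # componentwise dot product of two vectors given as tuples
    return sum(x * y for x, y in zip(u, v))

def norm(u):
    # magnitude |u| = sqrt(u . u)
    return math.sqrt(dot(u, u))

def angle(u, v):
    # angle between u and v via cos^{-1}[(u . v)/(|u||v|)]
    return math.acos(dot(u, v) / (norm(u) * norm(v)))

# arbitrary example vectors (not from the problem statement)
a = (3.0, 0.0, 0.0)
b = (1.0, 2.0, 2.0)

# c = |a|b + |b|a
c = tuple(norm(a) * bi + norm(b) * ai for ai, bi in zip(a, b))

# c bisects the angle: the two half-angles agree and sum to the full angle
print(math.isclose(angle(a, c), angle(b, c)))                  # True
print(math.isclose(angle(a, c) + angle(b, c), angle(a, b)))    # True
```

If the two printed checks hold for any nonzero a and b you try, that is numerical evidence for the bisection claim; the algebraic proof still requires showing (a dot c)/(|a||c|) = (b dot c)/(|b||c|) symbolically.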

## The Attempt at a Solution

Given c = |a|b + |b|a, I plug that into the equation for the angle between a and c. I eventually get (|a||b|)(|a|b + |b|a)^{2} = (2|a||c|)(a dot b). Is this right? I would also like to confirm:

- whether (a dot a) is always equal to |a|^2

- how to tell absolute value and magnitude apart, since both use the same vertical-bar symbol

- when I get (2|a||c|)(a dot b), whether I use ordinary multiplication or the dot product
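A quick numerical check of the first question above (the example vector is arbitrary, chosen only for illustration):

```python
import math

a = (1.5, -2.0, 4.0)  # arbitrary example vector

# a . a computed componentwise
dot_aa = sum(x * x for x in a)

# |a| is the Euclidean magnitude, so |a|^2 should reproduce a . a
mag_a = math.sqrt(dot_aa)

print(math.isclose(dot_aa, mag_a ** 2))  # True
```

This reflects the definition |a| = sqrt(a dot a), so a dot a = |a|^2 holds for every vector a.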