HF08
Let $a, b \in \mathbb{R}^m$, $b \neq 0$, and set $\phi(t) = a + tb$.
We must show that the angle between $\phi(t_1) - \phi(t_0)$ and $\phi(t_2) - \phi(t_0)$
is $0$ or $\pi$ for any $t_0, t_1, t_2 \in \mathbb{R}$ with $t_1, t_2 \neq t_0$.
The 0, 1, and 2 are supposed to be subscripts next to $t$, but my LaTeX is showing them as superscripts. I don't know why.
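I am assuming "the angle" between two nonzero vectors $u, v \in \mathbb{R}^m$ means the usual $\theta \in [0, \pi]$ defined through the dot product:

\[
\cos\theta = \frac{\langle u, v \rangle}{|u|\,|v|}.
\]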
Here is my work:
I don't know where to start. My guess is that we start from $\phi(t) = a + tb$; see the substitution below. Do we use the law of cosines? How can I even begin, from what look like arbitrary definitions and arbitrary values, to show that the angle is either $0$ or $\pi$? Please
help me. It isn't lack of trying; I need to attack this problem, but I feel more like I am drowning.
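The only concrete computation I have is substituting the definition of $\phi$ into the two differences (not sure whether this helps):

\[
\phi(t_1) - \phi(t_0) = (a + t_1 b) - (a + t_0 b) = (t_1 - t_0)\,b,
\qquad
\phi(t_2) - \phi(t_0) = (t_2 - t_0)\,b.
\]

So both vectors are scalar multiples of $b$, but I don't see how to get from that to the angle being $0$ or $\pi$.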
Thank You,
HF08