Nikitin
Hi! Please help:
a = vector a
b = vector b
Prove that |a|*b + |b|*a bisects the angle between two arbitrary (nonzero) vectors a and b.
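As a numeric sanity check (not a proof), one can verify the claimed bisection property on concrete vectors; the vector names below just mirror the problem statement:

```python
import math
import random

def angle(u, v):
    """Angle between 2-D vectors u and v, in radians."""
    dot = u[0] * v[0] + u[1] * v[1]
    return math.acos(dot / (math.hypot(*u) * math.hypot(*v)))

random.seed(0)
a = (random.uniform(-5, 5), random.uniform(-5, 5))
b = (random.uniform(-5, 5), random.uniform(-5, 5))

la, lb = math.hypot(*a), math.hypot(*b)
# c = |a|*b + |b|*a, the claimed angle bisector
c = (la * b[0] + lb * a[0], la * b[1] + lb * a[1])

# If c bisects the angle between a and b, then angle(a, c) == angle(c, b).
assert math.isclose(angle(a, c), angle(c, b), abs_tol=1e-9)
print("bisection check passed")
```

Running this with other random seeds (avoiding the degenerate case where a and b point in exactly opposite directions, which makes c the zero vector) gives the same result.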
OK guys, I need help proving this. I'm only at what is equivalent to US high-school level (I'm 17 years old), so no advanced math or computer math, please.
Here is what I have so far: from |a|*b + |b|*a we can see that |a|*b and |b|*a will always be equally long, even though they generally point in different directions.
BUT, what I must prove for my argument to be legitimate is this: |(|a|*b)| = |(|b|*a)|, i.e. that these two vectors are always equally long.
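The missing step follows in one line from the fact that a norm is homogeneous: for a scalar t and a vector v, |t v| = |t| |v|. A sketch, writing |a| and |b| for the (nonnegative) lengths:

```latex
\bigl\lvert\, |\mathbf a|\,\mathbf b \,\bigr\rvert
  = |\mathbf a|\,|\mathbf b|
  = |\mathbf b|\,|\mathbf a|
  = \bigl\lvert\, |\mathbf b|\,\mathbf a \,\bigr\rvert
```

With the two summands equally long, their sum is the diagonal of a rhombus, and a diagonal of a rhombus bisects the angle at its vertex.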