masterofthewave124
I have to prove that the diagonals of a rhombus intersect at right angles, using the scalar (dot) product.
I set up a Cartesian coordinate system with A at the origin. Labelling the vertices sequentially clockwise, I let B = (a, b), C = (a + c, b), and D = (c, 0). I then thought that if I formed the vectors AC and DB and took their dot product, I would get zero. Unfortunately that's not the case, so where did I go wrong?
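For reference, here is a quick symbolic check of this setup (using sympy, which is my addition, not part of the original working; the variable names match the coordinates above). It shows what the dot product of the diagonals comes out to in terms of a, b, c before any rhombus condition is imposed:

```python
import sympy as sp

a, b, c = sp.symbols('a b c', real=True)

# Vertices as set up above: A at the origin, going clockwise.
A = sp.Matrix([0, 0])
B = sp.Matrix([a, b])
C = sp.Matrix([a + c, b])
D = sp.Matrix([c, 0])

# Diagonal vectors AC and DB.
AC = C - A          # (a + c, b)
DB = B - D          # (a - c, b)

# Dot product: (a + c)(a - c) + b*b = a^2 + b^2 - c^2.
dot = sp.expand(AC.dot(DB))
print(dot)
```

The result is a² + b² − c², which is not identically zero for arbitrary a, b, c; it only vanishes once the defining property of a rhombus (all four sides equal, so |AB| = |AD|, i.e. a² + b² = c²) is used.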