Find X Coord of Point A in Dot Product Homework

AI Thread Summary
To find the X coordinate of point A, the discussion centers on the Pythagorean theorem, a^2 + b^2 = c^2 (here a and b are the legs of a right triangle and c its hypotenuse, not the points A and B). The user proposes computing the hypotenuse, the diagonal distance from point B to point A, from the known Y coordinate of point A and a presumed second leg, then solving for the unknown X coordinate from the two known sides. A reply suggests an alternative: use the scalar (dot) product of vectors A and B to relate the angle between them to the unknown coordinate. The thread centers on confirming whether the proposed calculation is correct and on applying vector principles clearly to solve for the unknown coordinate.
ptnguyen
Attachments: IMG_3999.JPG, IMG_4003.JPG (images of the homework problem)
ptnguyen said:
I'm thinking of 31^2 + (-231.125)^2 = hypotenuse^2, where the hypotenuse is the diagonal from B to A. Then I could solve for x since I have two sides of the triangle. But I'm not sure if it's correct.
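The arithmetic in the quote can be checked directly. A minimal sketch, assuming 31 and -231.125 are the two known legs of the right triangle (these values come from the post; their meaning is not confirmed by the attachments):

```python
import math

# Assumed legs of the right triangle, taken from the post
leg_1 = 31.0
leg_2 = -231.125

# Pythagorean theorem: hypotenuse^2 = leg_1^2 + leg_2^2
# (the sign of a leg does not matter, since it is squared)
hypotenuse = math.sqrt(leg_1**2 + leg_2**2)
print(hypotenuse)  # about 233.19
```

Note that the squaring must apply to the whole value, i.e. (-231.125)^2, not -(231.125^2); the latter would make the sum under the square root negative.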

What is the angle between the vectors A and B?
Say the angle is theta; then one could use the scalar product of the two vectors to proceed towards finding x, no?
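The suggested scalar-product relation is A · B = |A| |B| cos(theta). A minimal sketch of how the angle follows from the dot product (the vectors below are hypothetical placeholders, not the homework values):

```python
import math

def angle_between(a, b):
    """Angle in radians between 2D vectors a and b, using the
    scalar product identity: a . b = |a| |b| cos(theta)."""
    dot = a[0] * b[0] + a[1] * b[1]
    mag_a = math.hypot(*a)
    mag_b = math.hypot(*b)
    return math.acos(dot / (mag_a * mag_b))

# Illustrative only: perpendicular unit vectors give 90 degrees
theta = angle_between((1.0, 0.0), (0.0, 1.0))
print(math.degrees(theta))  # 90.0
```

If theta is known, the same identity can be read the other way: with A = (x, A_y) and B fully known, A · B = |A| |B| cos(theta) becomes one equation in the single unknown x, which is the route the reply hints at.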
 