emma3001
If a, b, and c are the x, y, and z intercepts of a plane, respectively, and d is the distance from the origin to the plane, prove that:
1/d^2 = 1/a^2 + 1/b^2 + 1/c^2
I defined three points:
A (a, 0, 0)
B (0, b, 0)
C (0, 0, c)
Then I formed the vectors AB = [-a, b, 0] and BC = [0, -b, c] and took the cross product AB x BC, which gave me the normal vector [bc, ac, ab] (expanded after the equation below). If these are my A, B, and C for the scalar equation of a plane, then:
bcx + acy + abz + D = 0
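
For reference, here is the determinant expansion behind that normal vector (just the standard cross-product formula applied to the two vectors above, nothing extra assumed):

\[
\vec{AB} \times \vec{BC} =
\begin{vmatrix}
\mathbf{i} & \mathbf{j} & \mathbf{k} \\
-a & b & 0 \\
0 & -b & c
\end{vmatrix}
= (bc - 0)\,\mathbf{i} - (-ac - 0)\,\mathbf{j} + (ab - 0)\,\mathbf{k}
= [bc,\ ac,\ ab]
\]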
I solved for D by substituting point A into the equation: bc(a) + ac(0) + ab(0) + D = 0, so D = -abc.
Therefore, the scalar equation for the plane is
bcx + acy + abz - abc = 0
Now I am completely stuck on how to use that scalar equation to prove the equation above.
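
In case it helps, here is a sketch of the remaining step, assuming the standard point-to-plane distance formula d = |Ax0 + By0 + Cz0 + D| / sqrt(A^2 + B^2 + C^2), evaluated at the origin (0, 0, 0):

\[
d = \frac{|bc(0) + ac(0) + ab(0) - abc|}{\sqrt{(bc)^2 + (ac)^2 + (ab)^2}}
  = \frac{|abc|}{\sqrt{b^2c^2 + a^2c^2 + a^2b^2}}
\]

Squaring both sides and taking reciprocals then splits into exactly the three terms we want:

\[
\frac{1}{d^2} = \frac{b^2c^2 + a^2c^2 + a^2b^2}{a^2b^2c^2}
             = \frac{1}{a^2} + \frac{1}{b^2} + \frac{1}{c^2}
\]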