MHB 243.12.5.26 Show That The Line And The Plane Are Not Parallel.

Summary: The discussion shows that a given line and plane are not parallel. The line is given in vector form by a point and a direction vector, and the plane by a linear equation. Since a line and a plane in space are either parallel or intersecting, exhibiting an intersection point proves they are not parallel. Substituting the line's parametric equations into the plane equation yields a value of t at which they meet, confirming they are not parallel; the intersection point follows by substituting that value of t back into the line's equations.
karush
$\textsf{Write a complete solution.}\\$
$\textit{Let}$ $$v =\langle 1, 3, -1 \rangle$$
$\textit{and }$ $$r_0 =\langle 1, 1, 1 \rangle$$
$\textit{and consider the line given by:}\\$ $$r = r_0+tv$$
$\textit{in vector form.}\\$
$\textit{Also, consider the plane given by}$
$$x+2y+2z+2 = 0$$
$\textit{(a) Show that the line and the plane are not parallel.}\\$
$\textit{(b) Find the point on the line at distance 3 from the plane.}\\$

ok just posting this now to come back later to finish it.
to start with...
\begin{align*}\displaystyle
r&= r_0+tv\\
&=\langle 1, 1, 1 \rangle + t\langle 1, 3, -1 \rangle\\
&=\langle t+1,\ 3t+1,\ -t+1 \rangle
\end{align*}
 
If a line and a plane are not parallel, then there must be a point where the line intersects the plane. (Note: two lines in three dimensions can intersect, be parallel, or be "skew", neither parallel nor intersecting. That cannot happen with two planes, or with a plane and a line: they must be either parallel or intersecting.)
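An equivalent check uses the plane's normal vector: the line is parallel to the plane only if its direction vector $v = \langle 1, 3, -1 \rangle$ is perpendicular to the normal $n = \langle 1, 2, 2 \rangle$ read off from $x + 2y + 2z + 2 = 0$. Here
$$v \cdot n = (1)(1) + (3)(2) + (-1)(2) = 1 + 6 - 2 = 5 \neq 0,$$
so the line is not parallel to the plane.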

Here the line is given by x = 1 + t, y = 1 + 3t, z = 1 - t, and the plane by x + 2y + 2z = -2. So, at a point of intersection, (1 + t) + 2(1 + 3t) + 2(1 - t) = 1 + t + 2 + 6t + 2 - 2t = 5t + 5 = -2, so 5t = -7 and t = -7/5. Since there is such a t, there is a point of intersection, and you can find that point by putting that value of t into the equations of the line.
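Carrying out that last substitution with $t = -\frac{7}{5}$:
$$x = 1 - \tfrac{7}{5} = -\tfrac{2}{5}, \qquad y = 1 - \tfrac{21}{5} = -\tfrac{16}{5}, \qquad z = 1 + \tfrac{7}{5} = \tfrac{12}{5},$$
and as a check, $-\tfrac{2}{5} + 2\left(-\tfrac{16}{5}\right) + 2\left(\tfrac{12}{5}\right) = \tfrac{-2 - 32 + 24}{5} = -2$ as required, so the intersection point is $\left(-\tfrac{2}{5},\, -\tfrac{16}{5},\, \tfrac{12}{5}\right)$.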
 
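For part (b), here is a sketch using the standard point-to-plane distance formula, $D = \frac{|x + 2y + 2z + 2|}{\sqrt{1^2 + 2^2 + 2^2}} = \frac{|x + 2y + 2z + 2|}{3}$. Substituting the parametric point $(1 + t,\ 1 + 3t,\ 1 - t)$ on the line gives
$$\frac{|(1+t) + 2(1+3t) + 2(1-t) + 2|}{3} = \frac{|5t + 7|}{3} = 3 \implies |5t + 7| = 9 \implies t = \tfrac{2}{5} \text{ or } t = -\tfrac{16}{5},$$
so the points $\left(\tfrac{7}{5},\, \tfrac{11}{5},\, \tfrac{3}{5}\right)$ and $\left(-\tfrac{11}{5},\, -\tfrac{43}{5},\, \tfrac{21}{5}\right)$ on the line are both at distance 3 from the plane.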
