Closest approach of two skew lines in R3

E'lir Kramer
Hello all, and thanks again for all the help I've been getting with this book. This is a two-part problem from Advanced Calculus of Several Variables by C. H. Edwards Jr. I have the first part, and the second part should be easy, but I find I'm stumped.

Since the second part builds on the solution of the first, I'll give both problem statements and my proof of the first part.

II.1.2a: Let f : \Re \to \Re^{n} and g : \Re \to \Re^{n} be two differentiable curves with f'(t) \neq 0 and g'(t) \neq 0 for all t \in \Re. Suppose the two points p = f(s_{0}) and q = g(t_{0}) are closer together than any other pair of points on the two curves. Prove that the vector p - q is orthogonal to both velocity vectors f'(s_{0}) and g'(t_{0}). Hint: the point (s_{0}, t_{0}) must be a critical point of the function \rho : \Re^{2} \to \Re defined by \rho(s, t) = \left\| f(s) - g(t) \right\|^{2}.

From an earlier problem I have proved three lemmas to my satisfaction:

Lemma 1: <a,a> - <b,b> = <a+b, a-b>

Lemma 2: If n(t) = <c, f(t)> for some constant c, then n'(t) = <c, f'(t)>.

Lemma 3: If n(t) = <f(t), f(t)> = \left\| f(t) \right\|^{2}, then n'(t) = 2<f'(t), f(t)>.
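
For the record, Lemma 1 is just bilinearity plus symmetry of the inner product, written out as a one-line check:

<a+b, a-b> = <a,a> - <a,b> + <b,a> - <b,b> = <a,a> - <b,b>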

Stated formally, we must show that <p - q, f'(s_{0})> = 0 and <p - q, g'(t_{0})> = 0.

This is just a matter of algebra together with the observation that both partial derivatives of \rho vanish at (s_{0}, t_{0}), since that point minimizes \rho. The hint gives this away, but it wasn't really needed.

\rho(s, t) = <f(s), f(s)> + <g(t), g(t)> - 2<f(s), g(t)>

Applying Lemmas 2 and 3, and differentiating first with respect to s:

\frac{\partial\rho}{\partial s}(s, t) = 2<f'(s), f(s)> - 2<f'(s), g(t)> = 2<f(s) - g(t), f'(s)>

\frac{\partial\rho}{\partial t}(s, t) = 2<g'(t), g(t)> - 2<f(s), g'(t)> = -2<f(s) - g(t), g'(t)>

When we plug (s_{0}, t_{0}) into these expressions, both are zero because (s_{0}, t_{0}) is a critical point. At the same time f(s) - g(t), which appears in both, becomes f(s_{0}) - g(t_{0}) = p - q, so the equations reduce to 0 = <p - q, f'(s_{0})> and 0 = <p - q, g'(t_{0})>. QED.
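
As a quick sanity check on part (a), and not part of the book's argument, I can minimize \rho numerically for a pair of curves I made up myself and confirm the orthogonality:

Code:
# Sanity check of part (a): minimize rho(s, t) = ||f(s) - g(t)||^2 numerically
# for two sample curves (my own choice) and verify that p - q is orthogonal
# to both velocity vectors at the minimizer.
import numpy as np
from scipy.optimize import minimize

f  = lambda s: np.array([np.cos(s), np.sin(s), 0.0])        # circle in the xy-plane
df = lambda s: np.array([-np.sin(s), np.cos(s), 0.0])
g  = lambda t: np.array([3.0 + np.cos(t), 0.0, np.sin(t)])  # circle centered at (3, 0, 0) in the xz-plane
dg = lambda t: np.array([-np.sin(t), 0.0, np.cos(t)])

rho = lambda x: np.sum((f(x[0]) - g(x[1]))**2)
res = minimize(rho, x0=[0.1, 3.0])
s0, t0 = res.x
p, q = f(s0), g(t0)

print(np.dot(p - q, df(s0)))   # should be ~ 0
print(np.dot(p - q, dg(t0)))   # should be ~ 0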

Now the part I can't get is II.1.2b: Apply the result of (a) to find the closest pair of points on the "skew" straight lines in \Re^{3} defined by f(s) = (s, 2s, -s) and g(t) = (t+1, t-2, 2t+3).

So far I just have some aimless restating of facts that I know. All of them are available as results of part (a) above, except for two:

f'(s) = (1, 2, -1) and g'(t) = (1, 1, 2) for all s and t.

Somehow I've got to use these equations in reverse and solve for s_{0} and t_{0}. With that observation I have two equations in two unknowns:

<f(s_{0}) - g(t_{0}), (1, 2, -1)> = 0 = <f(s_{0}) - g(t_{0}), (1, 1, 2)>. But now I am truly stuck. I need to get these variables out of the functions, but g(t) is not linear, so I can't easily construct an inverse function for it.
 
Can't you just substitute for f(s_{0}) and g(t_{0}) in <f(s_{0}) - g(t_{0}), (1, 2, -1)> etc. and expand the inner products? That will give you two equations in two unknowns.
 
Well, the thing is, he hasn't defined the inner product in the problem statement. In this book so far, < , > has been a generalization of what he calls the "usual inner product", which I understand is what most people just call the inner product, defined as x \bullet y = x_{1}y_{1} + x_{2}y_{2} + ... + x_{n}y_{n}. In fact he's mentioned no inner product in the problem statement at all; the only reason I felt justified in assuming the existence of one is that the definition of orthogonality requires an inner product, and he's talking about orthogonality.

Would any inner product I chose to define give the same results in this problem?
 
It will depend on the inner product: whether two particular vectors are orthogonal depends on which inner product you use.
 
E'lir Kramer said:
Well, the thing is, he hasn't defined the inner product in the problem statement.
Not explicitly, perhaps, but the second part of the question is clearly to be interpreted as using the Euclidean metric (else it is not solvable).
 
Alright, thanks guys. I'll take it from here and see if I can finish it off. It should be easy now...
 
I'm stuck again. If I expand the dot product, I get these equations:

f_{1}(s_{1}) - g_{1}(t_{1}) + 2( f_{2}(s_{2}) - g_{2}(t_{2}) ) - ( f_{3}(s_{3}) - g_{3}(t_{3}) ) = 0
f_{1}(s_{1}) - g_{1}(t_{1}) + f_{2}(s_{2}) - g_{2}(t_{2}) + 2( f_{3}(s_{3}) - g_{3}(t_{3}) ) = 0

Which is six variables. I can undo f and g componentwise and have done so. But I simply don't have enough equations to solve this, do I?
 
Why six? It's just s0 and t0 isn't it?
And you can't write out the inner product expansion until after you have substituted for f and g. If f(s)=(s,2s,−s) and v=(1,2,−1) then <f(s), v> = s+4s+s = 6s.
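
If it helps to see that substitution carried all the way through, here is a small SymPy sketch (my own illustration, not from the book): it builds f(s) - g(t), takes its inner products with the two direction vectors, and solves the resulting pair of linear equations for s_{0} and t_{0}.

Code:
# Sketch of the suggested substitution: form f(s) - g(t), dot it with the two
# direction vectors, and solve the resulting linear system for s and t.
import sympy as sp

s, t = sp.symbols('s t', real=True)

f  = sp.Matrix([s, 2*s, -s])              # f(s) = (s, 2s, -s)
g  = sp.Matrix([t + 1, t - 2, 2*t + 3])   # g(t) = (t+1, t-2, 2t+3)
df = sp.Matrix([1, 2, -1])                # f'(s)
dg = sp.Matrix([1, 1, 2])                 # g'(t)

d = f - g
sol = sp.solve([d.dot(df), d.dot(dg)], [s, t])   # two linear equations in s, t
print(sol)                                       # the critical point (s0, t0)
print(f.subs(sol), g.subs(sol))                  # the closest pair of points p and q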
 