Minimizing Distance Between Two Lines

In summary, the problem is to find the multipliers x1 and x2 that minimize ||x1a1 - x2a2 - b||, where L1 is the line through the origin in the direction of a1 and L2 is the line through b in the direction of a2. The squared distance can be minimized by taking its derivatives with respect to x1 and x2 and setting them equal to zero. There is also a mention of finding the projection of x2a2 + b onto a1, but it is later clarified that this is not necessary.
  • #1
bodensee9
Can someone help with the following?
Suppose L1 is the line through the origin in the direction of a1 and L2 is the line through b in the direction of a2. I am supposed to find the closest points x1a1 and b+x2a2 on the two lines.

So I am trying to find the equations that would minimize ||x1a1 - x2a2 - b||.

Not really sure what equations to write. I know that I'm trying to find some vector c so that (c - x1a1 + x2a2 + b)^2 will be the minimum. This means that if I take the derivative of the above, the derivative will be zero. So, if I break c down into its components, would I get

2(c1 - x1a1) + 2(c2 + x2a2 + b) = 0? Or would I be trying to find the projection of x2a2 + b onto a1? And if I do that, would the projection onto a1 be given by the matrix a1(a1^T a1)^-1 a1^T? But what about for a2? Not sure if that's right either. Thanks.
 
  • #2
Figured it out. I misunderstood the problem at first. Thanks.
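For reference, the minimization in the original post reduces to a 2x2 linear system: setting the partial derivatives of ||x1a1 - x2a2 - b||^2 with respect to x1 and x2 to zero gives the normal equations. A minimal pure-Python sketch (the function name and example vectors are illustrative, not from the thread):

```python
def closest_points(a1, a2, b):
    """Closest points on L1 = {x1*a1} and L2 = {b + x2*a2}.

    Minimizes ||x1*a1 - x2*a2 - b||^2 by solving the 2x2 normal equations:
        (a1.a1) x1 - (a1.a2) x2 =  a1.b
       -(a1.a2) x1 + (a2.a2) x2 = -a2.b
    (Illustrative sketch; assumes the lines are not parallel.)
    """
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    m11, m12, m22 = dot(a1, a1), -dot(a1, a2), dot(a2, a2)
    r1, r2 = dot(a1, b), -dot(a2, b)
    det = m11 * m22 - m12 * m12  # zero only for parallel lines
    x1 = (r1 * m22 - m12 * r2) / det  # Cramer's rule on the 2x2 system
    x2 = (m11 * r2 - m12 * r1) / det
    p1 = tuple(x1 * c for c in a1)
    p2 = tuple(bc + x2 * c for bc, c in zip(b, a2))
    return p1, p2

p1, p2 = closest_points((1, 0, 0), (0, 1, 0), (0, 0, 1))
# p1 is the origin and p2 is b itself: the x-axis and the line
# through (0,0,1) along the y-axis are closest at those points.
```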
 

1. What is the Least Squares Problem?

The Least Squares Problem is a mathematical technique used to find the best-fit line or curve for a set of data points. It involves minimizing the sum of the squared differences between the actual data points and the predicted values from the line or curve.
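As a concrete illustration (not part of the original answer), fitting a straight line y ≈ mx + c to data points by minimizing the sum of squared residuals has a closed-form solution; a minimal pure-Python sketch with made-up data:

```python
def fit_line(xs, ys):
    """Least-squares line fit: minimize sum((m*x_i + c - y_i)^2).

    Uses the standard closed-form solution of the normal equations
    for slope m and intercept c. (Illustrative sketch.)
    """
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    c = (sy - m * sx) / n
    return m, c

# Points lying exactly on y = 2x + 1, so the fit recovers m = 2, c = 1.
m, c = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```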

2. When is the Least Squares Method used?

The Least Squares Method is commonly used in various fields such as statistics, econometrics, and data analysis. It is used when there is a need to find the best-fit line or curve for a set of data points, and when there is uncertainty or variability in the data.

3. How is the Least Squares Problem solved?

To solve a linear Least Squares Problem, the parameter values that minimize the sum of the squared differences can be computed in closed form by solving the normal equations, which come from setting the derivatives of that sum to zero. For very large or non-linear problems, an iterative optimization algorithm such as gradient descent is used instead: it repeatedly adjusts the parameters in the direction that decreases the error until it reaches the minimum.
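To make the iterative picture concrete, here is a hypothetical gradient-descent sketch for a straight-line fit y ≈ mx + c (the learning rate and step count are chosen for this small example; the closed-form normal equations give the same answer):

```python
def grad_descent_fit(xs, ys, lr=0.02, steps=5000):
    """Minimize the mean of (m*x_i + c - y_i)^2 by gradient descent.

    Illustrative only: linear least squares also has a closed-form
    solution, but this shows the iterative adjustment process.
    """
    m = c = 0.0
    n = len(xs)
    for _ in range(steps):
        # Partial derivatives of the mean squared error w.r.t. m and c.
        gm = sum(2 * (m * x + c - y) * x for x, y in zip(xs, ys)) / n
        gc = sum(2 * (m * x + c - y) for x, y in zip(xs, ys)) / n
        m -= lr * gm  # step downhill in each parameter
        c -= lr * gc
    return m, c

# Points on y = 2x + 1; the iterates converge toward m = 2, c = 1.
m, c = grad_descent_fit([0, 1, 2, 3], [1, 3, 5, 7])
```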

4. What is the difference between linear and non-linear least squares?

In linear least squares, the best-fit line or curve is a linear function of the parameters, while in non-linear least squares, the best-fit curve is a non-linear function of the parameters. This means that the optimization algorithm used to solve the problem will be different for linear and non-linear least squares.
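As an assumed example of the non-linear case (not from the text): fitting the model y = exp(b*x) is non-linear in the parameter b, so it is solved by iteration rather than in one step. A minimal Gauss-Newton sketch in pure Python:

```python
import math

def fit_exp_rate(xs, ys, b=0.0, iters=25):
    """Fit y ≈ exp(b*x) by Gauss-Newton iteration.

    Linearizes the residuals r_i = exp(b*x_i) - y_i around the current
    b and solves the resulting one-parameter least-squares update.
    (Illustrative sketch; assumes the data roughly follow the model.)
    """
    for _ in range(iters):
        r = [math.exp(b * x) - y for x, y in zip(xs, ys)]  # residuals
        j = [x * math.exp(b * x) for x in xs]              # dr_i/db
        b -= sum(ji * ri for ji, ri in zip(j, r)) / sum(ji * ji for ji in j)
    return b

# Data generated exactly from b = 0.5, so the iteration recovers it.
xs = [0, 1, 2, 3]
ys = [math.exp(0.5 * x) for x in xs]
b = fit_exp_rate(xs, ys)
```

Unlike the linear case, the result here depends on the starting value and the iteration can fail to converge for a poor initial guess, which is the practical difference the answer above describes.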

5. What are the applications of the Least Squares Problem?

The Least Squares Problem has many applications in various fields such as finance, engineering, and science. It is used for data fitting, regression analysis, time series forecasting, and parameter estimation, among others. It is also the basis for many statistical models and machine learning algorithms.
