Can the Least Squares Method be expressed as a convolution?

SUMMARY

The discussion centers on expressing the Least Squares Method (LSM) as a convolution. The author reformulates the LSM from a summation into an integral, expands the square, and isolates the shift-independent term, arriving at $$f(x_c) = \int S(x)^2\, dx - \int 2S(x)\,g(x-x_c)\, dx + \int g(x-x_c)^2\, dx.$$ The middle term is a convolution of two functions, prompting the author to ask whether a kernel function exists. Unable to prove or disprove the kernel's existence, the author asks for help.

PREREQUISITES
  • Understanding of the Least Squares Method (LSM)
  • Familiarity with integral calculus and binomial expansion
  • Knowledge of convolution operations in mathematical analysis
  • Experience with kernel functions in statistical modeling
NEXT STEPS
  • Research the properties of convolution in functional analysis
  • Study kernel functions and their applications in statistics
  • Explore the relationship between LSM and convolution in signal processing
  • Investigate methods to prove the existence of kernel functions in mathematical contexts
USEFUL FOR

Mathematicians, statisticians, and data scientists interested in advanced statistical methods, particularly those exploring the intersection of convolution and the Least Squares Method.

Daniel Petka
Homework Statement
Consider laser line position estimation by fitting with the Least Squares Method (LSM), and prove (or disprove) that it can be treated as a convolution with some function, with the center found by locating the maximum (the zero crossing of the derivative). What is the smoothing function?

The Least Square Method (LSM) is defined as:
$$\sum_i [S(x_i) - F(x_i; a, b, \dots)]^2 = \min,$$
where the fitting function is:
$$F(x; y_0, A, x_c, w) = y_0 + A\cdot g(x - x_c, w)$$

The fit program will adjust all parameters, but we are interested only in ##x_c##.

Hint: change the sums to integrals in the LSM description!
Relevant Equations
fitting function: ##F(x; y_0, A, x_c, w) = y_0 + A\cdot g(x - x_c, w)##
convolution: ##f(x)=\int S(x-y)K(y)dy##
Least Squares Method: ##\sum_i [S(x_i) - F(x_i; a, b, \dots)]^2 = \min##

I started by converting the LSM from sum to integral form (in the continuum limit the sum becomes an integral up to a positive factor ##\Delta x##, which does not move the minimum):
$$f(x_c) = \sum_i [S(x_i) - F(x_i; a, b, \dots)]^2 \quad\longrightarrow\quad f(x_c) = \int \big(S(x) - F(x - x_c)\big)^2\, dx$$

Since we are not interested in the other parameters (like the offset), I assumed they are fitted correctly and therefore ignored them, turning ##F(x-x_c)## directly into ##g(x-x_c)## (made explicit below).
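To spell out that assumption, here is a minimal sketch of the reduction, taking the fitted offset as ##y_0 = 0## and the amplitude as ##A = 1## (my reading of the step; the specific values are not stated in the problem):
$$F(x - x_c) = y_0 + A \cdot g(x - x_c, w) \;\longrightarrow\; g(x - x_c) \quad \text{for } y_0 = 0,\ A = 1.$$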

Then I expanded the binomial as follows:
$$\int S(x)^2 - 2S(x)\,g(x-x_c) + g(x-x_c)^2\, dx$$

And used the linearity of the integral to isolate the part of the equation that doesn't depend on ##x_c##:
$$f(x_c) = \int S(x)^2\, dx - \int 2S(x)\,g(x-x_c)\, dx + \int g(x-x_c)^2\, dx$$
Hence, we have a constant ##q = \int S(x)^2\, dx## that isn't affected by the shift ##x_c##:

$$f(x_c) = q - \int 2S(x)\,g(x-x_c)\, dx + \int g(x-x_c)^2\, dx$$

The middle term is a convolution of the two functions. My idea was to disprove that a kernel exists, because there is a term that doesn't depend on ##x_c##, but on reflection this logic doesn't hold up. I am completely stuck at this point, since I can neither prove nor disprove that the kernel function exists. Any help would be highly appreciated!
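As a sanity check on the convolution claim, here is a minimal numerical sketch (my own construction, not part of the thread; the Gaussian profile, the center at 1.7, and the noise level are all assumptions) that evaluates the middle term ##\int 2S(x)\,g(x-x_c)\,dx## by brute force for every shift and compares it against a single call to np.convolve:

```python
import numpy as np

# Minimal sketch: check numerically that the middle term
#   c(x_c) = integral of 2 S(x) g(x - x_c) dx
# can be evaluated for all shifts at once as a convolution.
# S, g, the center 1.7, and the noise level are assumptions.

x = np.linspace(-10, 10, 2001)  # odd point count so that x[1000] == 0
dx = x[1] - x[0]

def g(x, w=0.5):
    # assumed Gaussian line profile; the problem leaves g generic
    return np.exp(-x**2 / (2 * w**2))

rng = np.random.default_rng(0)
S = g(x - 1.7) + 0.05 * rng.standard_normal(x.size)  # noisy line at x_c = 1.7

# Brute force: evaluate the cross term at every candidate center x_c
cross_brute = np.array([2.0 * np.sum(S * g(x - xc)) * dx for xc in x])

# One convolution: g is even, so correlating S with g equals convolving
# S with the flipped kernel K(y) = g(-y) = g(y)
cross_conv = 2.0 * np.convolve(S, g(x), mode="same") * dx

print(np.max(np.abs(cross_brute - cross_conv)))  # floating-point noise only
```

The agreement rests on ##g## being even: correlation with an even function equals convolution with it, so in the convolution form ##f(x) = \int S(x-y)K(y)\,dy## the kernel can be read off as ##K(y) = g(-y)##.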
 
