Least Squares Fit of Multiple Measurements with Shared Parameters

AI Thread Summary
The discussion focuses on performing a least squares fit of multiple datasets using nonlinear functions with shared parameters in Mathematica. The user seeks an algorithm, such as Gauss-Newton, to fit several functions in parallel while maintaining a common parameter across them. Suggestions include minimizing the sum of the squared differences of all functions, but challenges arise in generating the Jacobian matrix for this approach. Alternatives like the downhill simplex method are mentioned as they do not require Jacobian calculations. The user expresses gratitude for the insights and feels equipped to proceed with their fitting problem.
Sebastian.de
Dear All,

I would like to do a least-squares fit of a number of measurements using several nonlinear functions with shared parameters (similar to the advanced fitting in Origin) using Mathematica.
I am therefore interested in an algorithm, like Gauss-Newton, that fits several data sets to several functions (with shared parameters) "in parallel".
For example, I would like to fit the set of functions f1 = a*exp(c1/x), ..., fn = a*exp(cn/x) to a number of given data sets y1(x1,...,xm), ..., yn(x1,...,xm). As indicated, "a" should have the same value in all of f1...fn, while c1...cn differ.
Unfortunately, I cannot find a suitable algorithm on the Internet, and I have a hard time adapting the algorithms that fit one function to one set of data.
I would appreciate any help, links or references that cover this topic.

Thank you very much for your help!


Sebastian
 


I may have misunderstood your question, but the simplest thing I can think of is to treat the sum of the \chi^2's of all your functions as a single number and minimise that over all the parameters of all your functions. This then reduces to a regular multi-dimensional minimisation.

This assumes that you are weighting all your functions equally in the fit, but you could attach weights to the different \chi^2's if that is relevant to your problem.
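The combined objective described above can be sketched in a few lines. This is an illustrative Python sketch, not the thread's Mathematica code; the function name `chi2_total` and the data layout are assumptions for the example, using the model f_i(x) = a*exp(c_i/x) from the original question:

```python
import numpy as np

def chi2_total(params, datasets):
    """Sum of the chi^2's of all data sets as ONE scalar objective.

    params   = [a, c1, ..., cn]  (a is shared, one c per data set)
    datasets = list of (x, y) array pairs, one per data set
    """
    a, cs = params[0], params[1:]
    total = 0.0
    for (x, y), c in zip(datasets, cs):
        r = a * np.exp(c / x) - y     # residuals of this data set
        total += np.sum(r ** 2)       # per-set weights could be applied here
    return total
```

Any general-purpose multi-dimensional minimiser can then be pointed at `chi2_total` directly.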
 


Thank you very much for your reply. You didn't misunderstand my question; the approach you suggested seems perfectly fine to me. However, when I tried it I got stuck setting up the Jacobian matrix.
I hardly have any experience with matrix calculations, so I am not sure whether I ran into a real problem or am just failing to see the obvious.
The algorithms I found for the least-squares problem generally minimize sum((f(x, parameters) - y)^2) (where f is the fit function and y are the data points) by building the Jacobian matrix of (f(x, parameters) - y) with respect to the parameters. The square is then introduced by multiplying the Jacobian matrix with its transpose.
If I understand your suggestion correctly, I would end up minimizing a problem of the form sum over i of sum((fi(xi, parameters) - yi)^2). With multiple sums of squares instead of just one, I have not managed to construct anything analogous to the Jacobian matrix of the simpler problem, so I could not use any of the algorithms I found.
I would be very grateful if you could point me towards literature that treats this problem in some detail. I am mildly optimistic that I could write a script solving the problem with the help of a book that illustrates an applicable algorithm on an example.
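For concreteness: the standard way around this difficulty is to stack the residual vectors of all data sets into one long vector, so there is still only one Jacobian. It has one column for the shared parameter a (filled in every row) and one column per c_i that is nonzero only in that data set's rows. The following is an illustrative Python sketch of Gauss-Newton for the model f_i(x) = a*exp(c_i/x), not code from the thread; the function name and synthetic data are assumptions:

```python
import numpy as np

def gauss_newton_shared(datasets, a0, cs0, n_iter=50):
    """Gauss-Newton fit of f_i(x) = a * exp(c_i / x) with a shared 'a'.

    datasets = list of (x, y) array pairs; a0, cs0 = initial guesses.
    """
    a, cs = float(a0), np.asarray(cs0, dtype=float)
    for _ in range(n_iter):
        res_blocks, jac_blocks = [], []
        for i, (x, y) in enumerate(datasets):
            e = np.exp(cs[i] / x)
            res_blocks.append(a * e - y)          # residuals of data set i
            J = np.zeros((len(x), 1 + len(cs)))
            J[:, 0] = e                           # d f_i / d a
            J[:, 1 + i] = a * e / x               # d f_i / d c_i (0 for j != i)
            jac_blocks.append(J)
        r = np.concatenate(res_blocks)            # one long residual vector
        J = np.vstack(jac_blocks)                 # one block-structured Jacobian
        step = np.linalg.lstsq(J, -r, rcond=None)[0]  # Gauss-Newton step
        a += step[0]
        cs += step[1:]
    return a, cs
```

The normal-equations form (J^T J) delta = -J^T r is the "multiply the Jacobian by its transpose" step described above; `lstsq` solves the same linear subproblem more stably.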
Thanks again for your help!


Sebastian
 


There are multi-dimensional minimisation algorithms that do not require you to calculate the Jacobian. The one that springs to mind is the downhill simplex method, but there are probably others. I've used simplex a fair bit and it usually works well. The GNU Scientific Library has a good implementation of this algorithm: http://www.gnu.org/software/gsl/manual/html_node/Multidimensional-Minimization.html
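As an illustration of the derivative-free route (a sketch, not from the thread, using SciPy's Nelder-Mead simplex rather than the GSL implementation linked above; the data and starting values are made up):

```python
import numpy as np
from scipy.optimize import minimize

def chi2_total(params, datasets):
    """Combined chi^2 for f_i(x) = a * exp(c_i / x), params = [a, c1, ..., cn]."""
    a, cs = params[0], params[1:]
    return sum(np.sum((a * np.exp(c / x) - y) ** 2)
               for (x, y), c in zip(datasets, cs))

# Synthetic noiseless data: a = 2.0, c1 = 0.5, c2 = -0.3
x = np.linspace(1.0, 5.0, 40)
datasets = [(x, 2.0 * np.exp(0.5 / x)), (x, 2.0 * np.exp(-0.3 / x))]

# Downhill simplex needs no Jacobian, only function evaluations.
fit = minimize(chi2_total, x0=[1.5, 0.4, -0.2], args=(datasets,),
               method="Nelder-Mead",
               options={"xatol": 1e-10, "fatol": 1e-14, "maxiter": 10000})
```

Simplex is slower to converge than Gauss-Newton but entirely sidesteps the Jacobian bookkeeping for shared parameters.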
 


I happen to be working on an almost identical problem, although in my case the time constants are shared and the amplitudes are not. I took the data sets, time-shifted and concatenated them so that they became one long data set. I then created an analytic fit function consisting of a sum of exponentials, each multiplied by two UnitStep functions which "turn it on" during the appropriate time period and turn it off for all other times, so each exponential is zero except when its own data is active. Finally, I used the FindFit function to fit the single data set to the single fit function. It would be nice if Mathematica had a more straightforward way to do this; a quick Google search showed me that other math packages, like SigmaPlot, handle this sort of problem directly. I have attached sample code if you are interested.
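The same concatenation trick can be sketched outside Mathematica. This is an illustrative Python version (the function name, two-segment layout, and parameter values are assumptions for the example): boolean indicators play the role of the two UnitStep functions, switching each exponential on only over its own segment of the concatenated time axis.

```python
import numpy as np

def concat_model(t, a1, a2, tau, t_split):
    """Two decays with a SHARED time constant tau and separate amplitudes.

    The two original data sets are assumed concatenated on one time axis,
    with the second set starting at t = t_split.
    """
    seg1 = t < t_split          # indicator: rows of the first data set
    seg2 = ~seg1                # indicator: rows of the second (shifted) set
    return (a1 * np.exp(-t / tau) * seg1
            + a2 * np.exp(-(t - t_split) / tau) * seg2)
```

A single call to an ordinary one-function fitter (FindFit in the Mathematica version) over the concatenated data then fits a1, a2, and the shared tau simultaneously.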
 


Thank you very much to both of you. I think I have all the information I need now to fit the data.
Thanks again!


Sebastian
 