Solving Coupled Differential Equations in MATLAB

AI Thread Summary
The discussion centers on solving a system of coupled differential equations defined by dy/dt = -a*y(t) + b*x(t) + (m-y(t))*r and dx/dt = a*y(t) - b*x(t) + (n-x(t))*r, where m, n, and r are constants. The user seeks to determine the coefficients 'a' and 'b' using known values of y(t) and x(t) over time, despite lacking direct values for dy/dt and dx/dt. The proposed solution involves analytically solving the linear ordinary differential equations (ODEs) to express (x,y)(t) and then minimizing the square error between the calculated and observed values of x(t) and y(t) using a suitable optimization method, preferably in MATLAB.
ksmanis
Hi,

I have the following problem to solve. My system can be defined as a set of coupled differential equations as described below:

dy/dt = -a*y(t) + b*x(t) + (m-y(t))*r;
dx/dt = a*y(t) - b*x(t) + (n-x(t))*r;

where m, n, and r are constants. I have the values of y(t) and x(t) at different values of t. The expressions for dy/dt and dx/dt above come from a theoretical model, and I do not have their values directly. Given the measured x(t) and y(t), I would like to find the coefficients 'a' and 'b' that best fit the above system.

I'd appreciate your help in solving this problem in any package (preferably MATLAB).

Kind Regards
Subbu
The ODE system is linear with constant coefficients, so it can be solved analytically to obtain (x,y)(t) explicitly. One could then minimize the square error \sum_i \|(x,y)(t_i) - (x_i,y_i)\|^2 over a and b in the usual way.
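To make the "solve analytically" step concrete (my own working, not from the thread): writing z = (x, y)^T, the system is linear with a constant forcing term, and its coefficient matrix turns out to have the simple eigenvalues -r and -(a+b+r):

```latex
\frac{dz}{dt} = A z + c, \qquad
A = \begin{pmatrix} -(b+r) & a \\ b & -(a+r) \end{pmatrix}, \qquad
c = \begin{pmatrix} n r \\ m r \end{pmatrix},
\qquad z = \begin{pmatrix} x \\ y \end{pmatrix}.
```

The characteristic polynomial is \lambda^2 + (a+b+2r)\lambda + r(a+b+r), whose roots are \lambda_1 = -r and \lambda_2 = -(a+b+r). With the equilibrium z^* = -A^{-1}c, the general solution is

```latex
z(t) = z^* + c_1 v_1 e^{-r t} + c_2 v_2 e^{-(a+b+r) t},
```

where v_1, v_2 are the eigenvectors of A and c_1, c_2 are fixed by the initial condition. Since r is known, the data effectively determine the single decay rate a+b+r plus the eigenvector directions, which is what separates a from b in the fit.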
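Since the original poster allows any package, here is a minimal numerical sketch of the square-error fit in Python/SciPy (integrate the ODEs for trial (a, b), then run nonlinear least squares against the observations). All constants, the initial condition, and the "observed" data below are synthetic placeholders, not values from the thread:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Assumed known constants and initial condition (placeholder values).
m, n, r = 1.0, 2.0, 0.5
z0 = [0.0, 1.0]                  # (x(0), y(0))
a_true, b_true = 0.8, 0.3        # used only to synthesize demo data

def rhs(t, z, a, b):
    # The coupled system from the thread, with z = (x, y).
    x, y = z
    dx = a * y - b * x + (n - x) * r
    dy = -a * y + b * x + (m - y) * r
    return [dx, dy]

# Synthetic "observations" generated with the true parameters.
t_obs = np.linspace(0.0, 5.0, 25)
sol = solve_ivp(rhs, (0.0, 5.0), z0, t_eval=t_obs,
                args=(a_true, b_true), rtol=1e-9, atol=1e-9)
x_obs, y_obs = sol.y

def residuals(p):
    # Stacked residuals in x and y for trial parameters p = (a, b).
    s = solve_ivp(rhs, (0.0, 5.0), z0, t_eval=t_obs,
                  args=tuple(p), rtol=1e-9, atol=1e-9)
    return np.concatenate([s.y[0] - x_obs, s.y[1] - y_obs])

# Nonlinear least squares, constraining a, b >= 0.
fit = least_squares(residuals, x0=[0.1, 0.1], bounds=(0.0, np.inf))
a_hat, b_hat = fit.x
print(a_hat, b_hat)   # should recover values close to a_true, b_true
```

In MATLAB the same approach maps onto `ode45` for the integration and `lsqnonlin` (or `fminsearch` on the summed square error) for the fit.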