Solving coupled linear differential equations

  • Thread starter: c0dy
  • Tags: Coupled, Linear

c0dy

Homework Statement


http://i.imgur.com/zmtJ64Z.jpg

Homework Equations


B is a column vector {{0},{0},{1}}
E is a column vector {{1},{0},{0}}
v(t) is also a column vector {{v_1},{v_2},{v_3}}

The Attempt at a Solution


I have calculated v(t) × B to get (v_2, -v_1, 0) and I made a matrix,

| 0 1 0|
|-1 0 0| = A
| 0 0 0|

and now I have dv/dt = A*v + E

Am I going about this correctly? Do I integrate with respect to t? I'm not exactly sure how to solve a problem like this. The assignment is to do this in Mathematica, but I'd like to understand how to solve the problem first.
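
For reference, here is how that setup looks in Mathematica (a sketch with illustrative names; E is a reserved symbol in Mathematica, so the field vector is written e0 here):

B = {0, 0, 1};      (* magnetic field direction *)
e0 = {1, 0, 0};     (* electric field direction, E in the problem *)
v = {v1, v2, v3};   (* velocity components *)
Cross[v, B]         (* gives {v2, -v1, 0}, as above *)

A = {{0, 1, 0}, {-1, 0, 0}, {0, 0, 0}};
A . v + e0          (* gives {1 + v2, -v1, 0}, the right-hand side of dv/dt *)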
 
I'm thinking, dv/dt = (v_2, -v_1, 0) + (1, 0, 0) = (v_2 + 1, -v_1, 0)

Then integrate?

v(t) = (t*(v_2 + 1) + c1, -t*v_1 + c2, 0)

=> v(0) = (c1, c2, 0) = (1, -1, 0) => c1 = 1, c2 = -1
 
It might be easier to examine the system by writing down each component equation explicitly. Denoting the components of v as x, y and z:

x' = y + 1
y' = -x
z' = 0

You cannot just integrate the right-hand sides, because they contain the unknown functions; the exception is the third equation, which you can solve right away. The first two, however, depend on each other. That's why the system is called "coupled".

What you can do, however, is differentiate the first equation and get x'' = y', and substitute y' from the second equation, getting x'' = -x.
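
Carrying that through, as a sketch (using the initial condition v(0) = (1, -1, 0) from the attempt above): ##x'' = -x## gives ##x(t) = C_1\cos t + C_2\sin t##, and then ##y = x' - 1 = -C_1\sin t + C_2\cos t - 1##. The conditions x(0) = 1 and y(0) = -1 force ##C_1 = 1,\; C_2 = 0##, so ##x(t) = \cos t##, ##y(t) = -\sin t - 1##, and (since z(0) = 0) ##z(t) = 0##.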
 
Aha, that makes sense. Thanks a lot.
 
c0dy said:

Am I going about this correctly? Do I integrate with respect to t? I'm not exactly sure how to solve a problem like this. The assignment is to do this in Mathematica, but I'd like to understand how to solve the problem first.

There are a couple of ways to solve such problems.
(1) Assume a form of solution, and find the parameters that "work".
(2) Use a matrix exponential.

For your system, the z-component is separate and easy, while (setting the constant forcing from E aside for the moment) the x and y components give x' = y, y' = -x.

Method (1): Assuming x = a*exp(r*t) and y = b*exp(r*t) [same r in both!] you have:
$$x' = r a e^{rt} = y = b e^{rt}\\
y' = r b e^{rt} = -x = -a e^{rt}\\
\text{so}\\
ra = b,\; rb = -a\; \Longrightarrow\; r^2 a = -a.$$
If a = 0 then also b = 0, and we get only the zero solution. If a ≠ 0 then ##r^2 = -1##, so ##r = \pm i##. That means we have solutions involving ##\exp(\pm i t)##, or--in real terms--##\cos(t),\; \sin(t).##
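
As a quick cross-check (a sketch, not needed for the derivation), the same ##\pm i## show up as the eigenvalues of the upper-left 2x2 block of A; in Mathematica:

Eigenvalues[{{0, 1}, {-1, 0}}]   (* the eigenvalues are I and -I *)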

Method (2): The solution of X' = AX (A = constant) is X(t) = exp(A*t)*C, where C = X(0). If A is a (square) matrix, you need a way to compute the matrix exponential. This can be done using an eigenvalue/eigenvector expansion of the matrix A. In a computer algebra system such as Maple or Mathematica, it can be done at the push of a button. For example, in Maple we have:
with(LinearAlgebra):   # needed for MatrixExponential
A := Matrix(3, 3, [[0, 1, 0], [-1, 0, 0], [0, 0, 0]]):
Et := MatrixExponential(A, t);   # compute the exponential of A*t
The answer, in LaTeX form, is:
$$Et = \pmatrix{\cos(t) & \sin(t) & 0\\ -\sin(t) & \cos(t) & 0\\ 0 & 0 & 1}$$
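
The Mathematica counterpart is just as short (a sketch; MatrixExp[A t] computes ##e^{At}##):

A = {{0, 1, 0}, {-1, 0, 0}, {0, 0, 0}};
MatrixExp[A t] // MatrixForm   (* gives the same rotation-type matrix as the Maple output above *)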
 
Ray Vickson said:
There are a couple of ways to solve such problems.
(1) Assume a form of solution, and find the parameters that "work".
(2) Use a matrix exponential.

I think method 2 is how he wanted us to solve it. We were also using a similar method for solving partial differential equations. Is there a decent textbook which focuses on solving ODEs or PDEs this way?
 
For example, Arnold's book on ODEs. I am fairly sure that any book on ODEs printed within the last 30 years or so should have a section on matrix methods.
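
To connect this back to the Mathematica assignment, a minimal DSolve sketch of the full system (componentwise, and assuming the initial condition v(0) = (1, -1, 0) from the earlier attempt) would be:

(* x' = y + 1, y' = -x, z' = 0 with x(0) = 1, y(0) = -1, z(0) = 0 *)
sol = DSolve[{x'[t] == y[t] + 1, y'[t] == -x[t], z'[t] == 0,
    x[0] == 1, y[0] == -1, z[0] == 0}, {x, y, z}, t]
(* gives x(t) = Cos[t], y(t) = -1 - Sin[t], z(t) = 0, consistent with x'' = -x above *)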
 