# Solving coupled linear differential equations

1. Mar 4, 2013

### c0dy

1. The problem statement, all variables and given/known data
http://i.imgur.com/zmtJ64Z.jpg

2. Relevant equations
B is a column vector {{0},{0},{1}}
E is a column vector {{1},{0},{0}}
v(t) is also a column vector {{v_1},{v_2},{v_3}}

3. The attempt at a solution
I have calculated v(t) × B to get (v_2, -v_1, 0), and I wrote this as a matrix acting on v:

| 0 1 0|
|-1 0 0| = A
| 0 0 0|

and now I have dv/dt = A*v + E

Am I going about this correctly? Do I integrate with respect to t? I'm not exactly sure how to solve a problem like this. The assignment is to do this in mathematica but I'd like to understand how to solve the problem first.
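[As a numerical sanity check of the setup dv/dt = A·v + E, here is a sketch in Python/SciPy (not the assigned Mathematica solution). A and E are taken from the post above; the initial condition v(0) = (1, -1, 0) is assumed from the follow-up post. The exact solution for that initial condition is v(t) = (cos t, -sin t - 1, 0), so v returns to (1, -1, 0) at t = 2π.]

```python
# Sketch: integrate dv/dt = A v + E numerically and compare with the
# closed-form solution v(t) = (cos t, -sin t - 1, 0) for v(0) = (1, -1, 0).
import numpy as np
from scipy.integrate import solve_ivp

A = np.array([[0.0, 1.0, 0.0],
              [-1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])
E = np.array([1.0, 0.0, 0.0])

def rhs(t, v):
    return A @ v + E

sol = solve_ivp(rhs, (0.0, 2 * np.pi), [1.0, -1.0, 0.0],
                rtol=1e-10, atol=1e-10)
print(sol.y[:, -1])  # ≈ [1, -1, 0] at t = 2π
```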

2. Mar 4, 2013

### c0dy

I'm thinking, dv/dt = (v_2, -v_1, 0) + (1, 0, 0) = (v_2 + 1, -v_1, 0)

Then integrate?

v(t) = (t*(v_2 + 1) + c1, -t*v_1 + c2, 0)

=> v(0) = (c1, c2, 0) = (1, -1, 0) => c1 = 1, c2 = -1

Last edited: Mar 4, 2013
3. Mar 5, 2013

### voko

It might be easier to examine the system by writing down each component equation explicitly. Denoting the components of v as x, y and z:

x' = y + 1
y' = -x
z' = 0

You cannot just integrate the right hand side, because it contains unknown functions - except the third one, which you can solve right away. The first two, however, depend on each other. That's why the system is called "coupled".

What you can do, however, is differentiate the first equation and get x'' = y', and substitute y' from the second equation, getting x'' = -x.
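[Completing this reduction, a sketch with constants $C_1, C_2$ fixed by the initial conditions: the general solution of $x'' = -x$ is]

$$x(t) = C_1 \cos t + C_2 \sin t, \qquad y(t) = x'(t) - 1 = -C_1 \sin t + C_2 \cos t - 1,$$

[where $y = x' - 1$ comes from rearranging the first equation, and $z$ stays at its initial value since $z' = 0$.]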

4. Mar 5, 2013

### c0dy

Aha, that makes sense. Thanks a lot.

5. Mar 5, 2013

### Ray Vickson

There are a couple of ways to solve such problems.
(1) Assume a form of solution, and find the parameters that "work".
(2) Use a matrix exponential.

For your system, the z-component is separate and easy, while the homogeneous parts of the x and y equations (dropping the forcing term E for the moment) give x' = y, y' = -x.

Method(1): Assuming x = a*exp(r*t) and y = b*exp(r*t) [same r in both!] you have:
$$x' = r a e^{rt} = y = b e^{rt}\\ y' = r b e^{rt} = -x = -a e^{rt}\\ \text{so}\\ ra = b,\; rb = -a\; \Longrightarrow r^2 a = -a.$$
If a = 0 then also b = 0, and we get only the trivial solution. If a ≠ 0 then $r^2 = -1$, so $r = \pm i$. That means we have solutions involving $\exp(\pm i t)$, or, in real terms, $\cos(t)$ and $\sin(t)$.
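[The $r = \pm i$ result can be checked numerically: $r$ is an eigenvalue of the 2×2 coefficient matrix of the homogeneous x, y system. A sketch in NumPy, standing in for Mathematica:]

```python
# Sketch: the eigenvalues of the 2x2 block [[0, 1], [-1, 0]] from
# x' = y, y' = -x should be ±i, matching r^2 = -1 in Method (1).
import numpy as np

A2 = np.array([[0.0, 1.0],
               [-1.0, 0.0]])
eigvals = np.linalg.eigvals(A2)
print(eigvals)  # eigenvalues are ±i (up to rounding)
```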

Method (2): The solution of X' = AX (A constant) is X = exp(A*t)*C, where C = X(0) is a constant vector. If A is a (square) matrix, you need a way to compute the matrix exponential. This can be done using an eigenvalue/eigenvector expansion of the matrix A. In a computer algebra system such as Maple or Mathematica, this can be done at the push of a button. For example, in Maple we have:
A:=Matrix(3,3,[[0,1,0],[-1,0,0],[0,0,0]]):
Et:=MatrixExponential(A,t); <---- compute the exponential of A*t
The answer, in LaTeX form, is:
$$Et = \begin{pmatrix}\cos(t)&\sin(t)&0\\-\sin(t)&\cos(t)&0\\0&0&1\end{pmatrix}$$

6. Mar 5, 2013

### c0dy

I think method 2 is how he wanted us to solve it. We were also using a similar method for solving partial differential equations. Is there a decent textbook which focuses on solving ODEs or PDEs this way?

7. Mar 5, 2013

### voko

For example, Arnold's book on ODEs. I am fairly sure that any book on ODEs printed within the last 30 years or so should have a section on matrix methods.