# Eigenvectors of LC-circuit

1. Aug 19, 2010

### valjok

I read that a matrix with n different eigenvalues always has n eigenvectors. But I cannot find any.

Here is the state transition function, A:

$$\left[\begin{array}{cc}\dot{I}\\ \dot{U}\end{array}\right] = \left[\begin{array}{cc}0&-1/L\\ 1/C&0\end{array}\right] \left[\begin{array}{cc}{I}\\ {U}\end{array}\right]$$

The state variables, I and U, are the inductor current and the capacitor voltage, respectively. They oscillate. As you see, the matrix is anti-diagonal. The experts say that the "natural" state variables are better, which implies diagonalizing the matrix. The diagonal elements are then the eigenvalues, the roots of

$$\det(sE-A)= \left|\begin{array}{cc}s&1/L\\ -1/C&s\end{array}\right| = s^2 + \frac{1}{LC} = 0$$

They turn out to be λ1,2 = ±i/√(LC). The trouble starts when finding the corresponding eigenvectors. Let's take the first eigenvalue, i/√(LC):

$$\lambda x = Ax$$

which is

$$\frac i{\sqrt{LC}} \left[\begin{array}{cc}{x_1}\\ {x_2}\end{array}\right] = \left[\begin{array}{cc}0&-1/L\\ 1/C&0\end{array}\right] \left[\begin{array}{cc}{x_1}\\ {x_2}\end{array}\right]$$

which is equivalent to

$$\left\{\begin{array}{cc}i x_1/\sqrt{LC} = -x_2/L \\ i x_2 / \sqrt{LC} = x_1/C\end{array}$$.

Expressing $$x_2 = -i x_1 \sqrt{L/C}$$ from the first equation, which looks promising since the variables seem to be derivatives of each other, and substituting into the second, we get

$$x_1/C = i x_2 / \sqrt{LC} = i (-i x_1 \sqrt{L/C})/ \sqrt{LC} = x_1 / C$$

or

1 = 1, which is a tautology.

The same result can be achieved more easily, avoiding the eigenvalue computation, by

$$\lambda \left[\begin{array}{cc}{I}\\ {U}\end{array}\right] = \left[\begin{array}{cc}0&-1/L\\ 1/C&0\end{array}\right] \left[\begin{array}{cc}{I}\\ {U}\end{array}\right]$$

(I look for the eigenvectors)

$$\left\{\begin{array}{ll} \lambda I = -U/L \\ \lambda U = I/C = -U/(\lambda LC)\end{array}$$

which also results in complete cancellation of U:

λ² = -1/(LC).

What does it mean? A diagonal matrix implies that the state variables are independent: they accumulate only through self-feedback. I would like to see how that is possible in an oscillator, whose variables must ping-pong the energy periodically, with one determining the speed of change of the other.
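As a numerical sanity check of the derivation above, numpy reproduces the eigenvalues ±i/√(LC). This is only a sketch: the component values L = 2, C = 0.5 are an arbitrary choice of mine for illustration.

```python
import numpy as np

L, C = 2.0, 0.5  # arbitrary sample component values (an assumption)

# State transition matrix of the LC circuit
A = np.array([[0.0, -1.0 / L],
              [1.0 / C, 0.0]])

eigvals = np.linalg.eigvals(A)
w = 1.0 / np.sqrt(L * C)  # natural frequency 1/sqrt(LC)

# The eigenvalues are purely imaginary: +/- i/sqrt(LC)
assert np.allclose(sorted(eigvals, key=lambda z: z.imag), [-1j * w, 1j * w])
```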

2. Aug 19, 2010

### zomgineer

What you've done is to show that the eigenvector equation is self-consistent. What you need to do is find the nontrivial solutions of the homogeneous equation $$(A - \lambda I)x = 0$$

3. Aug 20, 2010

### JThompson

ix1/√(LC) = -x2/L and ix2/√(LC) = x1/C
correspond to the same eigenvalue, so they define the same set of eigenvectors, which is why you got a tautology. Complex eigenvalues and eigenvectors come in conjugate pairs: if a complex eigenvector is a solution, then its complex conjugate is also a solution (corresponding to the other eigenvalue). So since you have x2 = -ix1√(L/C), the other eigenvector is given by x2 = ix1√(L/C).

4. Aug 20, 2010

### valjok

zomgineer,

How is the λx = Ax that I did different from (A-λI)x = 0? The x cancels out in the same manner:

$$\left[\begin{array}{cc}\lambda&1/L\\ -1/C&\lambda\end{array}\right] \left[\begin{array}{cc}x_1\\x_2\end{array}\right] = 0$$

$$\left\{\begin{array}{lll}\lambda x_1 + x_2 / L = 0 & & (1)\\ -x_1/C + \lambda x_2 = 0 & & (2)\end{array}$$

You've seen what follows: x1 = Cλx2 (from the second equation) and, substituting into the first, Cλ²x2 = -x2/L, or λ² = -1/(LC). I can now find the eigenvalues again, as expected. If I use one of them in place of λ, the tautology, 1 = 1, results again. I'm going in loops. It is incredible how people manage to find both x and λ from a single equation!

Secondly, I do not understand the need to highlight homogeneity in the case of a zero-input system, which cannot have other solutions, IMO. Didn't you miss the point?

JT,

the tautology seems to arise not because of the complex conjugate but because one of the vector components is a free variable. In your equations, it is x1.

Indeed, [λI-A] must be singular, that is, its rank < 2, because its determinant is zero by definition. Thus, at least one component of the vector must be free! I have checked: both equations, (1) and (2), with λ1 = i/√(LC), give the same solution, x2 = -ix1√(L/C). The same is true for λ2. The tautology is due to this redundancy of the equations.

Notably, none of the received responses points this key fact out. Nor does anyone say that I made a mistake when I stated that it means exactly n eigenvectors. My book only says that the all-eigenvalues-different case makes this method of diagonalization possible. It also specifically points out that a whole line of vectors is possible for a given eigenvalue: if X is an eigenvector, then cX is too. The free component is chosen arbitrarily.
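The redundancy can be confirmed numerically: for an eigenvalue, λI - A is singular with rank 1, so one component of the vector is indeed free. A sketch, again with my arbitrary sample values L = 2, C = 0.5:

```python
import numpy as np

L, C = 2.0, 0.5  # arbitrary sample values (an assumption)
A = np.array([[0.0, -1.0 / L],
              [1.0 / C, 0.0]])
lam = 1j / np.sqrt(L * C)  # first eigenvalue

M = lam * np.eye(2) - A

# det(M) = 0 by construction of the eigenvalue, so M has rank 1:
# the two scalar equations are redundant and one component stays free
assert abs(np.linalg.det(M)) < 1e-12
assert np.linalg.matrix_rank(M) == 1
```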

So, the eigenvectors are

$$x_1\left[\begin{array}{c}1 \\ -i\sqrt{L/C} \end{array}\right]$$
and
$$x_1\left[\begin{array}{c}1 \\ i\sqrt{L/C} \end{array}\right],$$
where it is reasonable to fix x1 = 1.

5. Aug 20, 2010

### valjok

Another curious point is that the resulting diagonalized system looks more complex than the anti-diagonal one. Let me first produce the forced LC circuit. I add the input Uin through the input matrix B to the state equation, x' = Ax + BUin:

$$\left[\begin{array}{c}\dot{i}\\ \dot{u} \end{array}\right] = \left[\begin{array}{cc}0&-1/L\\ 1/C&0\end{array}\right] \left[\begin{array}{c}{i}\\ {u}\end{array}\right] + \left[\begin{array}{c}1/L\\ 0\end{array}\right] u_{in}$$

and an output function, Y = Cx, to the system: Y = [0 1] X. That is, the state variable u is the output. This corresponds to the schematic (the input Uin was zero in my original question).

The voltage Uin is applied to the LC pair and the capacitor voltage is measured. Here is the graph

Now, collecting the eigenvectors into a matrix T,

$$T = \left[\begin{array}{cc}i \sqrt{C\over L} & 1\\ 1 & i \sqrt{L \over C} \end{array}\right]$$ and its inverse, $$T^{-1} = -1/2 \left[\begin{array}{cc}i \sqrt{L\over C} & -1\\ -1 & i \sqrt{C \over L} \end{array}\right]$$

we can build the diagonalized system upon the natural state vector, Z(t) = T⁻¹X(t). The general form is

$$\dot{Z}(t) = T^{-1} A T Z(t) + T^{-1} B U_{in}(t) = \Lambda Z(t) + \hat{B} U_{in}(t)$$
$$Y(t) = C T Z(t) = \hat{C} Z(t)$$

and, in case of examined LC:

$$\left[\begin{array}{c}\dot{z}_1 \\ \dot{z}_2 \end{array}\right] = \left[\begin{array}{cc}i/\sqrt{LC} & 0\\ 0 & -i/\sqrt{LC} \end{array}\right] \left[\begin{array}{c}z_1 \\ z_2 \end{array}\right] + \left[\begin{array}{c}-i/(2\sqrt{LC}) \\ 1/(2L) \end{array}\right] u_{in}$$

$$u = \left[\begin{array}{cc}0 & 1 \end{array}\right] T Z = \left[\begin{array}{cc}1 & i\sqrt{L\over C}\end{array}\right] Z,$$

which can be represented by a graph:

You see, there are more links: one connects the input to the second state variable and another adds the second state to the output. That is why the supposedly simple diagonal system looks more complex!
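The whole transformation can be verified numerically. This is a sketch with my arbitrary sample values L = 2, C = 0.5; it checks that T⁻¹AT is diagonal with the eigenvalues on the diagonal, and that the transformed input matrix works out to T⁻¹B = [-i/(2√(LC)), 1/(2L)]ᵀ:

```python
import numpy as np

L, C = 2.0, 0.5  # arbitrary sample values (an assumption)
w = 1.0 / np.sqrt(L * C)

A = np.array([[0.0, -1.0 / L],
              [1.0 / C, 0.0]])
B = np.array([1.0 / L, 0.0])
Cout = np.array([0.0, 1.0])  # output matrix [0 1]

# Eigenvector matrix T and its inverse, as derived above
T = np.array([[1j * np.sqrt(C / L), 1.0],
              [1.0, 1j * np.sqrt(L / C)]])
T_inv = -0.5 * np.array([[1j * np.sqrt(L / C), -1.0],
                         [-1.0, 1j * np.sqrt(C / L)]])
assert np.allclose(T @ T_inv, np.eye(2))

Lam = T_inv @ A @ T    # diagonalized state matrix
B_hat = T_inv @ B      # transformed input matrix
C_hat = Cout @ T       # transformed output matrix

assert np.allclose(Lam, np.diag([1j * w, -1j * w]))
assert np.allclose(B_hat, [-1j / (2 * np.sqrt(L * C)), 1 / (2 * L)])
assert np.allclose(C_hat, [1.0, 1j * np.sqrt(L / C)])
```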

6. Aug 21, 2010

### valjok

JT, excuse me for overlooking your first point. The tautology being due to "the same set of eigenvectors" is exactly the answer to my trouble. As you can see from my reply, I realized what you wanted to say and called it "a line of vectors". Now I understand that from Ax = λx it follows that A(cx) = cAx = cλx = λ(cx), which means that the operator scales both the eigenvector x and any vector on the same line, cx, by the same factor. For this reason, cX can also serve as an eigenvector. That is why there is a line, or set, of eigenvectors. This explanation is closer to the definition of an eigenvector than the singularity of λI-A in solving [λI-A]x = 0.

7. Aug 21, 2010

### valjok

I have since learned that the set, or line, of eigenvectors is called an eigenline, and that λ is its scaling factor.

Last edited: Aug 21, 2010
8. Aug 22, 2010

### HallsofIvy

Staff Emeritus
In specific applications, that is true.

We can think of a matrix, applied to a vector in $R^2$ or $R^3$, as mapping the point p (thinking of the vector as having its "base" at the origin and its tip at p) to another point p'. Yes, if p is an eigenvector, then $Ap= \lambda p$. Further, any point on the line through the origin and p can be written as $\mu p$ for some number $\mu$. Then $A(\mu p)= \mu Ap= \mu\lambda p$, again a point on that line. That is, each line in the direction of an eigenvector is mapped to itself, "stretched" by a factor of $\lambda$ ("compressed" by that factor if $0< \lambda< 1$; if $\lambda< 0$ the line is also "swapped end for end"; if $\lambda= 0$ the entire line collapses to the origin).

9. Aug 23, 2010

### valjok

> In specific applications, that is true.

Wikipedia says that the concept of an eigenline holds for all systems. How does your elaboration, which repeats mine of Aug 21, 2010, 10:09 AM, demarcate the "specific applications" from the majority of cases where "it is not true"?

10. Sep 4, 2010

### valjok

It is a sort of rotation

Incidentally, I have noticed that the original system

$$\left[\begin{array}{cc}0&-1/L\\ 1/C&0\end{array}\right]$$

is a 90° rotation operator if L=C=1. It translates [x,y] into [-y,x]:

$$\left[\begin{array}{cc}0&-1\\ 1&0\end{array}\right] \left[\begin{array}{c}x\\y\end{array}\right] = \left[\begin{array}{c}-y\\x\end{array}\right]$$
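A minimal check of the rotation claim; the vector [3, 4] is just an arbitrary example of mine:

```python
import numpy as np

R = np.array([[0.0, -1.0],
              [1.0, 0.0]])  # the system matrix A with L = C = 1

v = np.array([3.0, 4.0])
rotated = R @ v  # maps [x, y] -> [-y, x]: a 90-degree rotation

assert np.allclose(rotated, [-4.0, 3.0])
```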

The eigenvector and diagonal matrices are

$$\left[\begin{array}{cc}i&1\\1&i\end{array}\right]$$ and $$\left[\begin{array}{cc}i&0\\0&-i\end{array}\right]$$

I do not know what this means. I just see that the corresponding discrete system cycles around 4 stages:

$$\left[\begin{array}{c}i\\u\end{array}\right] = \left[\begin{array}{c}i\\1\end{array}\right] \rightarrow \left[\begin{array}{c}-1\\i\end{array}\right] \rightarrow \left[\begin{array}{c}-i\\-1\end{array}\right] \rightarrow \left[\begin{array}{c}1\\-i\end{array}\right] \rightarrow \left[\begin{array}{c}i\\1\end{array}\right]$$

that correspond to all the energy concentrated in 1) the inductor, 2) the capacitor, 3) the inductor with current flowing in the opposite direction, and 4) the capacitor charged oppositely. The imaginary 'i' state corresponds to [value 0 + counterpart (speed of change) 1]. That is, the imaginary part duplicates the state of the counterpart variable, which is necessary for a diagonal system that decouples the accumulations from each other.
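A quick check that the discrete map really is 4-periodic, i.e. that A⁴ = E for L = C = 1, starting from the state [i, 1]:

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0, 0.0]])

x = np.array([1j, 1.0])  # initial state [i, 1]
states = [x]
for _ in range(4):
    x = A @ x  # each step maps [x, y] -> [-y, x]
    states.append(x)

# After four applications the state returns to the start: A^4 = E
assert np.allclose(states[4], states[0])
assert np.allclose(np.linalg.matrix_power(A, 4), np.eye(2))
```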