Linear algebra application: entropy

buffordboy23

Homework Statement



Consider a linear chain of N atoms. Each atom can be in 3 states (A, B, C), but an atom in state A cannot be next to an atom in state C. Find the entropy per atom as N approaches infinity.

Accomplish this by defining the 3-vector \vec{v}^{j} whose components are the numbers of allowed configurations of the j-atom chain ending in type A, B, and C. Then show that \vec{v}^{j} = \textbf{M}\vec{v}^{j-1}, so that \vec{v}^{j} = \textbf{M}^{j-1}\vec{v}^{1}. Then show that in the limit of large N, the entropy per atom is dominated by the largest eigenvalue of M, and is given by k \ln(1 + \sqrt{2}).

The Attempt at a Solution



For the first few j-atom chains, it is evident that

\vec{v}^{1} = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}, \vec{v}^{2} = \begin{bmatrix} 2 \\ 3 \\ 2 \end{bmatrix}, \vec{v}^{3} = \begin{bmatrix} 5 \\ 7 \\ 5 \end{bmatrix}

which implies that

\textbf{M} = \begin{bmatrix} 1 & 1 & 0 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{bmatrix}

Right now I am having trouble with the first part: show that \vec{v}^{j} = \textbf{M}\vec{v}^{j-1}. It is easy to show for specific cases using the vectors I have determined above, but I am confused on how to generalize this relation.
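As a sanity check on those first few vectors, here is a brute-force enumeration (a Python sketch, not part of the assigned derivation) that counts all allowed chains directly:

```python
from itertools import product

# Brute-force count of allowed j-atom chains ending in each state,
# to check the vectors v^1, v^2, v^3 quoted above.
def count_endings(j):
    counts = {"A": 0, "B": 0, "C": 0}
    for chain in product("ABC", repeat=j):
        # reject any chain where A sits next to C
        if any({x, y} == {"A", "C"} for x, y in zip(chain, chain[1:])):
            continue
        counts[chain[-1]] += 1
    return [counts["A"], counts["B"], counts["C"]]

print(count_endings(1))  # [1, 1, 1]
print(count_endings(2))  # [2, 3, 2]
print(count_endings(3))  # [5, 7, 5]
```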
 
Ok, let v(n)_A, v(n)_B and v(n)_C be number of n atom chains ending in A, B and C (the three components of your column vectors). Then to get v(n+1)_A you take any n atom chain ending in A or B (not C) and add an A. So v(n+1)_A=v(n)_A+v(n)_B. Now do v(n+1)_B and v(n+1)_C. Aren't those linear equations the same as v(n+1)=Mv(n)?
 
Okay, so let

\vec{v}^{j-1} = \begin{bmatrix} v^{j-1}_{A} \\ v^{j-1}_{B} \\ v^{j-1}_{C} \end{bmatrix}

where v^{j-1}_{A} is the number of configurations of j-1 atoms that end in A, and similarly for B and C. Then,

\vec{v}^{j} = \textbf{M}\vec{v}^{j-1} = \begin{bmatrix} 1 & 1 & 0 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{bmatrix}\begin{bmatrix} v^{j-1}_{A} \\ v^{j-1}_{B} \\ v^{j-1}_{C} \end{bmatrix}= \begin{bmatrix} v^{j-1}_{A} + v^{j-1}_{B} \\ v^{j-1}_{A} + v^{j-1}_{B} + v^{j-1}_{C} \\ v^{j-1}_{B} + v^{j-1}_{C} \end{bmatrix}
 
Sure. Doesn't that express the condition "A cannot be next to C"?
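Exactly that recursion can be checked against brute-force enumeration for small j (a quick Python sketch, with count_endings as a hypothetical helper that counts allowed chains directly):

```python
from itertools import product

# The matrix M from the thread: row/column order is (A, B, C).
M = [[1, 1, 0], [1, 1, 1], [0, 1, 1]]

# Brute-force count of allowed j-atom chains ending in each state.
def count_endings(j):
    counts = {"A": 0, "B": 0, "C": 0}
    for chain in product("ABC", repeat=j):
        if any({x, y} == {"A", "C"} for x, y in zip(chain, chain[1:])):
            continue
        counts[chain[-1]] += 1
    return [counts["A"], counts["B"], counts["C"]]

# Check v^j = M v^(j-1) for j = 2..7.
for j in range(2, 8):
    v_prev = count_endings(j - 1)
    v_pred = [sum(M[r][c] * v_prev[c] for c in range(3)) for r in range(3)]
    assert v_pred == count_endings(j), j
print("recursion v^j = M v^(j-1) holds for j = 2..7")
```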
 
Thanks for helping me out of my stupor. That was ridiculously easy. I found the eigenvalues 1, 1+sqrt(2), and 1-sqrt(2). I used these to construct the diagonal matrix D given by

\textbf{D} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 + \sqrt{2} & 0 \\ 0 & 0 & 1 - \sqrt{2} \end{bmatrix}

Then I used the normalized eigenvectors as the columns of the orthogonal matrix Q:

\textbf{Q} = \begin{bmatrix} \frac{1}{\sqrt{2}} & \frac{1}{2} & \frac{1}{2} \\ 0 & \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} \\ -\frac{1}{\sqrt{2}} & \frac{1}{2} & \frac{1}{2} \end{bmatrix}

So,

\textbf{M}^{N} = \textbf{Q}\textbf{D}^{N}\textbf{Q}^{T}

For large N, I can ignore the eigenvalues 1 and 1 - \sqrt{2} and construct \textbf{M}^{N} from the largest eigenvalue 1 + \sqrt{2} alone, which shows that the entropy per atom is dominated by the largest eigenvalue, right? But how does this show that W \approx 1 + \sqrt{2}?
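The domination can be seen numerically: M^N rapidly approaches the rank-1 term (1+\sqrt{2})^N q q^T built from the top eigenvector. A sketch (assuming NumPy; q here is the normalized eigenvector for 1+\sqrt{2} found above):

```python
import numpy as np

M = np.array([[1, 1, 0], [1, 1, 1], [0, 1, 1]], dtype=float)
lam = 1 + np.sqrt(2)                      # largest eigenvalue
q = np.array([1, np.sqrt(2), 1]) / 2.0    # its normalized eigenvector

for N in (5, 10, 20):
    MN = np.linalg.matrix_power(M, N)
    approx = lam**N * np.outer(q, q)      # rank-1 term from the top eigenvalue
    rel_err = np.abs(MN - approx).max() / MN.max()
    # relative error shrinks roughly like (1/lam)^N, since the
    # next-largest eigenvalue in magnitude is 1
    print(N, rel_err)
```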
 
What's your definition of entropy for this system?
 
S = k ln W

where k is Boltzmann's constant.
 
Ok. What's W?
 
It's the total number of configurations, which is the sum of the components of the vector \vec{v}^{j}. But the sum of these components does not exactly equal the eigenvalue raised to the Nth power. If I recall from earlier, it's off exactly by a factor of 3/2. Does this really matter for large N? This factor results in an overall difference of about 30% in the value of the logarithm term compared with the given approximation.
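One way to see why a constant prefactor cannot matter: if W_N \approx c\lambda^N, then (1/N)\ln W_N = \ln\lambda + (\ln c)/N, and the second term vanishes as N \to \infty. A numerical sketch of this (Python with NumPy, rescaling each step to avoid overflow; not an authoritative solution):

```python
import numpy as np

M = np.array([[1, 1, 0], [1, 1, 1], [0, 1, 1]], dtype=float)
target = np.log(1 + np.sqrt(2))   # claimed entropy per atom, in units of k

v = np.ones(3)                # v^1 = (1, 1, 1)
logW = np.log(v.sum())        # ln W_1 = ln 3
for N in range(2, 1001):
    v = M @ (v / v.sum())     # rescale so the previous total is 1
    logW += np.log(v.sum())   # accumulates ln(W_N / W_{N-1})
    if N in (10, 100, 1000):
        # (1/N) ln W_N approaches ln(1 + sqrt(2)) ~ 0.8814;
        # the prefactor's contribution decays like 1/N
        print(N, logW / N)
```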
 