Linear algebra application: entropy


Homework Help Overview

The discussion revolves around the application of linear algebra to a problem involving a linear chain of N atoms, each of which can exist in one of three states (A, B, C) with a specific constraint that atoms in state A cannot be adjacent to those in state C. Participants are tasked with finding the entropy per atom as N approaches infinity.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants explore the relationship between the number of configurations of atom chains and the matrix representation of these configurations. They discuss how to generalize specific cases to a broader formulation using linear equations. Questions arise regarding the implications of the eigenvalues and the definition of entropy in this context.

Discussion Status

There is an ongoing exploration of the relationship between the configurations and the matrix representation, with some participants providing insights into the eigenvalues and their significance. However, there is no explicit consensus on how the eigenvalues feed into the entropy calculation, particularly concerning the constant prefactor between the total number of configurations and powers of the largest eigenvalue.

Contextual Notes

Participants note that the sum of the components of the configuration vector does not directly equate to the eigenvalue, raising questions about the impact of this discrepancy for large N. The discussion also reflects on the constraints imposed by the problem's setup and the definitions being used.

buffordboy23

Homework Statement



Consider a linear chain of N atoms. Each atom can be in one of 3 states (A, B, C), but an atom in state A cannot be next to an atom in state C. Find the entropy per atom as N approaches infinity.

Accomplish this by defining the 3-vector [tex]\vec{v}^{j}[/tex] to be the number of allowed configurations of the j-atom chain ending in type A, B, C. Then show that [tex]\vec{v}^{j} = \textbf{M}\vec{v}^{j-1}[/tex]. Then [tex]\vec{v}^{j} = \textbf{M}^{j-1}\vec{v}^{1}[/tex]. Show that in the limit of large N, the entropy per atom is dominated by the largest eigenvalue of M, and is given by [tex]k ln(1 + \sqrt{2})[/tex].

The Attempt at a Solution



For the first few j-atom chains, it is evident that

[tex]\vec{v}^{1} = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}[/tex], [tex]\vec{v}^{2} = \begin{bmatrix} 2 \\ 3 \\ 2 \end{bmatrix}[/tex], [tex]\vec{v}^{3} = \begin{bmatrix} 5 \\ 7 \\ 5 \end{bmatrix}[/tex]

which implies that

[tex]\textbf{M} = \begin{bmatrix} 1 & 1 & 0 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{bmatrix}[/tex]

Right now I am having trouble with the first part: showing that [tex]\vec{v}^{j} = \textbf{M}\vec{v}^{j-1}[/tex]. It is easy to verify for the specific vectors I have determined above, but I am unsure how to generalize the relation.
 
Ok, let v(n)_A, v(n)_B and v(n)_C be number of n atom chains ending in A, B and C (the three components of your column vectors). Then to get v(n+1)_A you take any n atom chain ending in A or B (not C) and add an A. So v(n+1)_A=v(n)_A+v(n)_B. Now do v(n+1)_B and v(n+1)_C. Aren't those linear equations the same as v(n+1)=Mv(n)?
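Those three linear equations can be sanity-checked against brute-force enumeration of the allowed chains. A short sketch (assuming Python with NumPy; the names `brute_counts` and `v` are just illustrative):

```python
from itertools import product

import numpy as np

M = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]])

def brute_counts(n):
    """Count allowed n-atom chains ending in A, B, C by direct enumeration."""
    counts = {"A": 0, "B": 0, "C": 0}
    for chain in product("ABC", repeat=n):
        if any({x, y} == {"A", "C"} for x, y in zip(chain, chain[1:])):
            continue  # skip chains with an A adjacent to a C
        counts[chain[-1]] += 1
    return np.array([counts["A"], counts["B"], counts["C"]])

v = np.array([1, 1, 1])  # v^1: each single atom is one allowed chain
for n in range(2, 7):
    v = M @ v            # the recursion v^n = M v^{n-1}
    assert np.array_equal(v, brute_counts(n))
print(v)                 # -> [70 99 70], the 6-atom counts
```

Each assertion passing confirms that multiplying by M reproduces the enumerated counts, which is exactly the content of the three linear equations above.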
 
Okay, so let

[tex]\vec{v}^{j-1} = \begin{bmatrix} v^{j-1}_{A} \\ v^{j-1}_{B} \\ v^{j-1}_{C} \end{bmatrix}[/tex]

where [tex]v^{j-1}_{A}[/tex] is the number of configurations of j-1 atoms that end in A, and similarly for B and C. Then,

[tex]\vec{v}^{j} = \textbf{M}\vec{v}^{j-1} = \begin{bmatrix} 1 & 1 & 0 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{bmatrix}\begin{bmatrix} v^{j-1}_{A} \\ v^{j-1}_{B} \\ v^{j-1}_{C} \end{bmatrix}= \begin{bmatrix} v^{j-1}_{A} + v^{j-1}_{B} \\ v^{j-1}_{A} + v^{j-1}_{B} + v^{j-1}_{C} \\ v^{j-1}_{B} + v^{j-1}_{C} \end{bmatrix}[/tex]
 
Sure. Doesn't that express the condition "A cannot be next to C"?
 
Thanks for helping me out of my stupor. That was ridiculously easy. I found the eigenvalues 1, 1+sqrt(2), and 1-sqrt(2). I used these to construct the diagonal matrix D given by

[tex]\textbf{D} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 + \sqrt{2} & 0 \\ 0 & 0 & 1 - \sqrt{2} \end{bmatrix}[/tex]

Then I used the basis vectors of the eigenspace to construct the orthogonal matrix Q:

[tex]\textbf{Q} = \begin{bmatrix} \frac{1}{\sqrt{2}} & \frac{1}{2} & \frac{1}{2} \\ 0 & \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} \\ -\frac{1}{\sqrt{2}} & \frac{1}{2} & \frac{1}{2} \end{bmatrix}[/tex]

So,

[tex]\textbf{M}^{N} = \textbf{Q}\textbf{D}^{N}\textbf{Q}^{T}[/tex]
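With the eigenvectors of M taken as the columns of Q, the decomposition is M = Q D Q^T, and hence M^N = Q D^N Q^T. A quick NumPy check of the matrices above (illustrative sketch):

```python
import numpy as np

# M and its claimed diagonalization, eigenvectors as the columns of Q
M = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]], dtype=float)
s = np.sqrt(2)
D = np.diag([1.0, 1.0 + s, 1.0 - s])
Q = np.array([[ 1/s, 0.5,  0.5],
              [ 0.0, 1/s, -1/s],
              [-1/s, 0.5,  0.5]])

assert np.allclose(Q.T @ Q, np.eye(3))  # Q is orthogonal
assert np.allclose(Q @ D @ Q.T, M)      # M = Q D Q^T, so M^N = Q D^N Q^T
```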

For large N, I can neglect the eigenvalues 1 and 1 - sqrt(2), since their N-th powers are negligible next to (1 + sqrt(2))^N, and construct [tex]\textbf{M}^{N}[/tex] from the largest eigenvalue 1 + sqrt(2) alone. That shows the entropy per atom is dominated by the largest eigenvalue, right? But how does this show that [tex]W \approx (1 + \sqrt{2})^{N}[/tex]?
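One way to make "dominated" concrete: dividing M^N by (1 + sqrt(2))^N sends the contributions of the other two eigenvalues to zero, leaving only the rank-one projector onto the leading eigenvector. A numerical sketch (NumPy, illustrative):

```python
import numpy as np

M = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]], dtype=float)
lam = 1 + np.sqrt(2)
u = np.array([0.5, 1/np.sqrt(2), 0.5])  # unit eigenvector for 1 + sqrt(2)

for N in (5, 20, 50):
    P = np.linalg.matrix_power(M, N) / lam**N
    # deviation from the rank-one projector u u^T shrinks toward zero
    print(N, np.abs(P - np.outer(u, u)).max())
```

So every entry of M^N, and hence every component of v^N, grows like a constant times (1 + sqrt(2))^N.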
 
What's your definition of entropy for this system?
 
S = k ln W

where k is Boltzmann's constant.
 
Ok. What's W?
 
It's the total number of configurations, which is the sum of the components of the vector [tex]\vec{v}^{j}[/tex]. But the sum of these components does not necessarily equal the corresponding power of the eigenvalue; if I recall from earlier, it's off by a factor of about 3/2. Does this really matter for large N? This factor results in an overall difference of about 30% in the value of the logarithm term compared with the given approximation.
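A constant prefactor cannot survive the limit: if W_N is roughly c(1 + sqrt(2))^N for some fixed c, then (1/N) ln W_N = ln(1 + sqrt(2)) + (ln c)/N, and the correction term dies off as 1/N. A quick check in plain Python using exact integer arithmetic (the name `W` is illustrative):

```python
import math

def W(N):
    """Total number of allowed N-atom chains (exact integer count)."""
    a = b = c = 1                          # v^1 = (1, 1, 1)
    for _ in range(N - 1):
        a, b, c = a + b, a + b + c, b + c  # one application of M
    return a + b + c

target = math.log(1 + math.sqrt(2))        # about 0.8814
for N in (10, 100, 1000):
    print(N, math.log(W(N)) / N - target)  # gap shrinks like 1/N
```

So the entropy per atom, k ln(W_N)/N, converges to k ln(1 + sqrt(2)) regardless of the constant factor.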
 
