Linear algebra application: entropy

In summary, we are tasked with finding the entropy per atom for a linear chain of N atoms with 3 possible states, where an atom in state A cannot be next to an atom in state C. This is accomplished by defining a 3-vector \vec{v}^{j} to represent the number of allowed configurations for a j-atom chain ending in types A, B, and C. Using this vector, we can show that \vec{v}^{j} = \textbf{M}\vec{v}^{j-1}, and in the limit of large N the entropy per atom is dominated by the largest eigenvalue of M, giving k \ln(1 + \sqrt{2}). A direct count at finite N deviates from this approximation by a constant factor, because the approximation only considers the largest eigenvalue while the full calculation accounts for all three eigenvalues.
  • #1
buffordboy23

Homework Statement



Consider a linear chain of N atoms. Each atom can be in 3 states (A, B, C), but an atom in state A cannot be next to an atom in state C. Find the entropy per atom as N approaches infinity.

Accomplish this by defining the 3-vector [tex] \vec{v}^{j} [/tex] to be the number of allowed configurations of the j-atom chain ending in types A, B, and C. Then show that [tex] \vec{v}^{j} = \textbf{M}\vec{v}^{j-1}[/tex], so that [tex] \vec{v}^{j} = \textbf{M}^{j-1}\vec{v}^{1}[/tex]. Show that in the limit of large N, the entropy per atom is dominated by the largest eigenvalue of M and is given by [tex] k \ln(1 + \sqrt{2})[/tex].

The Attempt at a Solution



For the first few j-atom chains, it is evident that

[tex] \vec{v}^{1} = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} [/tex], [tex] \vec{v}^{2} = \begin{bmatrix} 2 \\ 3 \\ 2 \end{bmatrix} [/tex], [tex] \vec{v}^{3} = \begin{bmatrix} 5 \\ 7 \\ 5 \end{bmatrix} [/tex]

which implies that

[tex] \textbf{M} = \begin{bmatrix} 1 & 1 & 0 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{bmatrix} [/tex]

Right now I am having trouble with the first part: showing that [tex] \vec{v}^{j} = \textbf{M}\vec{v}^{j-1}[/tex]. It is easy to check for specific cases using the vectors I have determined above, but I am confused about how to generalize this relation.
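(A quick way to sanity-check the proposed M is to compare the recurrence against brute-force enumeration for small j. A minimal sketch in Python with NumPy; the tooling and helper function are my own illustration, not part of the original problem.)

[code]
# Verify v^j = M v^(j-1) against direct enumeration of allowed chains.
from itertools import product

import numpy as np

# Proposed transfer matrix: rows/columns ordered (A, B, C);
# entry (x, y) = 1 if state y may sit next to state x.
M = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]])

def brute_force_counts(n):
    """Count allowed n-atom chains ending in A, B, C by enumeration."""
    forbidden = {("A", "C"), ("C", "A")}
    counts = {"A": 0, "B": 0, "C": 0}
    for chain in product("ABC", repeat=n):
        if all(pair not in forbidden for pair in zip(chain, chain[1:])):
            counts[chain[-1]] += 1
    return np.array([counts["A"], counts["B"], counts["C"]])

v = np.array([1, 1, 1])  # v^1: all three single-atom chains are allowed
for j in range(2, 7):
    v = M @ v  # the claimed recurrence v^j = M v^(j-1)
    assert np.array_equal(v, brute_force_counts(j))
print("recurrence matches enumeration up to j = 6; v^6 =", v)
[/code]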
 
  • #2
Ok, let v(n)_A, v(n)_B and v(n)_C be the number of n-atom chains ending in A, B and C (the three components of your column vectors). Then to get v(n+1)_A you take any n-atom chain ending in A or B (not C) and add an A. So v(n+1)_A = v(n)_A + v(n)_B. Now do v(n+1)_B and v(n+1)_C. Aren't those linear equations the same as v(n+1) = Mv(n)?
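(Written out in components, as my own transcription of the three relations just described, this is

[tex] v^{(n+1)}_{A} = v^{(n)}_{A} + v^{(n)}_{B}, \qquad v^{(n+1)}_{B} = v^{(n)}_{A} + v^{(n)}_{B} + v^{(n)}_{C}, \qquad v^{(n+1)}_{C} = v^{(n)}_{B} + v^{(n)}_{C} [/tex]

since a B may follow anything, while an A may not follow a C and vice versa.)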
 
  • #3
Okay, so let

[tex] \vec{v}^{j-1} = \begin{bmatrix} v^{j-1}_{A} \\ v^{j-1}_{B} \\ v^{j-1}_{C} \end{bmatrix} [/tex]

where [tex] v^{j-1}_{A} [/tex] is the number of configurations of j-1 atoms that end in A, and the same for B and C. Then,

[tex] \vec{v}^{j} = \textbf{M}\vec{v}^{j-1} = \begin{bmatrix} 1 & 1 & 0 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{bmatrix}\begin{bmatrix} v^{j-1}_{A} \\ v^{j-1}_{B} \\ v^{j-1}_{C} \end{bmatrix}= \begin{bmatrix} v^{j-1}_{A} + v^{j-1}_{B} \\ v^{j-1}_{A} + v^{j-1}_{B} + v^{j-1}_{C} \\ v^{j-1}_{B} + v^{j-1}_{C} \end{bmatrix} [/tex]
 
  • #4
Sure. Doesn't that express the condition "A cannot be next to C"?
 
  • #5
Thanks for helping me out of my stupor. That was ridiculously easy. I found the eigenvalues 1, 1+sqrt(2), and 1-sqrt(2). I used these to construct the diagonal matrix D given by

[tex] \textbf{D} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 + \sqrt{2} & 0 \\ 0 & 0 & 1 - \sqrt{2} \end{bmatrix} [/tex]

Then I used the normalized eigenvectors as the columns of the orthogonal matrix Q, ordered to match the eigenvalues in D:

[tex] \textbf{Q} = \begin{bmatrix} \frac{1}{\sqrt{2}} & \frac{1}{2} & \frac{1}{2} \\ 0 & \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} \\ -\frac{1}{\sqrt{2}} & \frac{1}{2} & \frac{1}{2} \end{bmatrix} [/tex]

So,

[tex] \textbf{M}^{N} = \textbf{Q}\textbf{D}^{N}\textbf{Q}^{T} [/tex]

For large N, I can ignore the eigenvalues 1 and 1 - sqrt(2) and construct [tex] \textbf{M}^{N} [/tex] from the largest eigenvalue 1 + sqrt(2) alone, which shows that the entropy per atom is dominated by the largest eigenvalue, right? But how does this show that [tex] W \approx (1 + \sqrt{2})^{N} [/tex]?
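(To see the dominance of the largest eigenvalue numerically, here is a minimal sketch in the same Python/NumPy style as above; again my own illustration, not part of the original post.)

[code]
# Diagonalize M and watch the entropy per atom approach ln(1 + sqrt(2)).
import numpy as np

M = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]], dtype=float)

# M is symmetric, so eigh applies; columns of Q are orthonormal eigenvectors.
eigenvalues, Q = np.linalg.eigh(M)
print(eigenvalues)             # approximately [1 - sqrt(2), 1, 1 + sqrt(2)]

v1 = np.ones(3)
for N in (10, 20, 40, 80):
    W = np.sum(np.linalg.matrix_power(M, N - 1) @ v1)  # total count for N atoms
    print(N, np.log(W) / N)    # entropy per atom, in units of k
print(np.log(1 + np.sqrt(2)))  # limiting value, about 0.8814
[/code]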
 
  • #6
What's your definition of entropy for this system?
 
  • #7
S = k ln W

where k is Boltzmann's constant.
 
  • #8
Ok. What's W?
 
  • #9
It's the total number of configurations, which is the sum of the components of the vector [tex] \vec{v}^{N} [/tex]. But the sum of these components does not exactly equal [tex] (1 + \sqrt{2})^{N} [/tex]; if I recall from earlier, it's off by a factor of about 3/2. Does this really matter for large N? This factor results in an overall difference of about 30% in the value of the logarithm term compared with the given approximation.
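(To make the missing step explicit, as my own gloss following the hint about the dominant eigenvalue: a constant prefactor in front of the dominant power does not matter, because

[tex] W_{N} \approx c\left(1 + \sqrt{2}\right)^{N} \implies \frac{S}{N} = \frac{k \ln W_{N}}{N} \approx k \ln\left(1 + \sqrt{2}\right) + \frac{k \ln c}{N} \rightarrow k \ln\left(1 + \sqrt{2}\right) [/tex]

so whether the factor c is 3/2 or anything else independent of N, its contribution to the entropy per atom vanishes as N approaches infinity.)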
 

1. What is entropy and how is it related to linear algebra?

Entropy is a measure of disorder or randomness in a system; it also quantifies the uncertainty or information content of a set of variables or data points. Linear algebra enters as a computational tool, as in the problem above, where a transfer matrix and its eigenvalues determine how the number of allowed configurations grows.

2. How is entropy calculated using linear algebra?

Entropy can be calculated using linear algebra by first organizing the data into a probability distribution, for instance a vector or matrix of probabilities. The entropy of the system is then computed from these probabilities with the Shannon entropy formula, H = -Σ p_i log p_i.
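A minimal sketch of that computation, assuming Python with NumPy (the function and the example distributions are illustrative, not from the page above):

[code]
# Shannon entropy H(p) = -sum_i p_i log(p_i) of a probability vector.
import numpy as np

def shannon_entropy(p, base=2):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # by convention, 0 * log(0) contributes 0
    return -np.sum(p * np.log(p)) / np.log(base)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin
print(shannon_entropy([0.9, 0.1]))  # about 0.47 bits: a biased coin
[/code]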

3. What are some common applications of linear algebra in calculating entropy?

Linear algebra is commonly used in fields such as statistics, machine learning, and data analysis to calculate entropy. It is often used to analyze and compare the uncertainty or information content of different datasets or variables.

4. How does the concept of entropy relate to information theory?

Entropy is a fundamental concept in information theory, which studies the quantification, storage, and communication of information. Entropy is used to measure the amount of uncertainty or randomness in a system, and is closely related to the amount of information that can be gained from that system.

5. Can linear algebra be used to optimize information storage or communication systems?

Yes, linear algebra can be used to optimize information storage or communication systems by analyzing the entropy of the system and finding the most efficient way to store or transmit information. This can be especially useful in fields such as data compression and error correction.
