Ising model: Exercises
Introduction
Example problems
Problem 1
We next recall that the trace of a product of matrices is invariant under cyclic permutations of the order in which the matrices are multiplied. This is important as we can use this fact to rewrite the above as \[ \langle S_i \rangle = \frac{\textrm{Tr}(\sigma \mathbf{P}^N) }{Z_c} = \frac{\textrm{Tr}(\sigma \mathbf{P}^N) }{\textrm{Tr}(\mathbf{P}^N)} \] We get to the final result here by using the fact that the canonical partition function, $Z_c$, is equal to the trace of the $N$th power of the transfer matrix, $\textrm{Tr}(\mathbf{P}^N)$.
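We can check both facts numerically. Below is a minimal sketch using numpy, in which we take $\sigma$ to be the diagonal matrix $\textrm{diag}(1,-1)$ that picks out the value of the spin (an assumed form, consistent with the expression above), and the values of $\beta J$ and $N$ are arbitrary:

```python
import numpy as np

beta_J = 0.5   # arbitrary choice of beta*J
N = 10         # arbitrary number of spins in the periodic chain

# Transfer matrix for the zero-field 1D Ising model
P = np.array([[np.exp(beta_J),  np.exp(-beta_J)],
              [np.exp(-beta_J), np.exp(beta_J)]])
# Assumed matrix that picks out the value of the spin at site i
sigma = np.diag([1.0, -1.0])

PN = np.linalg.matrix_power(P, N)

# Cyclic invariance of the trace: Tr(sigma P^N) = Tr(P^N sigma)
print(np.isclose(np.trace(sigma @ PN), np.trace(PN @ sigma)))  # True

# <S_i> = Tr(sigma P^N) / Tr(P^N), which vanishes when H = 0
print(np.trace(sigma @ PN) / np.trace(PN))  # ~0.0
```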
Problem 2
To find the partition function for the open, 1D Ising model we also need to calculate the eigenvectors of this matrix. Remember that we find an eigenvector by rearranging the eigenvalue equation as follows: \[ \mathbf{P}\mathbf{v} = \lambda \mathbf{v} \qquad \rightarrow \qquad \left( \mathbf{P} - \lambda \mathbf{I} \right) \mathbf{v} = 0 \] Let's now write out the matrices explicitly and find the first eigenvector (the one with eigenvalue $2\cosh(\beta J)=e^{\beta J} + e^{-\beta J}$). \[ \begin{aligned} 0= \left( \mathbf{P} - \lambda \mathbf{I} \right) \mathbf{v} & = \left\{ \left( \begin{matrix} e^{\beta J} & e^{-\beta J} \\ e^{-\beta J} & e^{\beta J} \end{matrix} \right) - \left( e^{\beta J} + e^{-\beta J} \right) \left( \begin{matrix} 1 & 0 \\ 0 & 1 \end{matrix} \right) \right\} \left( \begin{matrix} v_1 \\ v_2 \end{matrix} \right) \\ & = \left( \begin{matrix} -e^{-\beta J} & e^{-\beta J} \\ e^{-\beta J} & -e^{-\beta J} \end{matrix} \right) \left( \begin{matrix} v_1 \\ v_2 \end{matrix} \right) =0 \end{aligned} \] Multiplying the top row of the matrix by the vector gives us $-e^{-\beta J}v_1 + e^{-\beta J}v_2 = 0$, which we can rearrange to give $v_1=v_2$. To fix the final values of the components of the eigenvector we require that the vector $\mathbf{v}$ be normalised; in other words, we require $v_1^2 + v_2^2 = 1$. Since $v_1=v_2$, this becomes $2v_1^2 =1$ and hence $v_1=v_2 = \frac{1}{\sqrt{2}}$.
We now turn to the second eigenvalue and eigenvector. By a similar logic to before we have: \[ \begin{aligned} 0= \left( \mathbf{P} - \lambda \mathbf{I} \right) \mathbf{v} & = \left\{ \left( \begin{matrix} e^{\beta J} & e^{-\beta J} \\ e^{-\beta J} & e^{\beta J} \end{matrix} \right) - \left( e^{\beta J} - e^{-\beta J} \right) \left( \begin{matrix} 1 & 0 \\ 0 & 1 \end{matrix} \right) \right\} \left( \begin{matrix} v_1 \\ v_2 \end{matrix} \right) \\ & = \left( \begin{matrix} e^{-\beta J} & e^{-\beta J} \\ e^{-\beta J} & e^{-\beta J} \end{matrix} \right) \left( \begin{matrix} v_1 \\ v_2 \end{matrix} \right) =0 \end{aligned} \] which brings us neatly to $e^{-\beta J}v_1 + e^{-\beta J}v_2 = 0$ and hence $v_2=-v_1$. The requirement of normalisation once again gives $2v_1^2 =1$ and hence $v_1=\frac{1}{\sqrt{2}}$ while $v_2 = -\frac{1}{\sqrt{2}}$.
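It is easy to verify both eigenpairs numerically. A minimal sketch with numpy follows (the value of $\beta J$ is an arbitrary choice):

```python
import numpy as np

beta_J = 0.5  # arbitrary choice of beta*J
P = np.array([[np.exp(beta_J),  np.exp(-beta_J)],
              [np.exp(-beta_J), np.exp(beta_J)]])

# eigh handles symmetric matrices; eigenvalues come back in ascending order
evals, evecs = np.linalg.eigh(P)

# lambda_2 = 2 sinh(beta J) is the smaller, lambda_1 = 2 cosh(beta J) the larger
print(np.allclose(evals, [2*np.sinh(beta_J), 2*np.cosh(beta_J)]))  # True

# The columns are (1,-1)/sqrt(2) and (1,1)/sqrt(2), up to an overall sign
print(evecs)
```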
Let's now return to the partition function. When we left off we had: \[ Z_c = \mathbf{s}^T \mathbf{P}^{N-1}\mathbf{s} \] We can exploit the fact that $\mathbf{P}$ is diagonalisable and rewrite this as: \[ Z_c = \mathbf{s}^T \mathbf{V} \mathbf{\Lambda}^{N-1} \mathbf{V}^{-1} \mathbf{s} \] The matrix $\mathbf{V}$ is the matrix that has the eigenvectors of $\mathbf{P}$ in its columns. In other words it is: \[ \mathbf{V} = \left( \begin{matrix} \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} \end{matrix} \right) \] Let's invert this matrix (remember that for a $2\times2$ matrix we do this by dividing the adjugate matrix by the determinant, as below): \[ \mathbf{V}^{-1} = \frac{1}{-\frac{1}{2}-\frac{1}{2}} \left( \begin{matrix} -\frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} \\ -\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \end{matrix} \right) = \left( \begin{matrix} \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} \end{matrix} \right) \] What do you know: $\mathbf{V}^{-1}=\mathbf{V}$. We could have anticipated this by recalling something about symmetric matrices; namely, that the eigenvectors of a symmetric matrix are orthonormal, so $\mathbf{V}^{-1}=\mathbf{V}^T$, and we can therefore decompose a symmetric diagonalisable matrix as: \[ \mathbf{P} = \mathbf{V} \mathbf{\Lambda} \mathbf{V}^T \] Here $\mathbf{\Lambda}$ is the diagonal matrix containing the eigenvalues and $\mathbf{V}$ is the matrix containing the eigenvectors. (In our case $\mathbf{V}$ happens to be symmetric as well, which is why $\mathbf{V}^{-1}=\mathbf{V}^T=\mathbf{V}$.)
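Both claims, that this particular $\mathbf{V}$ is its own inverse and that $\mathbf{P} = \mathbf{V}\mathbf{\Lambda}\mathbf{V}^T$, can be confirmed in a few lines (again a sketch, with an arbitrary $\beta J$):

```python
import numpy as np

beta_J = 0.5  # arbitrary choice of beta*J
P = np.array([[np.exp(beta_J),  np.exp(-beta_J)],
              [np.exp(-beta_J), np.exp(beta_J)]])

s = 1/np.sqrt(2)
V = np.array([[s,  s],
              [s, -s]])  # eigenvectors of P in the columns
Lam = np.diag([2*np.cosh(beta_J), 2*np.sinh(beta_J)])  # matching eigenvalues

print(np.allclose(V @ V, np.eye(2)))   # V^{-1} = V, so V V = I
print(np.allclose(V @ Lam @ V.T, P))   # P = V Lambda V^T
```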
Right, so the matrix multiplication we have to do is as follows: \[ \begin{aligned} Z_c & = \left( \begin{matrix} 1 & 1 \end{matrix} \right) \left( \begin{matrix} \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} \end{matrix} \right) \left( \begin{matrix} \lambda_1^{N-1} & 0 \\ 0 & \lambda_2^{N-1} \end{matrix} \right) \left( \begin{matrix} \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} \end{matrix} \right) \left( \begin{matrix} 1 \\ 1 \end{matrix} \right) \\ & = \left( \begin{matrix} \sqrt{2} & 0 \end{matrix} \right) \left( \begin{matrix} \lambda_1^{N-1} & 0 \\ 0 & \lambda_2^{N-1} \end{matrix} \right) \left( \begin{matrix} \sqrt{2} \\ 0 \end{matrix} \right) \\ & = \left( \begin{matrix} \sqrt{2} & 0 \\ \end{matrix} \right) \left( \begin{matrix} \sqrt{2}\lambda_1^{N-1} \\ 0 \end{matrix} \right) = 2\lambda_1^{N-1} = 2^N\cosh^{N-1}(\beta J) \end{aligned} \] Notice that in the first line of the above we have used the $H=0$ form of the $\mathbf{s}$ vectors that were introduced in part (b). In the final step we have inserted the value of the first eigenvalue, $\lambda_1 = 2\cosh(\beta J)$.
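As a final sanity check we can compare this closed-form result against a brute-force sum over every spin configuration of a short open chain. This is a minimal sketch; $N=8$ is an arbitrary choice small enough to enumerate:

```python
import itertools
import numpy as np

beta_J = 0.5  # arbitrary choice of beta*J
N = 8         # small enough to enumerate all 2^N configurations

# Brute-force partition function for the open (free-ends) chain at H = 0
Z_brute = sum(
    np.exp(beta_J * sum(s[i]*s[i+1] for i in range(N-1)))
    for s in itertools.product([1, -1], repeat=N)
)

# Transfer-matrix result: Z_c = 2^N cosh^{N-1}(beta J)
Z_tm = 2**N * np.cosh(beta_J)**(N-1)

print(np.isclose(Z_brute, Z_tm))  # True
```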
Problem 3
We now note that if $\nu=0$ we must have: \[ \left(e^{\beta J} - 1 - \lambda \right) = 0 \qquad \rightarrow \qquad \lambda = e^{\beta J} - 1 \] This is the eigenvalue for any eigenvector that has $\nu=0$, while $\lambda_1$ is the eigenvalue corresponding to any eigenvector that has $\nu\ne0$.
To confirm these arguments are correct, consider that the trace of the original matrix $\mathbf{P}$ is equal to: \[ \textrm{Tr}(\mathbf{P}) = pe^{\beta J} \] This should equal the sum of the eigenvalues, counted with their multiplicities, and indeed: \[ \lambda_1 + (p-1)\lambda = (e^{\beta J} - 1 + p ) + (p-1)(e^{\beta J} - 1) = pe^{\beta J} + (p-1) - (p-1) = pe^{\beta J} \] as we know, the trace of a matrix is equal to the sum of its eigenvalues.
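These eigenvalues are easy to check numerically for a particular $p$. The sketch below assumes the $p\times p$ transfer matrix has $e^{\beta J}$ on the diagonal and $1$ everywhere else, which is consistent with the trace and eigenvalues quoted above; the values of $p$ and $\beta J$ are arbitrary:

```python
import numpy as np

beta_J = 0.5  # arbitrary choice of beta*J
p = 4         # arbitrary number of Potts states

# p x p transfer matrix: e^{beta J} on the diagonal, 1 off the diagonal
P = (np.exp(beta_J) - 1) * np.eye(p) + np.ones((p, p))

evals = np.sort(np.linalg.eigvalsh(P))

# (p-1)-fold degenerate eigenvalue e^{beta J} - 1, plus e^{beta J} - 1 + p
expected = np.sort([np.exp(beta_J) - 1] * (p - 1) + [np.exp(beta_J) - 1 + p])
print(np.allclose(evals, expected))               # True
print(np.isclose(np.trace(P), p * np.exp(beta_J)))  # trace = p e^{beta J}
```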