Relation Between Mutual Information and Expectation Values

  • #1
blueinfinity
Homework Statement
Alice and Bob share the Bell state
\begin{align*}
|\psi\rangle = \frac{1}{\sqrt{2}}(|00\rangle+|11\rangle).
\end{align*}
Consider the pair of observables
\begin{align*}
\mathcal{O}_A =
\begin{pmatrix}
1 & 0 \\ 0 & \frac{1}{2}
\end{pmatrix}
, \qquad \mathcal{O}_B =
\begin{pmatrix}
1 & 0 \\ 0 & \frac{1}{3}
\end{pmatrix}
.
\end{align*}
Show that the mutual information between Alice and Bob is larger than $(\langle\psi | \mathcal{O}_A \otimes \mathcal{O}_B| \psi\rangle - \langle\psi |\mathcal{O}_A|\psi \rangle \langle\psi |\mathcal{O}_B|\psi \rangle)^2$.
Relevant Equations
The Bell state and observables given above.
I've made progress in obtaining the mutual information. Since the Bell state is pure, $S(\rho_{AB}) = 0$, and each reduced density matrix is maximally mixed, so
$$I(\rho_A:\rho_B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB}) = 1 + 1 - 0 = 2.$$
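These entropies are easy to check numerically; here is a minimal sketch, assuming numpy:

```python
import numpy as np

psi = np.array([1, 0, 0, 1]) / np.sqrt(2)   # |psi> = (|00> + |11>)/sqrt(2)
rho_AB = np.outer(psi, psi)                  # density matrix of the pure state

# Partial trace over B: reshape to (i_A, i_B, j_A, j_B) and trace the B indices.
rho_A = rho_AB.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

def S(rho):
    """Von Neumann entropy in bits, dropping zero eigenvalues."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

print(S(rho_A), S(rho_AB))   # 1.0 0.0, so I = 1 + 1 - 0 = 2
```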

I would like to compute the expectation values, but I'm facing a problem in the case of $\langle\psi |\mathcal{O}_A|\psi \rangle$: the sizes of the matrices in this multiplication do not match. Namely, $\langle\psi|$ is of size $1\times 4$, $|\psi \rangle$ is of size $4\times 1$, and the matrix $\mathcal{O}_A$ is $2 \times 2$.
I'm very new to the subject, and I would greatly appreciate some guidance on how the computation of this expectation value is carried out.

Additionally, I have computed $\langle\psi | \mathcal{O}_A \otimes \mathcal{O}_B| \psi\rangle$ by first forming the tensor product of the two matrices $\mathcal{O}_A, \mathcal{O}_B$ and then multiplying by the bra and ket of the state, deducing
$$\langle\psi | \mathcal{O}_A \otimes \mathcal{O}_B| \psi\rangle = \frac{7}{12}.$$
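This value can be double-checked numerically; a minimal sketch, assuming numpy:

```python
import numpy as np

psi = np.array([1, 0, 0, 1]) / np.sqrt(2)
O_A = np.diag([1.0, 1/2])
O_B = np.diag([1.0, 1/3])

# <psi| O_A (x) O_B |psi> via the Kronecker product of the two matrices
print(psi @ np.kron(O_A, O_B) @ psi)   # 0.58333... = 7/12
```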

I would appreciate any insight on this.
 
  • #2
Welcome!

You need two hashes as the delimiter for inline LaTeX.
 
  • #3

I think that for ##\langle\psi |\mathcal{O}_A|\psi \rangle## you calculate ##\langle\psi | \mathcal{O}_A \otimes \mathcal I| \psi\rangle##, and for ##\langle\psi |\mathcal{O}_B|\psi \rangle##, it is ##\langle\psi |\mathcal I \otimes \mathcal{O}_B | \psi\rangle##.
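To make this concrete, here is a minimal numerical sketch (assuming numpy) that pads each observable with the identity on the other factor and evaluates both sides of the claimed inequality:

```python
import numpy as np

psi = np.array([1, 0, 0, 1]) / np.sqrt(2)
O_A = np.diag([1.0, 1/2])
O_B = np.diag([1.0, 1/3])
I2  = np.eye(2)

exp_A  = psi @ np.kron(O_A, I2) @ psi    # <psi| O_A (x) I |psi>  = 3/4
exp_B  = psi @ np.kron(I2, O_B) @ psi    # <psi| I (x) O_B |psi>  = 2/3
exp_AB = psi @ np.kron(O_A, O_B) @ psi   # 7/12

rhs = (exp_AB - exp_A * exp_B) ** 2      # (7/12 - 1/2)^2 = (1/12)^2
print(exp_A, exp_B, rhs)                 # 0.75 0.666... 0.00694...
# 1/144 is comfortably below the mutual information I = 2.
```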

(But I am new to it, too.)
 

1. What is mutual information?

Mutual information is a measure of the amount of information shared between two random variables. It measures the degree of dependence between the two variables and is often used in information theory and statistical analysis.
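In symbols, for discrete random variables ##X## and ##Y## with joint distribution ##p(x,y)##,
$$I(X;Y) = \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)},$$
which vanishes exactly when ##X## and ##Y## are independent.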

2. How is mutual information related to expectation values?

Mutual information is related to expectation values through the concept of entropy. Entropy measures the uncertainty or randomness of a random variable and equals the expectation value of the negative logarithm of its probability distribution. Mutual information can then be computed from the entropies of the joint distribution of the two variables and of each variable's marginal distribution.
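Explicitly, with ##H## denoting Shannon entropy,
$$I(X;Y) = H(X) + H(Y) - H(X,Y), \qquad H(X) = \mathbb{E}\left[-\log p(X)\right],$$
which is the classical counterpart of the quantum formula ##I(\rho_A:\rho_B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB})## used above.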

3. Can mutual information be negative?

No. Mutual information is always non-negative, and it equals zero exactly when the two variables are independent. Even a negative correlation, where one variable tends to decrease as the other increases, produces positive mutual information, because any statistical dependence, positive or negative, reduces the uncertainty about one variable once the other is known.

4. How is mutual information used in machine learning?

Mutual information is used in machine learning as a measure of feature relevance. It can help identify which features or variables are most important in predicting a target variable, and can be used for feature selection and dimensionality reduction.
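For instance, here is a minimal sketch of MI-based feature scoring, assuming scikit-learn is available (the synthetic data is purely illustrative):

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                               # three candidate features
y = (X[:, 0] + 0.1 * rng.normal(size=500) > 0).astype(int)  # label depends on feature 0 only

scores = mutual_info_classif(X, y, random_state=0)
print(scores)  # feature 0 should score well above the two irrelevant ones
```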

5. What is the relationship between mutual information and conditional mutual information?

Conditional mutual information measures the amount of information shared between two variables, given the value of a third. It is related to ordinary mutual information through the chain rule of information theory, which decomposes the information one variable shares with a pair of others into an unconditional mutual-information term and a conditional one, as written out below.
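In symbols, one form of the chain rule is
$$I(X; Y, Z) = I(X; Z) + I(X; Y \mid Z),$$
so the information ##X## shares with the pair ##(Y, Z)## splits into what it shares with ##Z## alone plus what it additionally shares with ##Y## once ##Z## is known.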
