Problem 2.

$$K_1 = \begin{pmatrix} 0 & 0 & -i \\ 0 & 0 & 0 \\ i & 0 & 0 \end{pmatrix}, \qquad K_2 = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

and the normalized state in $\mathbb{C}^3$,

$$|\psi\rangle = \frac{1}{\sqrt{3}} \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}.$$

Find the left- and right-hand sides of the entropic inequality.


Solution.

The entropic inequality is given by:

$$S(\rho) \leq \ln(n)$$

where $n$ is the dimension of $\rho$.
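For instance, the maximally mixed state $\rho = I/3$ on $\mathbb{C}^3$ saturates this bound; a minimal NumPy sketch (assuming natural logarithms throughout, as in the entropy formulas below):

```python
import numpy as np

# Maximally mixed state on C^3: rho = I/3, which saturates S(rho) <= ln(n)
n = 3
rho = np.eye(n) / n

# Entropy computed from the eigenvalues of rho (all equal to 1/3 here)
evals = np.linalg.eigvalsh(rho)
S = float(-(evals * np.log(evals)).sum())

print(S, np.log(n))  # both are ln(3) ~ 1.0986
```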

In this case we are interested in the Shannon information entropy associated with the Hermitian matrices. This requires determining the spectral decomposition, i.e., the eigenvectors. Moreover, what we actually want to calculate is the entropic uncertainty relation for the two Hermitian matrices acting on this Hilbert space.


The Shannon information entropy for a Hermitian matrix $O$ is given by:

$$S_{|\psi\rangle}(O) = -\sum_{i=1}^{n} p_i(O,|\psi\rangle)\,\ln\big(p_i(O,|\psi\rangle)\big)$$

where the probability $p_i$ is obtained from:

$$p_i(O,|\psi\rangle) = |\langle\psi|\alpha_i\rangle|^2$$

with $|\alpha_i\rangle$ being the $i$th eigenvector of $O$.

Therefore we first need to find the eigenspectra of the matrices $K_1$ and $K_2$.

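A NumPy sketch of this step (taking $K_1$ with the $-i$ entry required for hermiticity; `numpy.linalg.eigh` is the appropriate routine for Hermitian matrices and returns the orthonormal eigenvectors as columns):

```python
import numpy as np

# K1 written with the -i entry that makes it Hermitian; K2 = diag(1, 0, 1)
K1 = np.array([[0,  0, -1j],
               [0,  0,  0 ],
               [1j, 0,  0 ]])
K2 = np.diag([1.0, 0.0, 1.0])

# eigh returns eigenvalues in ascending order and eigenvectors as columns
w1, alpha = np.linalg.eigh(K1)
w2, beta = np.linalg.eigh(K2)

print(w1)  # eigenvalues of K1: -1, 0, 1
print(w2)  # eigenvalues of K2: 0, 1, 1
```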

Now normalize the eigenvectors, since we want the probabilities: $p(K,|\psi\rangle) = |\langle\psi|\alpha\rangle|^2$.

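A sketch of the probability computation (the eigenvector columns returned by `eigh` are already unit-normalized, so no extra normalization step is needed here):

```python
import numpy as np

K1 = np.array([[0, 0, -1j], [0, 0, 0], [1j, 0, 0]])
K2 = np.diag([1.0, 0.0, 1.0])
psi = np.ones(3) / np.sqrt(3)  # |psi> = (1, 1, 1)/sqrt(3)

_, alpha = np.linalg.eigh(K1)  # eigenvector columns, unit-normalized
_, beta = np.linalg.eigh(K2)

# p_i = |<alpha_i|psi>|^2 : the rows of alpha^dagger are the bras <alpha_i|
p1 = np.abs(alpha.conj().T @ psi) ** 2
p2 = np.abs(beta.conj().T @ psi) ** 2

print(p1, p2)  # every probability comes out to 1/3
```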

Calculating the Shannon entropy for each Hermitian matrix:

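A sketch of the entropy computation, which confirms that each entropy comes out to $\ln(3)$ for this state:

```python
import numpy as np

def shannon(p, eps=1e-12):
    """Shannon entropy -sum_i p_i ln(p_i), skipping zero probabilities."""
    p = np.asarray(p)
    p = p[p > eps]
    return float(-(p * np.log(p)).sum())

K1 = np.array([[0, 0, -1j], [0, 0, 0], [1j, 0, 0]])
K2 = np.diag([1.0, 0.0, 1.0])
psi = np.ones(3) / np.sqrt(3)

_, alpha = np.linalg.eigh(K1)
_, beta = np.linalg.eigh(K2)

S1 = shannon(np.abs(alpha.conj().T @ psi) ** 2)
S2 = shannon(np.abs(beta.conj().T @ psi) ** 2)

print(S1, S2)  # each equals ln(3) ~ 1.0986
```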

Now we want to calculate the terms on the right and left sides of the following inequality:

$$S_{|\psi\rangle}(K_1) + S_{|\psi\rangle}(K_2) \geq -2\ln\Big(\max_{1 \leq i,j \leq n} |\langle\alpha_i|\beta_j\rangle|\Big)$$

Since each probability equals $1/3$, both entropies are $\ln(3)$, so the left-hand side is just:

$$2\ln(3).$$

Calculating the right hand side:

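A sketch of the right-hand-side computation, taking the maximum overlap over all pairs of eigenvectors from the two bases:

```python
import numpy as np

K1 = np.array([[0, 0, -1j], [0, 0, 0], [1j, 0, 0]])
K2 = np.diag([1.0, 0.0, 1.0])

_, alpha = np.linalg.eigh(K1)
_, beta = np.linalg.eigh(K2)

# Overlap matrix |<alpha_i|beta_j>| between the two eigenbases
overlap = np.abs(alpha.conj().T @ beta)
rhs = -2 * np.log(overlap.max())

print(rhs)  # ~ 0, since the shared eigenvector (0, 1, 0) gives overlap 1
```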

$$2\ln(3) \geq 0$$

The right-hand side of the inequality makes sense, since $K_1$ and $K_2$ share the common eigenvector

$$\begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix},$$

so the maximum overlap is $1$ and $-2\ln(1) = 0$.
