Problem 2.
Consider the Hermitian matrices

$$K_1 = \begin{pmatrix} 0 & 0 & -i \\ 0 & 0 & 0 \\ i & 0 & 0 \end{pmatrix}, \qquad K_2 = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & -1 \end{pmatrix},$$

and the normalized state $|\psi\rangle = \tfrac{1}{\sqrt{3}}(1, 1, 1)^T$ in $\mathbb{C}^3$.
Find the left- and right-hand sides of the entropic inequality.
Solution.
The entropic inequality is given by:

$$S_1 + S_2 \geq -2 \ln\left( \max_{i,j} \left| \langle v_i^{(1)} | v_j^{(2)} \rangle \right| \right),$$

where $|v_i^{(1)}\rangle$ and $|v_j^{(2)}\rangle$ are the normalized eigenvectors of $K_1$ and $K_2$, and $S_k$ is the Shannon entropy of the measurement probabilities defined below.
In this case we are interested in the Shannon information entropy associated with these Hermitian matrices, which requires determining their spectral decompositions, i.e., their eigenvectors. With those in hand we can evaluate the entropic uncertainty relation for the two Hermitian matrices acting on this Hilbert space.
begin
    using SymPy  # symbolic computation via Python's sympy
end
begin
    𝑖 = sympy.I
    # "Dot" notation: plain product for scalars, matrix dot product for Sym matrices
    ⋅(x, y) = x*y
    ⋅(x::Array{Sym,2}, y) = x.dot(y)
    # Norm of a vector: √⟨x|x⟩
    norm(x) = begin
        braket = x.adjoint()⋅x
        sympy.sqrt(braket[1])
    end
end;
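As a quick usage check (an addition, not part of the original solution): norm applied to the unnormalized vector $(1, 0, i)^T$ should return $\sqrt{2}$:

norm([1; 0; 𝑖])   # returns sqrt(2)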
begin
    # K₁ contains the symbolic 𝑖, so it is already an Array{Sym,2};
    # K₂ has plain integer entries and is wrapped in sympy.Matrix explicitly.
    K₁ = [0 0 -𝑖;
          0 0 0;
          𝑖 0 0];
    K₂ = sympy.Matrix([1 0 0;
                       0 0 0;
                       0 0 -1]);
    # Normalized state |ψ⟩ = (1,1,1)ᵀ/√3
    ψ = 1/sympy.sqrt(3) ⋅ [1;
                           1;
                           1];
end;
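As a sanity check (an addition, not part of the original solution), we can confirm that both matrices are Hermitian and that $\psi$ has unit norm:

begin
    # Kₖ should equal its conjugate transpose, and ⟨ψ|ψ⟩ should be 1
    @assert K₁ == K₁.adjoint()
    @assert K₂ == K₂.adjoint()
    @assert norm(ψ) == 1
end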
The Shannon information entropy for the Hermitian matrix $K_k$ is

$$S_k = -\sum_{i=1}^{3} p_i^{(k)} \ln p_i^{(k)},$$

where the probability $p_i^{(k)} = \left| \langle v_i^{(k)} | \psi \rangle \right|^2$, with $|v_i^{(k)}\rangle$ the normalized eigenvectors of $K_k$. Therefore we first need to find the eigenspectrum of the matrices $K_1$ and $K_2$:
begin
    # eigenvects() returns (eigenvalue, multiplicity, [eigenvectors]) triples
    𝐞₁ = K₁.eigenvects()
    𝐞₂ = K₂.eigenvects()
end;
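For reference, the eigenpairs can also be worked out by hand (a consistency check on the symbolic output):

$$K_1:\ \lambda = 0, \pm 1 \ \text{with}\ (0,1,0)^T,\ \tfrac{1}{\sqrt{2}}(1,0,\pm i)^T; \qquad K_2:\ \lambda = 1, 0, -1 \ \text{with the standard basis}\ e_1, e_2, e_3.$$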
Now normalize the eigenvectors, since the probabilities must be computed from unit vectors:
begin
    # There are 3 eigenvectors for each matrix
    𝐯₁ = Array{Array{Sym}}(undef, 3)
    𝐯₂ = Array{Array{Sym}}(undef, 3)
    for i = 1:3
        # Each entry of eigenvects() is (eigenvalue, multiplicity, [eigenvectors])
        ν₁, ν₂ = 𝐞₁[i][3][1], 𝐞₂[i][3][1]
        𝐯₁[i] = ν₁/norm(ν₁)
        𝐯₂[i] = ν₂/norm(ν₂)
    end
end
Calculating the Shannon entropy for each Hermitian matrix:
begin
    S₁ = 0
    S₂ = 0
    for i = 1:3
        # Measurement probabilities pᵢ⁽ᵏ⁾ = |⟨vᵢ⁽ᵏ⁾|ψ⟩|²
        p¹ᵢ = sympy.Abs(ψ.adjoint()⋅𝐯₁[i])^2
        p²ᵢ = sympy.Abs(ψ.adjoint()⋅𝐯₂[i])^2
        S₁ -= p¹ᵢ ⋅ sympy.log(p¹ᵢ)
        S₂ -= p²ᵢ ⋅ sympy.log(p²ᵢ)
    end
end
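Carrying the sums out by hand as a consistency check: every probability equals $\tfrac{1}{3}$ for both observables, e.g. $\left| \langle \tfrac{1}{\sqrt{2}}(1,0,i) | \psi \rangle \right|^2 = \left| \tfrac{1-i}{\sqrt{6}} \right|^2 = \tfrac{1}{3}$, so

$$S_1 = S_2 = -3 \cdot \tfrac{1}{3} \ln\tfrac{1}{3} = \ln 3.$$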
Now we want to calculate the terms on the left and right sides of the inequality

$$S_1 + S_2 \geq -2 \ln\left( \max_{i,j} \left| \langle v_i^{(1)} | v_j^{(2)} \rangle \right| \right).$$

The left-hand side is then just the sum of the two entropies:
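A one-line sketch (the variable name lhs is chosen here to mirror rhs below):

lhs = S₁ + S₂;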
Calculating the right-hand side:
# −2·ln of the maximal overlap between the two eigenbases
rhs = -2⋅sympy.log(maximum([sympy.Abs(𝐯₁[i].adjoint()⋅𝐯₂[j]) for i=1:3, j=1:3]));
The right-hand side of the inequality makes sense since both matrices share the eigenvector $(0, 1, 0)^T$: the maximal overlap is therefore $1$, giving a trivial bound of $-2\ln 1 = 0$, which the left-hand side $S_1 + S_2 = 2\ln 3$ clearly satisfies.
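Finally, a numeric check of both sides (a sketch; sympy.N evaluates a symbolic expression to a float, and lhs is the variable introduced above):

begin
    # Expect (2·ln 3 ≈ 2.197, 0), consistent with the inequality
    sympy.N(lhs), sympy.N(rhs)
end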