
Sparavigna, A. (2016). A Short Note on Quantum Entropies. PHILICA.COM Article number 558.

ISSN 1751-3030  

A Short Note on Quantum Entropies

Amelia Carolina Sparavigna (Department of Applied Science and Technology, Politecnico di Torino)

Published in physic.philica.com

In quantum statistical mechanics, the extension of the classical Gibbs entropy is the von Neumann entropy, obtained from a quantum-mechanical system described by means of its density matrix. Here we briefly discuss this entropy and the use of generalized entropies in its place.

Article body






Keywords: Entropy, Quantum entropy, von Neumann entropy, Non-additive entropy, Tsallis entropy, Kaniadakis entropy.


Entropy is involved in the classical theory of information in the form of the Shannon entropy. However, a generalized entropy, such as the Tsallis entropy or other entropies, can be used to measure information too. For instance, mutual information is usually obtained from the Shannon entropy, but it can also be estimated from the Tsallis or Kaniadakis entropy [1,2]. In the case of quantum-mechanical systems, information and changes of information are measured by the von Neumann entropy S(ρ) = −Tr(ρ lnρ), where ρ is the density matrix. If we express this matrix in the diagonal basis given by its eigenvectors |i> and eigenvalues ηi, the density matrix becomes ρ = Σi ηi |i><i|, and the von Neumann entropy becomes S = −Σi ηi lnηi, where the summation runs over the eigenvalues.
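As a minimal numerical sketch (not part of the original note), the von Neumann entropy can be computed directly from the eigenvalues of the density matrix, exactly as in the diagonal-basis formula above:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    w = np.linalg.eigvalsh(rho)   # rho is Hermitian
    w = w[w > 1e-12]              # drop zero eigenvalues: 0 ln 0 = 0
    return -np.sum(w * np.log(w))

# Maximally mixed qubit, rho = diag(1/2, 1/2): S = ln 2
print(von_neumann_entropy(np.diag([0.5, 0.5])))  # ≈ 0.6931

# Pure state |0><0|: S = 0
print(von_neumann_entropy(np.diag([1.0, 0.0])))
```

Working with eigenvalues avoids computing a matrix logarithm and handles zero eigenvalues cleanly.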

Of course, the entropy measures of classical information theory can be generalized to the quantum case, giving quantum mutual information, joint entropies, conditional quantum entropies and so on. Given a bipartite quantum state described by the density matrix ρAB, the entropy of the joint system (A,B) is S(ρAB), and the entropies of the subsystems are S(ρA) and S(ρB) respectively. By analogy with the classical conditional entropy, one defines the conditional quantum entropy as S(ρA|B) = S(ρAB) − S(ρB), an entropy containing the conditional density operator ρA|B [3-6].

Let us concentrate on the mutual information obtained from the mutual entropy of a bipartite system. The definition of the quantum mutual entropy is modeled on the classical case too. In classical terms, given two subsystems A and B with joint probability distribution p(a,b), the two marginal distributions are given by: p(a) = Σb p(a,b); p(b) = Σa p(a,b).

The classical mutual information I(A;B) is defined by:

(1)         I(A;B) = S(p(a))+S(p(b))−S(p(a,b)) = S(A)+S(B)−S(A,B).

In (1), S denotes the Shannon entropy. Note that the mutual information measures the dependence or independence of the subsystems A and B: for independent subsystems it is zero. For a generalized entropy, the mutual information is given by a different equation, according to its specific generalized additivity [1,2].

The mutual information can also be described as a relative entropy between p(a,b) and p(a)p(b) [6]. It follows from the property of relative entropy that I(A;B) ≥ 0 and equality holds if and only if p(a,b) = p(a)p(b), which is the condition of independency of the subsets A and B.
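The classical definition (1) can be sketched numerically (an illustration added here, not in the original note), with marginals obtained by summing the joint distribution, as above:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a probability vector/array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p_ab):
    """I(A;B) = S(A) + S(B) - S(A,B), Eq. (1), for a joint distribution p(a,b)."""
    p_a = p_ab.sum(axis=1)   # marginal p(a) = sum_b p(a,b)
    p_b = p_ab.sum(axis=0)   # marginal p(b) = sum_a p(a,b)
    return shannon_entropy(p_a) + shannon_entropy(p_b) - shannon_entropy(p_ab.flatten())

# Independent subsystems, p(a,b) = p(a)p(b): I = 0
print(mutual_information(np.outer([0.5, 0.5], [0.25, 0.75])))

# Fully correlated bits, p(0,0) = p(1,1) = 1/2: I = 1 bit
print(mutual_information(np.array([[0.5, 0.0], [0.0, 0.5]])))  # 1.0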

The quantum-mechanical counterpart is obtained using the von Neumann entropy. The entropy S(A,B) becomes S(ρAB) = −Tr(ρAB lnρAB). Classically, the marginal distributions are obtained from the joint distribution p(a,b) by a simple sum; here, instead, we have a partial trace. One assigns to ρAB a state on the subsystem A by ρA = TrB(ρAB), where TrB is the partial trace with respect to system B. Then the entropy S(ρA) is calculated. S(ρB) is defined in the same manner [6].

An appropriate definition of the quantum mutual information is I(ρAB) = S(ρA)+S(ρB)−S(ρAB). S(ρ) is additive for independent systems: given two density matrices ρA, ρB describing independent systems, we have:

(2)         S(ρAB) = S(ρA ⊗ ρB) = S(ρA)+S(ρB), and I(ρAB) = 0.

Moreover, I(ρAB) ≤ 2 min[S(ρA),S(ρB)], by a corollary of the Araki–Lieb theorem [5]. This implies that quantum systems can be supercorrelated.

Here are some examples [7]. First, let us consider independent particles with ρA = ρB = ½(|0><0|+|1><1|), so that ρAB = ρA ⊗ ρB. Then S(A) = S(B) = −½log2½ −½log2½ = 1, S(A,B) = 2, I(A;B) = 0 and S(A|B) = 1. Let us then consider fully correlated particles: ρAB = ½(|00><00|+|11><11|). The density matrix for A is given by ρA = TrB(ρAB) = ½(|0><0|+|1><1|), so that S(A) = 1 = S(A,B), I(A;B) = 1 and S(A|B) = 0. Finally, we can also have entangled particles: |ψAB> = 1/√2 (|00>+|11>). The density matrix is ρAB = |ψAB><ψAB|, and the density matrix of A is ρA = TrB(ρAB) = ½(|0><0|+|1><1|). Therefore S(A) = 1, S(A,B) = 0, I(A;B) = 2 and S(A|B) = −1 [7].
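The three cases above can be reproduced numerically (a sketch added here, not in the original note), with the partial trace implemented by reshaping the two-qubit density matrix:

```python
import numpy as np

def S(rho):
    """von Neumann entropy in bits (log base 2)."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return -np.sum(w * np.log2(w))

def partial_trace_B(rho_ab, dA=2, dB=2):
    """Trace out subsystem B from a (dA*dB)x(dA*dB) density matrix."""
    return np.trace(rho_ab.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

I2 = np.eye(2) / 2                            # maximally mixed qubit

# 1. Independent particles: rho_AB = rho_A ⊗ rho_B
rho_indep = np.kron(I2, I2)
# 2. Fully correlated particles: rho_AB = (|00><00| + |11><11|)/2
rho_corr = np.diag([0.5, 0.0, 0.0, 0.5])
# 3. Entangled pure state |psi> = (|00> + |11>)/sqrt(2)
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_ent = np.outer(psi, psi)

for name, rho in [("independent", rho_indep), ("correlated", rho_corr), ("entangled", rho_ent)]:
    sA, sAB = S(partial_trace_B(rho)), S(rho)
    # By symmetry S(B) = S(A) in all three cases, so I = 2 S(A) - S(A,B)
    print("%-11s S(A)=%.1f S(A,B)=%.1f I=%.1f S(A|B)=%.1f" % (name, sA, sAB, 2*sA - sAB, sAB - sA))
```

The entangled case indeed gives a negative conditional entropy S(A|B) = −1, the signature of supercorrelation discussed above.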

After these examples with the von Neumann entropy, let us consider the case of the generalized Tsallis entropy [8]. In that paper we find the quantum Tsallis entropy, given by:

(3)          Sq(ρ) = (Tr ρ^q − 1)/(1−q).

Let us remember that the Tsallis entropy is non-additive, that is, for independent subsystems the joint entropy differs from the sum of the entropies. As a consequence, in the quantum case, for product states [8]:

(4)        Sq(ρAB) = Sq(ρA ⊗ ρB) = Sq(ρA) + Sq(ρB) + (1−q) Sq(ρA)Sq(ρB).

And, also in the case of commuting operators, a correlation is induced by this non-additivity [7]. Such a correlation disappears when q→1. The quantum Tsallis entropy has also been discussed in [9].
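The pseudo-additivity (4) is easy to check numerically for a product state (an illustration added here, not in the original note):

```python
import numpy as np

def tsallis_entropy(rho, q):
    """Quantum Tsallis entropy, Eq. (3): S_q(rho) = (Tr rho^q - 1)/(1 - q)."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return (np.sum(w**q) - 1.0) / (1.0 - q)

q = 1.5
rho_A = np.diag([0.5, 0.5])
rho_B = np.diag([0.25, 0.75])
rho_AB = np.kron(rho_A, rho_B)   # product state of independent subsystems

lhs = tsallis_entropy(rho_AB, q)
sA, sB = tsallis_entropy(rho_A, q), tsallis_entropy(rho_B, q)
rhs = sA + sB + (1.0 - q) * sA * sB   # pseudo-additivity, Eq. (4)
print(abs(lhs - rhs) < 1e-10)  # True
```

For q → 1 the extra term (1−q) S_q(A) S_q(B) vanishes and ordinary additivity is recovered, as stated above.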

What can we obtain if we use the Kaniadakis entropy? It is an entropy with the following generalized additivity for independent systems: Sκ(A,B) = Sκ(A)Yκ(B)+Sκ(B)Yκ(A), where Sκ = (Σi pi^(1−κ) − Σi pi^(1+κ))/(2κ) and Yκ = (Σi pi^(1−κ) + Σi pi^(1+κ))/2 (see [10] for a discussion of the generalized additivity of the Tsallis and Kaniadakis entropies, and the references therein). The summations run over the i-states of the system, with probabilities pi. In [11], the quantum Kaniadakis entropy is given in the framework of a generalized additivity involving only the entropies S, not the function Y, because the authors used a formula which is valid only in the case of equiprobable states. In the more general case, then, let us stress that we have to add a quantum Y function:

(5)        Sκ(ρ) = (Tr ρ^(1−κ) − Tr ρ^(1+κ))/(2κ)  ;  Yκ(ρ) = (Tr ρ^(1−κ) + Tr ρ^(1+κ))/2.

And therefore, besides the quantum entropy, we also have to consider another quantity, the quantum version of the function Y, which is fundamental for any discussion of conditional and mutual information in the Kaniadakis formalism [2].



[1] Sparavigna, A.C. (2015). Mutual Information and Nonadditive Entropies: The Case of Tsallis Entropy, International Journal of Sciences 4(10):1-4. DOI: 10.18483/ijSci.845

[2] Sparavigna, A.C. (2015). Relations Between Tsallis and Kaniadakis Entropic Measures and Rigorous Discussion of Conditional Kaniadakis Entropy, International Journal of Sciences 4(10):47-50. DOI: 10.18483/ijSci.866

[3] Cerf, N.J., & Adami, C. (1999).  Quantum extension of conditional probability, Physical Review A 60(2):893-897. DOI: 10.1103/physreva.60.893

[4] Cerf, N.J., & Adami, C. (1997). Negative Entropy and Information in Quantum Mechanics, Physical Review Letters 79(26):5194-5197. DOI: 10.1103/physrevlett.79.5194

[5] Jaeger, G. (2007). Quantum information: an Overview, Springer: New York. ISBN-13: 978-0387357256, ISBN-10: 0387357254

[6] Vv. Aa. (2015). Quantum Mutual Information, Wikipedia.

[7] Bastani, P. (unknown year). Quantum information Theory, Entanglement and measurement, CMPT881. Retrieved on 11/01/2016 from  http://www.sfu.ca/~pbastani/cmpt881.pdf

[8] Abe, S., & Rajagopal, A.K. (1999). Quantum Entanglement Inferred by the Principle of Maximum Tsallis Entropy, Physical Review A 60(5):3461-3466. DOI: 10.1103/PhysRevA.60.3461 

[9] Jankovic, M.V. (2009). Quantum Tsallis entropy and projective measurement, arXiv, eprint arXiv:0904.3794, Bibliographic Code: 2009arXiv0904.3794J

[10] Sparavigna, A.C. (2015).  On the Generalized Additivity of Kaniadakis Entropy, International Journal of Sciences 4(2):44-48. DOI: 10.18483/ijSci.627

[11] Ourabah, K., Hamici-Bendimerad, A.H., & Tribeche, M. (2015).  Quantum Entanglement and Kaniadakis Entropy, Physica Scripta  90(4): 045101. DOI: 10.1088/0031-8949/90/4/045101


Information about this Article
This Article has not yet been peer-reviewed
This Article was published on 11th January, 2016 at 14:29:23 and has been viewed 1226 times.

Creative Commons License
This work is licensed under a Creative Commons Attribution 2.5 License.
