Equations are not being displayed properly on some articles. We hope to have this fixed soon. Our apologies.

Sparavigna, A. (2015). Conditional Kaniadakis Entropy: a Preliminary Discussion. PHILICA.COM Article number 524.

ISSN 1751-3030  
Log in  
Register  
  1209 Articles and Observations available | Content last updated 23 October, 06:12  
Philica entries accessed 3 371 449 times  


NEWS: The SOAP Project, in collaboration with CERN, are conducting a survey on open-access publishing. Please take a moment to give them your views

Submit an Article or Observation

We aim to suit all browsers, but recommend Firefox particularly:

Conditional Kaniadakis Entropy: a Preliminary Discussion

Amelia Carolina Sparavignaunconfirmed user (Department of Applied Science and Technology, Politecnico di Torino)

Published in compu.philica.com

Abstract
Conditional entropies are fundamental for evaluating the mutual information of random variables. These entropies must be properly defined in the case of nonadditive entropies. Here, we propose the conditional entropy for one of them, the Kaniadakis entropy. Keywords: Mutual Information, Entropy, Kaniadakis Entropy, Generalized additivity.

Article body

 

Conditional Kaniadakis Entropy: a Preliminary Discussion

 

Amelia Carolina Sparavigna

Department of Applied Science and Technology, Politecnico di Torino, Torino, Italy

Conditional entropies are fundamental for evaluating the mutual information of  random variables. These entropies must be properly defined in the case of  nonadditive entropies. Here, we propose the conditional entropy for one of them, the Kaniadakis entropy.  

Keywords: Mutual Information, Entropy, Kaniadakis Entropy, Generalized additivity.

 

An entropy is additive when the entropy of the union of two independent subsystems, X and Y, is the sum of the subsystems entropies, that is S(X,Y)=S(X)+S(Y). Among the generalized entropies, there are some for which this additivity does not hold. Two nonadditive entropies are the Tsallis and the Kaniadakis entropies [1-3]. The use of them is quite interesting for applications, because they have entropic indices that can be used to tune behaviour of the different contributing variables (see for instance their use in image processing [4]). Tsallis entropy is a generalization of the standard Boltzmann–Gibbs entropy, introduced in 1988 as a basis for generalizing the standard statistical mechanics, whereas the Kaniadakis entropy, also known as κ-entropy, emerged in the context of the special relativity. Both entropies possess a generalized sum [5].

When nonadditive entropies are involved, in the calculus of the mutual information, the conditional entropies must be properly defined. Let us remember that the mutual information I(X;Y) of two random variables X,Y is providing a measure of the mutual dependence of the variables [6]. If X and Y are independent, knowing X does not give any information about Y and vice versa: the mutual information is zero.

The conditional entropy (or equivocation) is quantifying the information needed to describe the outcome of a random variable Y if  the value of another random variable X is known: it is written as H(Y|X). Let us assume the joint entropy H(X,Y) for the combined system determined by two random variables X and Y. We need H(X,Y) “bits of information” to describe its exact state [7]. If we first learn the value of X, we have gained H(X) bits of information. “Once X is known, we only need H(X,Y)−H(X) bits to describe the state of the whole system” [7]. This quantity is exactly H(Y|X), which gives the chain rule of conditional entropy: H(Y|X)=H(X,Y)−H(X) . The mutual information is then given as  I(X;Y)=H(X)+H(Y)−H(X,Y), with the following properties, I(X;Y)=I(Y;X) and I(X;X)=H(X) [7].  When H is the Shannon entropy S and X,Y are independent, we have that S(X,Y)=S(X)+S(Y), and therefore I(X;Y)=0.

Let us to investigate the mutual information with the generalized entropies, in particular with the nonadditive entropies. In the case of the generalized entropies, it is defined by the so-called Tsallis mutual entropy [8]:

 (1)              MT(X;Y)=T(X)−T(X|Y)=T(Y)−T(Y|X)

In (1), T is referring to the Tsallis entropy. According to [8], T(X,Y)=T(X)+T(Y|X) and T(Y,X)=T(Y)+T(X|Y). Let us remember that Tsallis entropy and Rényi entropy [9] are linked:

 


 

Here q is the entropic index. We have the probabilities {pi}, where  index i is running from 1 to the total number of configurations. As q approaches 1, the Tsallis entropy becomes the Shannon entropy.

If X,Y are independent, we must have a mutual information equal to zero. Is it possible to write I(X;Y)=T(X)+T(Y)−T(X,Y), as for the Shannon entropy, and have it equal to zero? Let us calculate:

 (5)       T(X)+T(Y)−T(X,Y)=(1−q)T(X)T(Y)

Since:

(6)       T(X,Y)=T(X)+T(Y)+(1−q)T(X)T(Y)

Therefore I(X;Y) defined in this manner is different from zero. In his paper, Tsallis is discussing the problem of correlated systems too [1]. He used the Rényi entropy  for correlated systems:

 

It is easy to see that, if X,Y are independent, Γ is equal to zero. Let us note that it is function Γ which seems working as the mutual information. However, a quite useful formula was given in [10], by S. Abe and A.K. Rajagopal,. In this reference, it is the nonadditive conditional entropy, which is defined, so that:

 

 For X,Y independent variables:

 


Let us consider, instead of the Tsallis entropy T, the  Kaniadakis entropy K=Sκ (κ is the entropic index):

 

We have the probabilities {pi}, where index i is running from 1 to the total number of configurations. The generalized additivity is discussed in [11] and [5]. As in the case of the Tsallis entropy, we have to be careful because of its generalized additivity.  Following the approach of Ref.10, here we propose, for Kaniadakis entropy,  a nonadditive conditional entropy and a mutual entropy as in the following formulas:


When X,Y are independent:

 

From (11) and (12), it is more clear the role of the auxiliary function which is necessary for the Kaniadakis generalized additivity (we have also discussed this function is [12]). Further studies are in progress on this conditional entropy and for evaluating the conditional Kaniadakis entropies for  multivariate problems.

 

References

[1] C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, Journal of Statistical Physics, 1988, 52: 479.

[2] G. Kaniadakis, Statistical mechanics in the context of special relativity, Phys. Rev. E, 2002, 66:056125.

[3] G. Kaniadakis,  Theoretical foundations and mathematical formalism of the power-law tailed statistical distributions, Entropy, 2013, 15:3983.

[4] A.C. Sparavigna, Shannon, Tsallis And Kaniadakis entropies in bi-level image thresholding, Int. J. Sci., 2015, 4(2):35; DOI: 10.18483/ijSci.626

[5] A.C. Sparavigna, On the generalized additivity of Kaniadakis entropy, Int. J. Sci., 2015, 4(2):44; DOI: 10.18483/ijSci.627

[6] Vv. Aa., Mutual information, Wikipedia, Retrieved 3 October 2015.

[7] Vv. Aa., Conditional entropy, Wikipedia, Retrieved 3 October 2015.

[8] S. Furuichi, Information theoretical properties of Tsallis entropies,  J. Math. Phys. 2006, 47:023302.

[9] A. Rényi, On measures of information and entropy, Proceedings of the fourth Berkeley Symposium on Mathematics, Statistics and Probability 1960. pp. 547–561.

[10] S. Abe, A.K. Rajagopal, Nonadditive conditional entropy and its significance for local realism, arXiv:quant-ph/0001085, 24 Jan 2000.

[11] A.M. Scarfone, T. Wada, Thermodynamic equilibrium and its stability for microcanonical systems described by the Sharma-Taneja-Mittal entropy, 2005, Phys. Rev. E 72:026123.

[12] Sparavigna, A. (2015). Kaniadakis Entropy and Images. PHILICA.COM Article number 462.

 

 

Information about this Article
This Article has not yet been peer-reviewed
This Article was published on 5th October, 2015 at 17:49:39 and has been viewed 1586 times.

Creative Commons License
This work is licensed under a Creative Commons Attribution 2.5 License.
The full citation for this Article is:
Sparavigna, A. (2015). Conditional Kaniadakis Entropy: a Preliminary Discussion. PHILICA.COM Article number 524.


<< Go back Review this ArticlePrinter-friendlyReport this Article


1 Author comment added 29th October, 2015 at 19:12:24

Detailed discussions of mutual entropies in the formalisms of the generalized entropies of Tsallis and Kaniadakis are proposed in the following two articles:

A.C. Sparavigna (2015). Mutual Information and Nonadditive Entropies: The Case of Tsallis Entropy, International Journal of Sciences 4(10):1-4 DOI: 10.18483/ijSci.845

A.C. Sparavigna (2015). Mutual Information and Nonadditive Entropies: A Method for Kaniadakis Entropy, International Journal of Sciences 4(10):5-8 DOI: 10.18483/ijSci.846


2 Author comment added 10th November, 2015 at 19:20:16

Formulas of rigorous conditional enntropy is given in the paper:
A.C. Sparavigna (2015). Relations Between Tsallis and Kaniadakis Entropic Measures and Rigorous Discussion of Conditional Kaniadakis Entropy, International Journal of Sciences 4(10):47-50 DOI: 10.18483/ijSci.866




Website copyright © 2006-07 Philica; authors retain the rights to their work under this Creative Commons License and reviews are copyleft under the GNU free documentation license.
Using this site indicates acceptance of our Terms and Conditions.

This page was generated in 0.3659 seconds.