
Calculating Mutual Information Using Kaniadakis Entropy

Amelia Carolina Sparavigna (Department of Applied Science and Technology, Politecnico di Torino)

Published in compu.philica.com

Abstract
Mutual information is a quantity that can be easily obtained when Shannon entropies are used, but, if nonadditive entropies are involved, its evaluation is more complex. Among nonadditive entropies, we have the Kaniadakis entropy. In this paper, we propose some examples of a mutual function, which can be tuned by an entropic index and used for calculating mutual information.

Article body

 

 


Keywords: Mutual Information, Entropy, Tsallis Entropy, Kaniadakis Entropy, Generalized Additivity, Image Registration.

 

In probability and information theory, the mutual information of two random variables X,Y is a measure of their mutual dependence. It can be obtained from a signed sum of entropies: I(X;Y)=H(X)+H(Y)−H(X,Y) [1,2], where H is the entropy and H(X,Y) is the joint entropy. If X and Y are independent, the mutual information I(X;Y) must be zero. When we use the Shannon entropy, which is additive, we have H(X,Y)=H(X)+H(Y) for independent variables, and then I(X;Y)=0. If we are using a nonadditive entropy measure, finding the mutual information is not so simple.
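As a numerical illustration of this relation, the short sketch below computes I(X;Y)=H(X)+H(Y)−H(X,Y) from a joint probability matrix; the function names and the example distributions are illustrative choices, not taken from the references.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (in nats) of a probability array; zero entries are ignored."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint probability matrix pxy."""
    px = pxy.sum(axis=1)   # marginal distribution of X
    py = pxy.sum(axis=0)   # marginal distribution of Y
    return shannon_entropy(px) + shannon_entropy(py) - shannon_entropy(pxy.flatten())

# Independent X and Y: the joint distribution factorizes and I(X;Y) = 0.
px = np.array([0.5, 0.5])
py = np.array([0.25, 0.75])
print(mutual_information(np.outer(px, py)))   # ~0.0

# Completely dependent case (Y = X): I(X;Y) = H(X) = ln 2.
pxy = np.array([[0.5, 0.0], [0.0, 0.5]])
print(mutual_information(pxy))                # ~0.693
```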

Among the nonadditive entropies, two are particularly relevant: those proposed by C. Tsallis and by G. Kaniadakis. Both are quite interesting for applications, because they have entropic indices that can be used to tune the contributions of the different variables. The Tsallis entropy is a generalization of the standard Boltzmann-Gibbs entropy, introduced in 1988 as a basis for generalizing standard statistical mechanics [3], whereas the Kaniadakis entropy emerged in the context of special relativity [4,5]. Here, we propose some examples of the use of the Kaniadakis entropy for evaluating mutual information (a detailed discussion of this quantity for the Tsallis and Kaniadakis entropies is given in [6,7]).

The Kaniadakis entropy K is also known as the κ-entropy (κ is the entropic index). When this entropy is used, besides the entropy measure we need another function for expressing the generalized additivity. The entropy measure and this function are:

K(X) = − Σ_i p_i ln_κ(p_i) ,   Z(X) = Σ_i p_i u_κ(p_i) ,
where ln_κ(p) = (p^κ − p^(−κ)) / (2κ) and u_κ(p) = (p^κ + p^(−κ)) / 2 .        (1)

For two independent variables X and Y, the generalized additivity reads K(X,Y) = K(X) Z(Y) + Z(X) K(Y).

In (1), {pi} are the probabilities, and the index i runs from 1 to the total number of configurations.
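A minimal numerical sketch of (1) can be used to check the generalized additivity for independent variables and the Shannon limit for small κ; the function names K and Z and the example probabilities are illustrative.

```python
import numpy as np

def ln_kappa(p, kappa):
    """kappa-logarithm ln_k(p) = (p^k - p^(-k)) / (2k); it tends to ln(p) as k -> 0."""
    return (p**kappa - p**(-kappa)) / (2 * kappa)

def u_kappa(p, kappa):
    """Companion function u_k(p) = (p^k + p^(-k)) / 2; it tends to 1 as k -> 0."""
    return (p**kappa + p**(-kappa)) / 2

def K(p, kappa):
    """Kaniadakis entropy K = -sum_i p_i ln_k(p_i)."""
    p = p[p > 0]
    return -np.sum(p * ln_kappa(p, kappa))

def Z(p, kappa):
    """Companion sum Z = sum_i p_i u_k(p_i), entering the generalized additivity."""
    p = p[p > 0]
    return np.sum(p * u_kappa(p, kappa))

kappa = 0.3
px = np.array([0.2, 0.3, 0.5])
py = np.array([0.6, 0.4])
pxy = np.outer(px, py).flatten()   # joint probabilities of two independent variables

# Generalized additivity for independent X, Y: K(X,Y) = K(X) Z(Y) + Z(X) K(Y).
print(K(pxy, kappa))
print(K(px, kappa) * Z(py, kappa) + Z(px, kappa) * K(py, kappa))

# As kappa -> 0, K reduces to the Shannon entropy (in nats).
print(K(px, 1e-6), -np.sum(px * np.log(px)))
```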

Starting from I(X;Y)=H(X)+H(Y)−H(X,Y), we can define a Tsallis mutual entropy, conveniently renormalized, in the following manner [6]:

 

In this equation, T is the Tsallis entropy. Eq.(2) is obtained in [6] using the conditional entropies [8]:

T(X|Y) = [T(X,Y) − T(Y)] / [1 + (1−q) T(Y)] ,   T(Y|X) = [T(X,Y) − T(X)] / [1 + (1−q) T(X)]        (3)

In (3), q is the Tsallis entropic index.

MT(X;Y) fulfils the features of a mutual information: if X,Y are independent, we have MT(X;Y)=0.
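The renormalization leading to (2) is detailed in [6]; the sketch below only illustrates the conditional-entropy ingredient of (3), checking that T(X)−T(X|Y) vanishes for independent variables (the distributions are illustrative).

```python
import numpy as np

def tsallis(p, q):
    """Tsallis entropy T = (1 - sum_i p_i^q) / (q - 1)."""
    p = p[p > 0]
    return (1.0 - np.sum(p**q)) / (q - 1.0)

def tsallis_conditional(pxy, q):
    """Conditional entropy T(X|Y) = [T(X,Y) - T(Y)] / [1 + (1-q) T(Y)], as in (3)."""
    py = pxy.sum(axis=0)                 # marginal distribution of Y
    txy = tsallis(pxy.flatten(), q)      # joint entropy T(X,Y)
    ty = tsallis(py, q)
    return (txy - ty) / (1.0 + (1.0 - q) * ty)

q = 1.5
px = np.array([0.3, 0.7])
py = np.array([0.4, 0.6])
pxy = np.outer(px, py)   # independent X and Y

# For independent variables T(X|Y) = T(X), so T(X) - T(X|Y) vanishes.
print(tsallis(px, q) - tsallis_conditional(pxy, q))   # ~0.0
```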

Let us use the κ-entropy instead of the Tsallis entropy, and define the Kaniadakis conditional entropies as [9]:

 

 

Repeating the same approach used for the Tsallis entropy, we can obtain a renormalized mutual Kaniadakis entropy as [7]:

 

 

 

Details on (2) and (5), and on how the renormalizations have been determined, are the subject of papers [6,7]. Therefore, let us consider here MK(X;Y) and calculate the mutual information in two cases. Of course, when X and Y are independent, (5) gives zero.

X and Y with complete dependency:

 

From the calculation, we find that Y is completely dependent on X, so that the entropy (information) of Y is also the mutual information.

 

X and Y with partial dependency:

 

 

From the calculation, we find that Y is dependent on X, so that MK(X;Y) is different from zero.
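Since the renormalized expression (5) is not written out here, the following sketch uses the simpler combination K(X)Z(Y)+Z(X)K(Y)−K(X,Y), which also vanishes for independent variables and reduces to the Shannon mutual information as κ tends to zero; it is only an illustration, not the MK(X;Y) of [7], and the joint distributions are illustrative choices, not those of the figures.

```python
import numpy as np

def K(p, kappa):
    """Kaniadakis entropy K = -sum_i p_i (p_i^k - p_i^(-k)) / (2k)."""
    p = p[p > 0]
    return -np.sum(p * (p**kappa - p**(-kappa)) / (2 * kappa))

def Z(p, kappa):
    """Companion sum Z = sum_i p_i (p_i^k + p_i^(-k)) / 2."""
    p = p[p > 0]
    return np.sum(p * (p**kappa + p**(-kappa)) / 2)

def mutual_k(pxy, kappa):
    """Illustrative quantity K(X)Z(Y) + Z(X)K(Y) - K(X,Y): it is zero for
    independent X, Y and tends to the Shannon mutual information as kappa -> 0."""
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    return (K(px, kappa) * Z(py, kappa) + Z(px, kappa) * K(py, kappa)
            - K(pxy.flatten(), kappa))

kappa = 1e-6   # very small entropic index: Shannon limit

# Independent X and Y: the quantity vanishes.
print(mutual_k(np.outer([0.3, 0.7], [0.4, 0.6]), kappa))   # ~0.0

# Complete dependency (Y = X, diagonal joint matrix): in the Shannon limit
# the mutual information equals the entropy of Y.
pxy = np.diag([0.2, 0.3, 0.5])
py = pxy.sum(axis=0)
print(mutual_k(pxy, kappa), -np.sum(py * np.log(py)))      # equal values

# Partial dependency: the quantity is different from zero.
pxy = np.array([[0.30, 0.10],
                [0.05, 0.55]])
print(mutual_k(pxy, kappa))                                # > 0

# The entropic index tunes the value obtained in the dependent case.
for k in (0.001, 0.3, 0.6):
    print(k, mutual_k(pxy, k))
```

The last loop shows how the value obtained in the dependent case can be tuned by the entropic index.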

Let us remark that, when the entropic index κ is very small, the result is that obtained using the Shannon entropy. The images give the single entropies, the joint entropy and the mutual entropy. In a future work, the Kaniadakis mutual information will be proposed for use in image registration, the process of aligning two or more images of the same scene.

 References

[1] J.W. Kay, D. M. Titterington, Statistics and Neural Networks: Advances at the Interface, Oxford University Press, 1999

[2] R. W. Hamming, Coding and Information Theory, Prentice-Hall, 1980

[3] C. Tsallis (1988). Possible Generalization of Boltzmann-Gibbs Statistics, Journal of Statistical Physics  52:479. DOI:10.1007/BF01016429

[4] G. Kaniadakis (2002). Statistical mechanics in the context of special relativity, Phys. Rev. E 66: 056125.  DOI: 10.1103/physreve.66.056125

[5] G. Kaniadakis (2013).  Theoretical foundations and mathematical formalism of the power-law tailed statistical distributions, Entropy 15:3983. DOI: 10.3390/e15103983

[6] A.C. Sparavigna (2015). Mutual Information and Nonadditive Entropies: The Case of Tsallis Entropy, International Journal of Sciences 4(10):1-4. DOI: 10.18483/ijSci.845

[7] A.C. Sparavigna (2015). Mutual Information and Nonadditive Entropies: A Method for Kaniadakis Entropy, International Journal of Sciences 4(10):5-8. DOI: 10.18483/ijSci.846

[8] S. Abe, A.K. Rajagopal (2000). Nonadditive conditional entropy and its significance for local realism, arXiv:quant-ph/0001085, 24 Jan 2000.

[9] A.C. Sparavigna (2015). Conditional Kaniadakis Entropy: a Preliminary Discussion. PHILICA.COM Article number 524.

 

 

Information about this Article
This Article has not yet been peer-reviewed
This Article was published on 30th October, 2015 at 12:48:18.

Creative Commons License
This work is licensed under a Creative Commons Attribution 2.5 License.
The full citation for this Article is:
Sparavigna, A. (2015). Calculating Mutual Information Using Kaniadakis Entropy. PHILICA.COM Article number 537.




1 Author comment added 10th November, 2015 at 19:21:44

The derivation of the conditional entropy is given in the paper:
A.C. Sparavigna (2015). Relations Between Tsallis and Kaniadakis Entropic Measures and Rigorous Discussion of Conditional Kaniadakis Entropy, International Journal of Sciences 4(10):47-50 DOI: 10.18483/ijSci.866



