
Some Relations Among Entropy Measures

Amelia Carolina Sparavigna (Department of Applied Science and Technology, Politecnico di Torino, Torino, Italy)

Published in compu.philica.com


Abstract: Several entropies generalize the Shannon entropy and have it as their limit as their entropic indices approach specific values. Here we discuss some relations existing among the Rényi, Tsallis and Kaniadakis entropies, and show how the Shannon entropy becomes the limit of the Kaniadakis entropy.

Keywords: Entropy, Generalized Entropies.

 

In information theory, measures of information are obtained from the probability distribution of the events contained in a sample set of possible events. These measures are the entropies. In 1948 [1], Claude Shannon defined the entropy H of a discrete random variable X as:

$$H(X) = \sum_i p_i\, I(x_i) = -\sum_i p_i \log_b p_i \qquad (1)$$

In this expression, p_i = p(x_i) is the probability of the i-th event, I(x_i) = -log_b p_i is its self-information, and b is the base of the logarithm used. Common values of the base are 2, Euler's number e, and 10; in the following we use natural logarithms.

Besides the Shannon entropy, several other entropies are used in information theory; here we discuss a few of them, stressing how they are linked. These generalized entropies are the Rényi, Tsallis and Kaniadakis (also known as κ-entropy) entropies [2-4]. The following formulas show how these entropies are defined, with the corresponding choice of measurement units equal to 1:

$$R_q = \frac{1}{1-q}\,\ln \sum_i p_i^{\,q} \qquad (2)$$

$$T_q = \frac{1}{q-1}\left(1-\sum_i p_i^{\,q}\right) \qquad (3)$$

$$K_\kappa = -\sum_i p_i\,\frac{p_i^{\,\kappa}-p_i^{-\kappa}}{2\kappa} = \frac{1}{2\kappa}\sum_i \left(p_i^{\,1-\kappa}-p_i^{\,1+\kappa}\right) \qquad (4)$$
In (2)-(4) we have the entropic indices q and κ, and:

$$\lim_{q\to 1} R_q = \lim_{q\to 1} T_q = H, \qquad \lim_{\kappa\to 0} K_\kappa = H$$
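As a numerical illustration of (1)-(4) and of these limits, here is a minimal Python sketch; it assumes NumPy, and the function names and the test distribution are ours, chosen only for the example:

import numpy as np

def shannon(p):
    # Shannon entropy, eq. (1), with natural logarithm
    return -np.sum(p * np.log(p))

def renyi(p, q):
    # Rényi entropy, eq. (2)
    return np.log(np.sum(p**q)) / (1.0 - q)

def tsallis(p, q):
    # Tsallis entropy, eq. (3)
    return (1.0 - np.sum(p**q)) / (q - 1.0)

def kaniadakis(p, k):
    # Kaniadakis (kappa-)entropy, eq. (4)
    return np.sum(p**(1.0 - k) - p**(1.0 + k)) / (2.0 * k)

p = np.array([0.5, 0.3, 0.2])
print(shannon(p))                             # 1.0297...
print(renyi(p, 1.0001), tsallis(p, 1.0001))   # both approach the Shannon value as q -> 1
print(kaniadakis(p, 0.0001))                  # approaches the Shannon value as kappa -> 0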
Let us consider the joint entropy of two independent subsystems A, B. We have:

$$H(A,B) = H(A)+H(B) \qquad (5)$$

$$R_q(A,B) = R_q(A)+R_q(B) \qquad (6)$$

$$T_q(A,B) = T_q(A)+T_q(B)+(1-q)\,T_q(A)\,T_q(B) \qquad (7)$$

$$K_\kappa(A,B) = K_\kappa(A)\,K^{*}_\kappa(B)+K^{*}_\kappa(A)\,K_\kappa(B) \qquad (8)$$

Note that, for the generalized additivity (8) of the κ-entropy, we need another function containing the probabilities (see [5] and references therein), denoted K* in the following; for each subsystem it is given by:

$$K^{*}_\kappa = \sum_i p_i\,\frac{p_i^{\,\kappa}+p_i^{-\kappa}}{2} = \frac{1}{2}\sum_i\left(p_i^{\,1-\kappa}+p_i^{\,1+\kappa}\right)$$
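The generalized additivity (8) can be checked numerically by building the joint distribution of two independent subsystems as the outer product of their distributions; a minimal sketch, again assuming NumPy (pA and pB are arbitrary test distributions):

import numpy as np

def kaniadakis(p, k):
    # kappa-entropy, eq. (4)
    return np.sum(p**(1.0 - k) - p**(1.0 + k)) / (2.0 * k)

def kstar(p, k):
    # the function K* appearing in the additivity rule (8)
    return 0.5 * np.sum(p**(1.0 - k) + p**(1.0 + k))

k = 0.4
pA = np.array([0.6, 0.4])
pB = np.array([0.7, 0.2, 0.1])
pAB = np.outer(pA, pB).ravel()   # joint distribution of the independent pair (A, B)

lhs = kaniadakis(pAB, k)
rhs = kaniadakis(pA, k) * kstar(pB, k) + kstar(pA, k) * kaniadakis(pB, k)
print(lhs, rhs)                  # the two values coincide, as stated by eq. (8)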
Let us now find relations between the R and T entropies. From (3) we have $\sum_i p_i^{\,q} = 1+(1-q)\,T_q$, so that (2) gives:

$$R_q = \frac{1}{1-q}\,\ln\left[1+(1-q)\,T_q\right] \qquad (9)$$

Moreover:

$$T_q = \frac{1}{1-q}\left[e^{(1-q)\,R_q}-1\right] \qquad (10)$$
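A quick numerical check of (9) and (10), again a sketch assuming NumPy (the distribution and the index q are arbitrary test values):

import numpy as np

p = np.array([0.5, 0.3, 0.2])
q = 0.7
R = np.log(np.sum(p**q)) / (1.0 - q)   # Rényi entropy, eq. (2)
T = (1.0 - np.sum(p**q)) / (q - 1.0)   # Tsallis entropy, eq. (3)

# eq. (9): Rényi entropy recovered from the Tsallis one
print(R, np.log(1.0 + (1.0 - q) * T) / (1.0 - q))
# eq. (10): Tsallis entropy recovered from the Rényi one
print(T, (np.exp((1.0 - q) * R) - 1.0) / (1.0 - q))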

We can find these relations in [3]. Let us consider, instead of the Tsallis entropy T, the κ-entropy K. Writing (4) by means of (3) for the two indices q = 1+κ and q = 1−κ, we have that:

$$K_\kappa = \frac{1}{2}\left(T_{1+\kappa}+T_{1-\kappa}\right) \qquad (11)$$
Moreover:

$$K^{*}_\kappa = 1+\frac{\kappa}{2}\left(T_{1-\kappa}-T_{1+\kappa}\right) \qquad (12)$$
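The same kind of numerical check works for (11) and (12); a sketch assuming NumPy:

import numpy as np

def tsallis(p, q):
    # Tsallis entropy, eq. (3)
    return (1.0 - np.sum(p**q)) / (q - 1.0)

p = np.array([0.5, 0.3, 0.2])
k = 0.3
K = np.sum(p**(1.0 - k) - p**(1.0 + k)) / (2.0 * k)   # kappa-entropy, eq. (4)
Kstar = 0.5 * np.sum(p**(1.0 - k) + p**(1.0 + k))     # the additivity function K*

# eq. (11) and eq. (12): K and K* from the two Tsallis entropies
print(K, 0.5 * (tsallis(p, 1.0 + k) + tsallis(p, 1.0 - k)))
print(Kstar, 1.0 + 0.5 * k * (tsallis(p, 1.0 - k) - tsallis(p, 1.0 + k)))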

Since the Tsallis entropy T is related to the Rényi entropy by (9) and (10), we can easily find the relation of the κ-entropy to the Rényi entropy too:

$$K_\kappa = \frac{1}{2\kappa}\left(e^{\kappa R_{1-\kappa}}-e^{-\kappa R_{1+\kappa}}\right) \qquad (13)$$

$$K^{*}_\kappa = \frac{1}{2}\left(e^{\kappa R_{1-\kappa}}+e^{-\kappa R_{1+\kappa}}\right) \qquad (14)$$
Of course, we can find (13) and (14) directly from the expression of the Rényi measure (2), which gives:

$$\sum_i p_i^{\,1-\kappa} = e^{\kappa R_{1-\kappa}} \qquad (15)$$

$$\sum_i p_i^{\,1+\kappa} = e^{-\kappa R_{1+\kappa}} \qquad (16)$$

Inserting (15) and (16) in the definitions of K and K*, we recover (13) and (14).
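Relations (13)-(16) can be verified in the same way; a sketch assuming NumPy:

import numpy as np

def renyi(p, q):
    # Rényi entropy, eq. (2)
    return np.log(np.sum(p**q)) / (1.0 - q)

p = np.array([0.5, 0.3, 0.2])
k = 0.3
K = np.sum(p**(1.0 - k) - p**(1.0 + k)) / (2.0 * k)   # kappa-entropy, eq. (4)
Kstar = 0.5 * np.sum(p**(1.0 - k) + p**(1.0 + k))     # the additivity function K*

Rm, Rp = renyi(p, 1.0 - k), renyi(p, 1.0 + k)         # R_{1-k} and R_{1+k}
print(np.sum(p**(1.0 - k)), np.exp(k * Rm))           # eq. (15)
print(np.sum(p**(1.0 + k)), np.exp(-k * Rp))          # eq. (16)
print(K, (np.exp(k * Rm) - np.exp(-k * Rp)) / (2.0 * k))   # eq. (13)
print(Kstar, 0.5 * (np.exp(k * Rm) + np.exp(-k * Rp)))     # eq. (14)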
Let us note that, when κ → 0, R becomes the Shannon entropy. In this manner we can easily see that:

$$\lim_{\kappa\to 0} K_\kappa = H \qquad (17)$$

$$\lim_{\kappa\to 0} K^{*}_\kappa = 1 \qquad (18)$$
Let us consider also the behaviour for small values of κ, obtained by substituting the Shannon entropy H for the Rényi entropies in (13) and (14):

$$K_\kappa \approx \frac{1}{\kappa}\,\sinh\left(\kappa H\right) \qquad (19)$$

$$K^{*}_\kappa \approx \cosh\left(\kappa H\right) \qquad (20)$$

These two equations tell us that the Kaniadakis functions K and K* must be even functions of the index κ.
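To see the limits (17), (18) and the approximations (19), (20) numerically, a small sketch assuming NumPy:

import numpy as np

p = np.array([0.5, 0.3, 0.2])
H = -np.sum(p * np.log(p))   # Shannon entropy, eq. (1)

for k in [0.5, 0.1, 0.01]:
    K = np.sum(p**(1.0 - k) - p**(1.0 + k)) / (2.0 * k)
    Kstar = 0.5 * np.sum(p**(1.0 - k) + p**(1.0 + k))
    # as k decreases, K -> sinh(kH)/k -> H and K* -> cosh(kH) -> 1,
    # in agreement with eqs. (17)-(20)
    print(k, K, np.sinh(k * H) / k, Kstar, np.cosh(k * H))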

Let us conclude with an example. We can consider the histogram h of the grey-tone pixels of an image and use it as an experimental probability distribution p. From it, we can calculate the κ-entropy of the image and the function K* required by the generalized additivity (8), for the entropic index κ ranging from 0 to 1. Comparing them to (19) and (20), we obtain the plots given in Figures 1 and 2. It is interesting to note that the κ-entropy, for low values of its entropic index, behaves as a hyperbolic sine of the Shannon entropy; the function K*, which appears in the generalized additivity, behaves as a hyperbolic cosine.
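The procedure just described can be sketched in Python as follows (assuming NumPy and Pillow; the file name is of course a placeholder):

import numpy as np
from PIL import Image

# grey-tone histogram of the image, used as an experimental probability distribution
img = np.asarray(Image.open("image.png").convert("L"))
h, _ = np.histogram(img, bins=256, range=(0, 256))
p = h[h > 0] / h.sum()              # normalize, dropping empty bins

H = -np.sum(p * np.log(p))          # Shannon entropy of the image

for k in np.linspace(0.05, 1.0, 20):
    K = np.sum(p**(1.0 - k) - p**(1.0 + k)) / (2.0 * k)    # kappa-entropy, eq. (4)
    Kstar = 0.5 * np.sum(p**(1.0 - k) + p**(1.0 + k))      # additivity function of eq. (8)
    print(k, K, np.sinh(k * H) / k, Kstar, np.cosh(k * H)) # compare with (19) and (20)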

 

Figure 1: Comparison of κ-entropy functions to (19) and (20).

 

Figure 2: Entropy K and K* for low values of the entropic index.

References

[1] Shannon, C.E. (1948). A Mathematical Theory of Communication. Bell System Technical Journal 27(3):379-423. DOI: 10.1002/j.1538-7305.1948.tb01338.x

[2] Rényi, A. (1961). On Measures of Information and Entropy. Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, 1960, pp. 547-561.

[3] Tsallis, C. (1988). Possible Generalization of Boltzmann-Gibbs Statistics. Journal of Statistical Physics 52:479-487. DOI: 10.1007/BF01016429. See also: http://www.iaea.org/inis/collection/NCLCollectionStore/_Public/19/064/19064272.pdf

[4] Kaniadakis, G. (2002). Statistical Mechanics in the Context of Special Relativity. Physical Review E 66:056125. DOI: 10.1103/PhysRevE.66.056125

[5] Sparavigna, A.C. (2015). On the Generalized Additivity of Kaniadakis Entropy. International Journal of Sciences 4(2):44-48. DOI: 10.18483/ijSci.627


Information about this Article
This Article has not yet been peer-reviewed
This Article was published on 29th October, 2015 at 17:35:03.

Creative Commons License
This work is licensed under a Creative Commons Attribution 2.5 License.
The full citation for this Article is:
Sparavigna, A. (2015). Some Relations Among Entropy Measures. PHILICA.COM Article number 536.




1 Author comment added 3rd November, 2015 at 19:03:51

A discussion of a general version of Eq. (11) is available in the appendix of the following paper: Kaniadakis, G. (2001). Non-Linear Kinetics Underlying Generalized Statistics. Physica A 296:405-425.


2 Author comment added 12th November, 2015 at 07:44:14

Further discussions published in:

Sparavigna, A.C. (2015). Relations Between Tsallis and Kaniadakis Entropic Measures and Rigorous Discussion of Conditional Kaniadakis Entropy. International Journal of Sciences 4(10):47-50. DOI: 10.18483/ijSci.866
Abstract: Tsallis and Kaniadakis entropies are generalizing the Shannon entropy and have it as their limit when their entropic indices approach specific values. Here we show some relations existing between Tsallis and Kaniadakis entropies. We will also propose a rigorous discussion of the conditional Kaniadakis entropy, deduced from these relations.



