
Amelia Carolina Sparavigna (Department of Applied Science and Technology, Politecnico di Torino)

Published in compu.philica.com

Abstract
Entropy plays a relevant role in several applications of information theory and in image processing. Here we discuss the Kaniadakis entropy for images. An example of bi-level image thresholding obtained by means of this entropy is also given. Keywords: Kaniadakis Entropy, Data Segmentation, Image Processing, Thresholding.

Article body

Introduction

In the last twelve years, considerable research has been devoted to the foundations and applications of the generalized statistical theory based on the κ-distribution of probabilities [1]. From this distribution we can obtain an entropy, the κ-entropy, also known as the Kaniadakis entropy, named after Giorgio Kaniadakis (Politecnico di Torino), who proposed the generalized statistics [2]. Like the well-known Tsallis entropy [3], the κ-entropy reduces to the entropy given by Shannon in 1948 when its entropic index κ goes to zero. Shannon and Tsallis entropies are widely used for bi-level and multi-level thresholding in image processing [4-8]. It is therefore interesting to define the Kaniadakis entropy for this purpose too.

The Kaniadakis entropy, also known as κ-entropy [1,2], is given as:

S_\kappa = -\sum_i p_i \ln_\kappa(p_i), \qquad \ln_\kappa(x) = \frac{x^{\kappa} - x^{-\kappa}}{2\kappa} \qquad (1)

In (1), we have the κ-logarithm of the probabilities {p_i}; the index i runs from 1 to the total number of configurations. This entropy has a generalized additivity. Let us consider two independent systems A and B, with κ-entropies S_κ(A) and S_κ(B). In the limit κ → 0, the Kaniadakis entropy becomes the Shannon entropy, and therefore we expect to find the usual additivity of Shannon entropies.
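As a quick numerical illustration, the κ-entropy can be computed directly from a probability distribution. A minimal sketch in Python with NumPy (the function names are ours), checking that a small κ recovers the Shannon value:

```python
import numpy as np

def kappa_log(x, kappa):
    # kappa-logarithm: ln_k(x) = (x^k - x^(-k)) / (2k), which tends to ln(x) as k -> 0
    return (x**kappa - x**(-kappa)) / (2.0 * kappa)

def kaniadakis_entropy(p, kappa):
    # S_k = -sum_i p_i ln_k(p_i), taken over the nonzero probabilities
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * kappa_log(p, kappa))

p = [0.5, 0.25, 0.25]
shannon = -sum(q * np.log(q) for q in p)      # Shannon entropy in nats
print(kaniadakis_entropy(p, 1e-6) - shannon)  # ~0: the Shannon limit
print(kaniadakis_entropy(p, 0.5))             # a genuinely different value
```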

According to [9], in the κ-calculus the generalized sum of the entropies of two independent systems A and B is:

S_\kappa(A \cup B) = S_\kappa(A)\, I_\kappa(B) + I_\kappa(A)\, S_\kappa(B) \qquad (2)

In general, I_κ (let us call it “function I ”) is given by:

I_\kappa = \frac{1}{2} \sum_i \left( p_i^{1+\kappa} + p_i^{1-\kappa} \right) \qquad (3)

Therefore, denoting by {p_i} and {q_j} the probabilities of the two systems, we have:

S_\kappa(A \cup B) = \frac{S_\kappa(A)}{2} \sum_j \left( q_j^{1+\kappa} + q_j^{1-\kappa} \right) + \frac{S_\kappa(B)}{2} \sum_i \left( p_i^{1+\kappa} + p_i^{1-\kappa} \right) \qquad (4)

In the limit κ → 0, the function I_κ goes to 1 and the Kaniadakis entropy becomes the Shannon entropy; therefore we must recover the normal additivity:

S(A \cup B) = S(A) + S(B) \qquad (5)

In generalizing to three independent systems [10], we have for instance:

S_\kappa(A \cup B \cup C) = S_\kappa(A)\, I_\kappa(B)\, I_\kappa(C) + I_\kappa(A)\, S_\kappa(B)\, I_\kappa(C) + I_\kappa(A)\, I_\kappa(B)\, S_\kappa(C) + \kappa^2\, S_\kappa(A)\, S_\kappa(B)\, S_\kappa(C) \qquad (6)
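The generalized additivity can be checked numerically on product distributions, using the composition rule S_κ(A∪B) = S_κ(A) I_κ(B) + I_κ(A) S_κ(B), with I_κ = ½ Σ_i (p_i^{1+κ} + p_i^{1−κ}), and the three-system extension, which picks up an extra κ² S_κ S_κ S_κ term. A sketch in Python/NumPy (function names are ours):

```python
import numpy as np

def kappa_log(x, k):
    # kappa-logarithm; tends to ln(x) as k -> 0
    return (x**k - x**(-k)) / (2.0 * k)

def S(p, k):
    # Kaniadakis entropy (probabilities assumed nonzero here)
    p = np.asarray(p, float)
    return -np.sum(p * kappa_log(p, k))

def I(p, k):
    # "function I"; tends to sum(p) = 1 as k -> 0
    p = np.asarray(p, float)
    return 0.5 * np.sum(p**(1 + k) + p**(1 - k))

k = 0.5
pA, pB, pC = np.array([0.6, 0.4]), np.array([0.7, 0.2, 0.1]), np.array([0.5, 0.5])
pAB = np.outer(pA, pB).ravel()    # joint law of independent A and B
pABC = np.outer(pAB, pC).ravel()  # joint law of A, B and C

two = S(pA, k) * I(pB, k) + I(pA, k) * S(pB, k)
three = (S(pA, k) * I(pB, k) * I(pC, k) + I(pA, k) * S(pB, k) * I(pC, k)
         + I(pA, k) * I(pB, k) * S(pC, k)
         + k**2 * S(pA, k) * S(pB, k) * S(pC, k))
print(S(pAB, k) - two, S(pABC, k) - three)  # both ~0
```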

To evaluate the Kaniadakis entropy of a gray-level image, we simply use the histogram of its gray tones (see [4-8] for Shannon and Tsallis). In this manner, the index i indicates the gray level; usually it runs from 0 to 255. In Figure 1 we give the behavior of S_κ, of I_κ and of the ratio R_κ = S_κ / I_κ as functions of the entropic index κ.
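Concretely, these quantities come straight from the normalized histogram. A minimal sketch with NumPy (a random synthetic array stands in for an actual image such as Cameraman):

```python
import numpy as np

# a synthetic 8-bit gray-level image; a real image's pixel array would be used instead
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64))

hist = np.bincount(img.ravel(), minlength=256)  # gray-tone histogram, i = 0..255
p = hist / hist.sum()                           # normalized to probabilities
p = p[p > 0]                                    # empty gray levels do not contribute

kappa = 0.5
S_k = -np.sum(p * (p**kappa - p**(-kappa)) / (2 * kappa))  # kappa-entropy
I_k = 0.5 * np.sum(p**(1 + kappa) + p**(1 - kappa))        # function I
R_k = S_k / I_k                                            # ratio R of Figure 1
print(S_k, I_k, R_k)
```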

Figure 1: The image is Cameraman; on the right, its histogram. The plot gives the behavior of the entropy S, of the function I and of the ratio R as functions of the entropic index. Note that these functions are symmetric in κ.
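The symmetry noted in the caption follows from the κ-logarithm being an even function of κ, so that S_κ = S_{−κ}. A quick check, writing the κ-entropy as S_κ = −Σ_i p_i (p_i^κ − p_i^{−κ})/(2κ):

```python
import numpy as np

def S(p, k):
    # kappa-entropy: S_k = -sum_i p_i (p_i**k - p_i**(-k)) / (2k)
    p = np.asarray(p, float)
    return -np.sum(p * (p**k - p**(-k)) / (2.0 * k))

p = [0.5, 0.3, 0.2]
print(S(p, 0.7), S(p, -0.7))  # identical values: S is even in kappa
```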

An example of bi-level thresholding

In Ref. [11], a method for bi-level thresholding using the Kaniadakis entropy is discussed, and its results are compared to those given by the Tsallis entropy (the two approaches compare positively). In a bi-level thresholding, the gray-level input image is transformed into a bi-level black-and-white image according to a given threshold: pixels having a gray tone above the threshold become white, those having a gray tone below or equal to the threshold become black. Of course, the output image depends on the value of the threshold.
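The transformation itself is a single vectorized comparison; for instance, in NumPy (the 2×2 array is just a toy input):

```python
import numpy as np

img = np.array([[10, 200], [128, 129]], dtype=np.uint8)  # toy gray-level image
threshold = 128
bw = np.where(img > threshold, 255, 0).astype(np.uint8)  # above -> white, else black
print(bw)  # prints [[0 255] [0 255]]
```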

The methods [5-7] for thresholding an image determine the best value of the threshold by maximizing the Shannon or Tsallis entropy. Both the Tsallis and the Kaniadakis entropies have entropic indices that can give different results when the maximum-entropy method is applied to an image. To show this, let us consider an example from a microscope image of a blood film. Figure 2 gives the original image and three bi-level images obtained with different values of the entropic index κ. The image for the index equal to 0.01 is practically the result we would obtain using the Shannon entropy; in it we cannot see the red blood cells, but with an entropic index equal to 2 we have a quite different image. The use of a series of images obtained with different entropic indices is therefore important for segmentation. Let us conclude by remarking that the Kaniadakis entropy has the intuitive behavior of recovering the Shannon entropy when its entropic index goes to zero.
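A maximum-entropy threshold search along these lines can be sketched as follows. This is a simplified Kapur-style criterion adapted to the κ-entropy (Ref. [11] should be consulted for the exact objective); the bimodal histogram below is synthetic:

```python
import numpy as np

def kappa_entropy(p, kappa):
    # kappa-entropy of a normalized histogram slice (zero bins dropped)
    p = p[p > 0]
    return -np.sum(p * (p**kappa - p**(-kappa)) / (2 * kappa))

def best_threshold(hist, kappa):
    # For each candidate level t, normalize the two parts of the histogram
    # and keep the t maximizing the sum of their kappa-entropies
    # (a Kapur-style criterion in the spirit of [5-7, 11]).
    p = hist / hist.sum()
    best_t, best_s = 0, -np.inf
    for t in range(1, 256):
        a, b = p[:t], p[t:]
        if a.sum() == 0 or b.sum() == 0:
            continue
        s = kappa_entropy(a / a.sum(), kappa) + kappa_entropy(b / b.sum(), kappa)
        if s > best_s:
            best_s, best_t = s, t
    return best_t

# synthetic bimodal histogram: dark background near 50, bright objects near 200
x = np.arange(256)
hist = 500 * np.exp(-((x - 50) / 15.0)**2) + 300 * np.exp(-((x - 200) / 20.0)**2)
t = best_threshold(hist, kappa=0.5)
print(t)  # expected to land between the two modes
```

Running the search with several values of κ, as in Figure 2, yields a family of thresholds and hence a family of segmentations to choose from.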

Figure 2: Bi-level thresholding obtained by means of the Kaniadakis entropy, using three different values of the entropic index. For the index equal to 0.01, we have the Shannon limit; note that in this case we cannot see the red blood cells. These cells are well defined in the image obtained when the index is equal to 2.

References

1.    Kaniadakis, G. Theoretical foundations and mathematical formalism of the power-law tailed statistical distributions, Entropy, 2013, 15, 3983-4010.

2.    Kaniadakis, G. Statistical mechanics in the context of special relativity, Phys. Rev. E, 2002, 66, 056125.

3.    Tsallis, C. Introduction to nonextensive statistical mechanics, 2009, Springer.

4.    Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. Journal of Statistical Physics, 1988, 52, 479–487.

5.    Gull, S.F.; Skilling, J. Maximum entropy method in image processing. Communications, Radar and Signal Processing, IEE Proceedings F, 1984, 131, 646-659.

6.    Portes de Albuquerque, M.; Esquef, I.A.; Gesualdi Mello, A.R.; Portes de Albuquerque, M.  Image thresholding using Tsallis entropy. Pattern Recognition Letters, 2004, 25, 1059-1065.

7.    Kapur, J.N.; Sahoo, P.K.; Wong, A.K.C. A new method for gray-level picture thresholding using the entropy of the histogram. Comput. Vision Graphics Image Process., 1985, 29, 273-285.

8.    Jaynes, E.T. Where do we go from here?, in C. Ray Smith and W.T. Grandy, Jr. (eds.), Maximum-entropy and Bayesian methods in inverse problems, 21-58, 1985, D. Reidel Publishing Company.

9.    Scarfone, A.M.; Wada, T. Thermodynamic equilibrium and its stability for microcanonical systems described by the Sharma-Taneja-Mittal entropy, 2005, Phys. Rev. E 72, 026123.

10.  Sparavigna, A.C. On the generalized additivity of Kaniadakis entropy, International Journal of Sciences, 2015, 4(2), 44-48. DOI: 10.18483/ijSci.627.

11.  Sparavigna, A.C. Shannon, Tsallis and Kaniadakis entropies in bi-level image thresholding, International Journal of Sciences, 2015, 4(2), 35-43. DOI: 10.18483/ijSci.626.

This Article was published on 28th February, 2015 at 11:29:41 and has been viewed 2491 times.

1 Author comment added 27th October, 2015 at 09:48:14

References 10 and 11 have been printed:

[10] A.C. Sparavigna (2015). On The Generalized Additivity Of Kaniadakis Entropy, International Journal of Sciences 4(2):44-48 DOI: 10.18483/ijSci.627

[11] A.C. Sparavigna (2015). Shannon, Tsallis And Kaniadakis Entropies In Bi-level Image Thresholding, International Journal of Sciences 4(2):35-43 DOI: 10.18483/ijSci.626


3 Author comment added 20th March, 2016 at 07:34:39

The Kaniadakis entropy is, like the Tsallis entropy, a generalized entropy. Some relations exist between them, such as those giving the conditional Kaniadakis entropy. These relations were deduced in the paper:

Sparavigna, A. C. (2015). Relations Between Tsallis and Kaniadakis Entropic Measures and Rigorous Discussion of Conditional Kaniadakis Entropy, International Journal of Sciences, 4(10):47-50 DOI: 10.18483/ijSci.866
