International Journal of Advance Interdisciplinary Research

ISSN (Online): 3107-913X

Exploring Information Measures: A Comparative Study of Entropies

Author: Dr. Pratima Singh

 

Abstract

Entropy is a fundamental concept in information theory and statistical mechanics, quantifying uncertainty, diversity, or disorder in a system. While Shannon entropy is the classical measure, generalized entropies such as Rényi, Tsallis, and Sharma–Mittal offer flexible frameworks for complex systems exhibiting non-extensive, multifractal, or correlated behavior. This paper presents a comparative analysis of these entropies, highlighting their mathematical formulations, properties, and applications in diverse domains.
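As a concrete illustration of the measures compared in this paper, the sketch below implements the standard textbook formulations of the Shannon, Rényi, Tsallis, and Sharma–Mittal entropies (natural logarithm) for a discrete probability distribution. This is a minimal illustrative sketch, not code from the paper; the function names and the choice of log base are the author of this example's assumptions.

```python
import math

def shannon(p):
    """Shannon entropy H = -sum_i p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, alpha):
    """Rényi entropy H_a = log(sum_i p_i^a) / (1 - a), for a != 1."""
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), for q != 1."""
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

def sharma_mittal(p, q, r):
    """Sharma–Mittal entropy; recovers Rényi as r -> 1 and Tsallis as r -> q."""
    s = sum(pi ** q for pi in p)
    return (s ** ((1 - r) / (1 - q)) - 1) / (1 - r)

# For the uniform distribution over n outcomes, Shannon and Rényi
# entropies both equal log n, regardless of the Rényi order alpha.
p = [0.25, 0.25, 0.25, 0.25]
```

In the limit as their order parameter tends to 1, the Rényi and Tsallis entropies both reduce to the Shannon entropy, which provides a simple numerical sanity check on the implementations.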
