Concepts and Recent Advances in Generalized Information Measures and Statistics

1. Since the introduction of the information measure widely known as Shannon entropy, quantifiers based on information theory and concepts such as entropic forms and statistical complexities have ...


What do Generalized Entropies Look Like? An Axiomatic Approach for Complex, Non-Ergodic Systems

Pp. 81-99 (19)

DOI: 10.2174/9781608057603113010009

Author(s): Stefan Thurner, Rudolf Hanel


Shannon and Khinchin showed that, assuming four information-theoretic axioms, the entropy must be of Boltzmann-Gibbs type, S = −∑i pi log pi. Here we note that in physical systems one of these axioms may be violated: for non-ergodic systems the so-called separation axiom (Shannon-Khinchin axiom 4) will in general not hold. We show that when this axiom is violated the entropy takes a more general form, Sc,d ∝ ∑i Γ(d + 1, 1 − c log pi), where c and d are scaling exponents and Γ(a, b) is the incomplete gamma function. The exponents (c, d) define equivalence classes for all interacting and non-interacting statistical systems and unambiguously characterize any such system in its thermodynamic limit. The proof rests on two newly discovered scaling laws which any entropic form has to fulfill if the first three Shannon-Khinchin axioms hold. A series of known entropies can be classified in terms of these equivalence classes. We show that the corresponding distribution functions are special forms of Lambert-W exponentials containing, as special cases, Boltzmann, stretched exponential, and Tsallis distributions (power laws). In the derivation we assume trace-form entropies, S = ∑i g(pi), with g some function; more general entropic forms can be classified along the same scaling analysis.
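The special cases mentioned above can be checked numerically. The sketch below (an illustration, not code from the chapter) evaluates the unnormalized sum ∑i Γ(d + 1, 1 − c log pi) for integer d, using the closed form of the upper incomplete gamma function at positive integer arguments; the paper's (c, d)-dependent normalization is omitted, so the Boltzmann-Gibbs and Tsallis cases are recovered only up to a constant factor and shift.

```python
import math

def upper_gamma(n, x):
    # Upper incomplete gamma Γ(n, x) for positive integer n, via the
    # closed form Γ(n, x) = (n-1)! e^{-x} Σ_{k=0}^{n-1} x^k / k!
    return math.factorial(n - 1) * math.exp(-x) * sum(
        x**k / math.factorial(k) for k in range(n)
    )

def S_cd(p, c, d):
    # Unnormalized S_{c,d}: Σ_i Γ(d+1, 1 - c log p_i); the
    # (c,d)-dependent prefactor and additive constant are omitted.
    return sum(upper_gamma(d + 1, 1 - c * math.log(pi)) for pi in p if pi > 0)

p = [0.25] * 4  # uniform distribution over 4 states

# (c,d) = (1,1): Γ(2, 1 - log p) = e^{-1} p (2 - log p), so
# e * S_cd - 2 equals the Boltzmann-Gibbs entropy -Σ p log p.
bg = -sum(pi * math.log(pi) for pi in p)
print(math.e * S_cd(p, 1, 1) - 2, bg)  # both ≈ log 4 ≈ 1.3863

# (c,d) = (q,0): Γ(1, 1 - q log p) = e^{-1} p^q, recovering the
# Tsallis sum Σ p^q up to the same constant factor.
print(math.e * S_cd(p, 0.5, 0), sum(pi**0.5 for pi in p))  # both 2.0
```

For non-integer d the incomplete gamma function needs a numerical routine (e.g. scipy.special.gammaincc times gamma), but the integer cases already cover the Boltzmann-Gibbs (c, d) = (1, 1) and Tsallis (c, d) = (q, 0) classes.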


Keywords: Generalized Entropies; Nonextensive Statistical Mechanics; Complex Systems; Distribution Functions.