## Book Volume 1

#### List of Contributors

Page: v-ix (5)

Author: Andres M. Kowalski, Raul D. Rossignoli and Evaldo M. F. Curado

#### Introduction - Summary of Contents

Page: x-xiii (4)

Author: Andres M. Kowalski, Raul D. Rossignoli and Evaldo M. F. Curado

#### Heat and Entropy: A Brief History

Page: 3-29 (27)

Author: Evaldo M. F. Curado, Andres M. Kowalski and Raul D. Rossignoli

##### Abstract

We provide a brief description of the historical development of the concept of entropy, from its origins in the theory of heat to its modern interpretation as a special measure of information.

#### Essentials of Information Entropy and Related Measures

Page: 30-56 (27)

Author: Raul D. Rossignoli, Andres M. Kowalski and Evaldo M. F. Curado

##### Abstract

This introductory chapter provides a basic review of the Shannon entropy and of some important related quantities like the joint entropy, the conditional entropy, the mutual information and the relative entropy. We also discuss the Fisher information, the fundamental property of concavity, the basic elements of the maximum entropy approach and the definition of entropy in the quantum case. We close the chapter with the axioms which determine the Shannon entropy and a brief description of other information measures.
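
The quantities reviewed in this chapter have direct computational counterparts. As an illustrative aside (the function names are ours), the Shannon entropy and the mutual information I(X;Y) = H(X) + H(Y) − H(X,Y) can be sketched in a few lines of Python:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H = -sum_i p_i ln p_i (in nats)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint distribution
    given as a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), pr in joint.items():
        px[x] = px.get(x, 0.0) + pr
        py[y] = py.get(y, 0.0) + pr
    return (shannon_entropy(px.values()) + shannon_entropy(py.values())
            - shannon_entropy(joint.values()))

# A product distribution: X and Y independent, so I(X;Y) is numerically zero.
joint = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(mutual_information(joint))
```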

#### The Nonadditive Entropy Sq: A Door Open to the Nonuniversality of the Mathematical Expression of the Clausius Thermodynamic Entropy in Terms of the Probabilities of the Microscopic Configurations

Page: 57-80 (24)

Author: Constantino Tsallis

##### Abstract

Clausius introduced, in the 1860s, a thermodynamical quantity which he named entropy, S. This thermodynamically crucial quantity was proposed to be extensive, i.e., in contemporary terms, S(N) ∝ N in the thermodynamic limit N → ∞. A decade later, Boltzmann proposed a functional form for this quantity which connects S with the occurrence probabilities of the microscopic configurations (referred to as complexions at that time) of the system. Written in modern notation for a system with W possible discrete states, this functional is S_{BG} = −k_B ∑_{i=1}^{W} p_i ln p_i, with ∑_{i=1}^{W} p_i = 1, where k_B is what is nowadays called the Boltzmann constant (BG stands for Boltzmann-Gibbs, also acknowledging that Gibbs provided a wider sense for W). The BG entropy is additive, meaning that if A and B are two probabilistically independent systems, then S_{BG}(A+B) = S_{BG}(A) + S_{BG}(B). For over a century physicists treated these two words, extensive and additive, practically as synonyms, and S_{BG} was considered to be the unique form that S could take; in other words, the functional S_{BG} was considered to be universal. It has become increasingly clear today that this is not so, and that the two words are not synonyms, but happen to coincide whenever we are dealing with paradigmatic Hamiltonians involving short-range interactions between their elements, presenting no strong frustration or other "pathologies". Consistently, it is now accepted that the entropic functional connecting S with the microscopic world is nonuniversal, being dictated instead by the nature of possible strong correlations between the elements of the system. These facts constitute the basis of a generalization of the BG entropy and statistical mechanics, introduced in 1988 and frequently referred to as the nonadditive entropy S_q and nonextensive statistical mechanics, respectively. We briefly review these points herein, and exhibit recent as well as typical applications of these concepts in natural, artificial, and social systems, as shown through theoretical, experimental, observational and computational predictions and verifications.
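
As a concrete aside (our own minimal sketch, with k_B = 1), the nonadditive entropy S_q = (1 − ∑_i p_i^q)/(q − 1) can be checked to obey, for probabilistically independent subsystems, the pseudo-additivity rule S_q(A+B) = S_q(A) + S_q(B) + (1 − q) S_q(A) S_q(B), which reduces to BG additivity in the limit q → 1:

```python
import math

def tsallis_entropy(p, q):
    """Nonadditive entropy S_q = (1 - sum_i p_i**q) / (q - 1), with k_B = 1.
    Recovers the BG entropy -sum_i p_i ln p_i in the limit q -> 1."""
    if q == 1.0:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

# For independent subsystems A and B the joint distribution factorizes,
# and S_q satisfies the pseudo-additivity rule
#   S_q(A+B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B).
q = 2.0
pa = [0.7, 0.3]
pb = [0.6, 0.4]
joint = [a * b for a in pa for b in pb]
lhs = tsallis_entropy(joint, q)
rhs = (tsallis_entropy(pa, q) + tsallis_entropy(pb, q)
       + (1 - q) * tsallis_entropy(pa, q) * tsallis_entropy(pb, q))
print(abs(lhs - rhs))  # numerically zero
```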

#### What do Generalized Entropies Look Like? An Axiomatic Approach for Complex, Non-Ergodic Systems

Page: 81-99 (19)

Author: Stefan Thurner and Rudolf Hanel

##### Abstract

Shannon and Khinchin showed that, assuming four information-theoretic axioms, the entropy must be of Boltzmann-Gibbs type, S = −∑_{i} p_i log p_i. Here we note that in physical systems one of these axioms may be violated: for non-ergodic systems the so-called separation axiom (Shannon-Khinchin axiom 4) will in general not be valid. We show that when this axiom is violated the entropy takes a more general form, S_{c,d} ∝ ∑_{i=1}^{W} Γ(d + 1, 1 − c log p_i), where c and d are scaling exponents and Γ(a, b) is the incomplete gamma function. The exponents (c, d) define equivalence classes for all interacting and non-interacting systems and unambiguously characterize any statistical system in its thermodynamic limit. The proof is possible because of two newly discovered scaling laws which any entropic form has to fulfill if the first three Shannon-Khinchin axioms hold. A series of known entropies can be classified in terms of these equivalence classes. We show that the corresponding distribution functions are special forms of Lambert-W exponentials containing, as special cases, Boltzmann, stretched-exponential and Tsallis distributions (power laws). In the derivation we assume trace-form entropies, S = ∑_{i} g(p_i), with g some function; however, more general entropic forms can be classified along the same scaling analysis.
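
As a small numerical aside (our sketch, with proportionality constants set to one), the case (c, d) = (1, 1) can be checked to recover the Boltzmann-Gibbs form: using the closed expression Γ(2, x) = (x + 1)e^{−x} for the upper incomplete gamma function, the sum ∑_i Γ(2, 1 − ln p_i) collapses to (2 + H_Shannon)/e, i.e. the Shannon entropy up to an affine rescaling:

```python
import math

def s_cd_11(p):
    """S_{c,d} at (c, d) = (1, 1): sum_i Gamma(2, 1 - ln p_i), using the
    closed form Gamma(2, x) = (x + 1) * exp(-x) of the upper
    incomplete gamma function."""
    return sum((2.0 - math.log(pi)) * math.exp(math.log(pi) - 1.0)
               for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
shannon = -sum(pi * math.log(pi) for pi in p)
# For (c, d) = (1, 1) the sum equals (2 + H_Shannon) / e:
print(abs(s_cd_11(p) - (2.0 + shannon) / math.e))  # numerically zero
```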

#### Majorization and Generalized Entropies

Page: 100-129 (30)

Author: Norma Canosa and Raul D. Rossignoli

##### Abstract

We review the concept of majorization and its relation with generalized information measures. Majorization theory provides an elegant framework for comparing two probability distributions, leading to a rigorous concept of disorder which is more stringent than that based on the Shannon entropy. Nevertheless, it is shown that majorization can be fully captured through general entropic inequalities based on generalized entropic forms. A brief review of generalized entropies is also provided. As an illustration, we discuss the majorization properties of generalized thermal distributions derived from generalized entropies, and identify rigorous mixing parameters. We also describe majorization in quantum systems. We discuss in particular its capability for providing a disorder-based criterion for the detection of quantum entanglement, which is stronger than that based on the von Neumann entropy and leads to a generalized entropic separability criterion.
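
The majorization relation reviewed here admits a compact computational check: with both distributions sorted in decreasing order, P majorizes Q exactly when every partial sum of P dominates the corresponding partial sum of Q. A minimal sketch (the function name is ours; distributions are assumed to range over the same number of outcomes):

```python
def majorizes(p, q):
    """Return True if distribution p majorizes q: with both sorted in
    decreasing order, every partial sum of p dominates that of q."""
    ps = sorted(p, reverse=True)
    qs = sorted(q, reverse=True)
    sp = sq = 0.0
    for a, b in zip(ps, qs):
        sp += a
        sq += b
        if sp < sq - 1e-12:   # tolerance for floating-point rounding
            return False
    return True

# A sharply peaked distribution majorizes (is more "ordered" than)
# a flatter one over the same outcomes, but not conversely:
print(majorizes([0.7, 0.2, 0.1], [0.4, 0.3, 0.3]))  # → True
print(majorizes([0.4, 0.3, 0.3], [0.7, 0.2, 0.1]))  # → False
```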

#### Distance Measures for Probability Distributions

Page: 130-146 (17)

Author: Pedro W. Lamberti and Ana P. Majtey

##### Abstract

Many problems of statistical and quantum mechanics can be formulated in terms of a distance between probability distributions. The present work reviews some of the most relevant applications of the notion of distance in the contexts of statistical mechanics, quantum mechanics and information theory. Although we give a general overview of the most frequently used distances between probability distributions, we center our presentation on a distance known as the Jensen-Shannon divergence, in both its classical and quantum versions. For the classical one we present its main properties and discuss its relevance as a segmentation tool for symbolic sequences. In the quantum case we show that the quantum Jensen-Shannon divergence is an adequate measure of entanglement.
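
For the classical case, the Jensen-Shannon divergence is JSD(P, Q) = H((P+Q)/2) − (H(P) + H(Q))/2, with H the Shannon entropy; it is symmetric, bounded by ln 2, and its square root is a true metric. A minimal sketch (function names are ours):

```python
import math

def shannon(p):
    """Shannon entropy in nats."""
    return -sum(x * math.log(x) for x in p if x > 0)

def jensen_shannon(p, q):
    """Classical Jensen-Shannon divergence:
    JSD(P, Q) = H((P+Q)/2) - (H(P) + H(Q)) / 2."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return shannon(m) - (shannon(p) + shannon(q)) / 2

# Disjointly supported distributions attain the maximum value, ln 2:
p = [1.0, 0.0]
q = [0.0, 1.0]
print(jensen_shannon(p, q))  # → ln 2 ≈ 0.693
```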

#### A Statistical Measure of Complexity

Page: 147-168 (22)

Author: Ricardo Lopez-Ruiz, Hector Mancini and Xavier Calbet

##### Abstract

In this chapter, a statistical measure of complexity is introduced and some of its properties are discussed. Also, some straightforward applications are shown.

#### Generalized Statistical Complexity: A New Tool for Dynamical Systems

Page: 169-215 (47)

Author: Osvaldo A. Rosso, Maria Teresa Martin, Hilda A. Larrondo, Andres M. Kowalski and Angelo Plastino

##### Abstract

A generalized Statistical Complexity Measure (SCM) is a functional of the probability distribution P associated with the time series generated by a given dynamical system. An SCM is the composition of two ingredients: i) an entropy and ii) a distance in probability space. In this review we address important topics underlying the SCM structure, viz., a) the selection of the information measure I; b) the choice of the probability metric space and the associated distance D, which in this context is called a "disequilibrium" (denoted by the letter Q); Q, indeed the crucial SCM ingredient, is cast in terms of the associated distance D; and c) the adequate way of picking the probability distribution P associated with the dynamical system or time series under study, which is indeed a fundamental problem. A careful analysis of these topics is essential to obtain an SCM that quantifies not only randomness but also the presence of correlational structures. In this chapter we especially stress how sensible improvements in the final results can be obtained if the underlying probability distribution is "extracted" via appropriate considerations regarding causal effects in the system's dynamics. As an illustration, we show just how these issues affect the description of the celebrated logistic map.
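
One common instantiation of this scheme (our own sketch; the chapter surveys several choices of I and D) takes the normalized Shannon entropy for H and a Jensen-Shannon disequilibrium Q, normalized by its maximum value, which is attained between a delta distribution and the uniform one:

```python
import math

def _H(p):
    """Shannon entropy in nats."""
    return -sum(x * math.log(x) for x in p if x > 0)

def _jsd(p, q):
    """Jensen-Shannon divergence between two distributions."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return _H(m) - (_H(p) + _H(q)) / 2

def statistical_complexity(p):
    """C = Q * H: normalized Shannon entropy H times a disequilibrium Q,
    the Jensen-Shannon divergence to the uniform distribution divided
    by its maximum (delta vs. uniform)."""
    n = len(p)
    uniform = [1.0 / n] * n
    delta = [1.0] + [0.0] * (n - 1)
    q = _jsd(p, uniform) / _jsd(delta, uniform)
    h = _H(p) / math.log(n)
    return q * h

# Both extremes carry zero complexity; structure lies in between:
print(statistical_complexity([0.25] * 4))            # uniform: Q = 0
print(statistical_complexity([1.0, 0.0, 0.0, 0.0]))  # delta:   H = 0
print(statistical_complexity([0.5, 0.3, 0.1, 0.1]))  # intermediate: C > 0
```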

#### The Fisher Information: Properties and Physico-Chemical Applications

Page: 216-233 (18)

Author: Jesus S. Dehesa, Rodolfo O. Esquivel, Angel Ricardo Plastino and Pablo Sanchez-Moreno

##### Abstract

The Fisher information functional occupies a very special position among the uncertainty measures of quantum systems of any dimensionality. In contrast with (global) entropic measures of the Shannon, Rényi or Tsallis types, it constitutes a local measure, in the sense that it is sensitive to the oscillatory character of the probability density distribution. Moreover, the Fisher information is closely related to various macroscopic properties of the system described by different density functionals. Here we review its main analytical properties, such as inequality-based relationships with some radial expectation values, and its uncertainty relation. Tighter versions of these inequalities are shown for systems subject to an arbitrary central potential. Moreover, the Fisher information is computed for D-dimensional hydrogenic systems explicitly in terms of the radial and angular hyperquantum numbers characterizing the quantum states of the system, in both position and momentum spaces. Finally, the utility of this quantity is discussed for various physico-chemical processes of recent interest: abstraction and nucleophilic substitution reactions.
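
As a numerical aside (our own sketch), the shift-invariant Fisher information of a one-dimensional density, I[p] = ∫ (p′(x))²/p(x) dx, can be approximated on a grid; for a Gaussian of variance σ² it equals 1/σ², illustrating the local, gradient-driven character contrasted above with global entropic measures:

```python
import math

def fisher_information(density, dpdx, xs):
    """Shift-invariant Fisher information I = ∫ (p'(x))^2 / p(x) dx,
    approximated by a Riemann sum on the grid xs."""
    dx = xs[1] - xs[0]
    total = 0.0
    for x in xs:
        p = density(x)
        if p > 1e-15:   # skip the negligible far tails
            total += dpdx(x) ** 2 / p * dx
    return total

# For a Gaussian of variance sigma^2 the Fisher information is 1/sigma^2.
sigma = 2.0
p = lambda x: math.exp(-x * x / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))
dp = lambda x: -x / sigma**2 * p(x)
xs = [-20 + i * 0.001 for i in range(40001)]
print(fisher_information(p, dp, xs))  # ≈ 1 / sigma^2 = 0.25
```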

#### Entanglement and Entropy

Page: 234-255 (22)

Author: J. Batle, A. Plastino, A. R. Plastino and M. Casas

##### Abstract

We review the links between the entanglement concept and the entropic-information one, as represented by different measures like the Shannon and Tsallis ones. This remarkable linkage illuminates several aspects of the physics of information-theoretic activity, and it is thus important to understand the structure of the concomitant connections.

#### Semiclassical Treatments and Information Theory

Page: 256-282 (27)

Author: Flavia Pennini, Angelo Plastino and Gustavo Luis Ferri

##### Abstract

We review here the difference between quantum statistical treatments and semiclassical ones, using as the main concomitant tool a semiclassical, shift-invariant Fisher information measure built up with Husimi distributions. Its semiclassical character notwithstanding, this measure also contains abundant information of a purely quantal nature. Such a tool allows us to refine the celebrated Lieb bound for Wehrl entropies and to discover thermodynamic-like relations that involve the degree of delocalization. Fisher-related thermal uncertainty relations are developed and the degree of purity of canonical distributions, regarded as mixed states, is connected to this Fisher measure as well.

#### Statistical Complexity of Chaotic Pseudorandom Number Generators

Page: 283-308 (26)

Author: Hilda A. Larrondo, Luciana De Micco, Claudio M. Gonzalez, Angelo Plastino and Osvaldo A. Rosso

##### Abstract

This chapter deals with the use of the Statistical Complexity Measure, as defined by López-Ruiz, Mancini and Calbet [Phys. Lett. A 209 (1995) 321–326] and modified by Rosso and coworkers [P. W. Lamberti, M. T. Martin, A. Plastino, O. A. Rosso; Physica A 334 (2004) 119–131], to characterize pseudorandom number generators (PRNGs) obtained from chaotic dynamical systems. It is shown that two probability distribution functions are required for a proper characterization: the first is based on the histogram and is used to characterize the uniformity of the values in the time series; the second is based on the permutation procedure proposed by Bandt and Pompe [Phys. Rev. Lett. 88 (2002) 174102] and characterizes the uniformity of patterns of several consecutive values of the time series.
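
The Bandt-Pompe procedure cited above admits a compact sketch (our illustrative code): each length-D window of the series is mapped to the permutation that sorts it (its ordinal pattern), the patterns are histogrammed, and the normalized Shannon entropy of that histogram is reported:

```python
import math
from collections import Counter

def permutation_entropy(series, d=3):
    """Bandt-Pompe procedure: histogram the ordinal patterns of all
    length-d windows and return the normalized Shannon entropy."""
    patterns = Counter(
        tuple(sorted(range(d), key=lambda k: series[i + k]))
        for i in range(len(series) - d + 1)
    )
    n = sum(patterns.values())
    h = -sum((c / n) * math.log(c / n) for c in patterns.values())
    return h / math.log(math.factorial(d))  # normalize to [0, 1]

# A monotone series realizes a single ordinal pattern: zero entropy.
print(permutation_entropy([1, 2, 3, 4, 5, 6, 7, 8], d=3))
```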

#### Analysis of an El Niño-Southern Oscillation Proxy Record Using Information Theory Quantifiers

Page: 309-340 (32)

Author: Laura C. Carpi, Patricia M. Saco, Alejandra Figliola, Eduardo Serrano and Osvaldo A. Rosso

##### Abstract

Quantifiers based on Information Theory, namely the normalized Shannon entropy, the Fisher information measure and the Jensen-Shannon statistical complexity, are used to characterize changes in the dynamical behavior of the El Niño-Southern Oscillation (ENSO) during the Holocene. For this purpose we analyze the ENSO proxy record corresponding to the Pallcacocha Lake sedimentary data for the Holocene period (11,000 yr BP to present). By recourse to these quantifiers we found evidence of a shift in the dynamics and cyclic behavior of the ENSO proxy, which is consistent with the results of Moy et al. [Nature 420 (2002) 162-165]. In addition, we have also been able to localize these cycles in time and to analyze connections to epochs of rapid climate change (RCC) during the Holocene.

#### Erythrocyte Viscoelasticity Under the Lens of Wavelet-Information Theory Quantifiers

Page: 341-374 (34)

Author: Ana Maria Korol, Maria Teresa Martin, Bibiana Riquelme, Mabel D’Arrigo and Osvaldo A. Rosso

##### Abstract

We present an application of wavelet-based Information Theory quantifiers (Relative Wavelet Energy, Normalized Total Wavelet Entropy, Wavelet MPR-Statistical Complexity and the Entropy-Complexity plane) to the characterization of red blood cell membrane viscoelasticity. These quantifiers exhibit the important localization advantages provided by wavelet theory, and the present approach produces a clear characterization of this dynamical system. We found an evident manifestation of a random process in the red cell samples of healthy individuals, which is slightly weaker for glucose-incubated erythrocytes, while there is a sharp reduction of randomness when analyzing samples from a human haematological disease, β-thalassemia minor.

#### Information-Theoretic Analysis of the Role of Correlations in Neural Spike Trains

Page: 375-407 (33)

Author: Fernando Montani and Simon R. Schultz

##### Abstract

We apply an information-theoretic approach to gain insight into the role of spike correlations in the neural code. First, we illustrate and compare the different methods used in the literature to remove sample-size-dependent bias from information estimates. Then, we use a modified version of the information-components breakdown to quantify the contribution of individual members of the population, the interaction between them, and the overall information encoded by the ensemble of neurons, placing special emphasis on the separation between contributions due to noise and signal spike correlations. This formalism is applied to a set of multi-neuronal spike data with different stimulus configurations.

## Introduction

Since the introduction of the information measure widely known as Shannon entropy, quantifiers based on information theory and concepts such as entropic forms and statistical complexities have proven to be useful in diverse scientific research fields. This book contains introductory tutorials suitable for the general reader, together with chapters dedicated to the basic concepts of the most frequently employed information measures or quantifiers and their recent applications to different areas, including physics, biology, medicine, economics, communication and social sciences. As these quantifiers are powerful tools for the study of general time and data series independently of their sources, this book will be useful to all those doing research connected with information analysis. The tutorials in this volume are written at a broadly accessible level and readers will have the opportunity to acquire the knowledge necessary to use the information theory tools in their field of interest.