Concepts and Recent Advances in Generalized Information Measures and Statistics

Introduction:

Since the introduction of the information measure widely known as Shannon entropy, quantifiers based on information theory and concepts such as entropic forms and statistical complexities have ...

Essentials of Information Entropy and Related Measures

Pp. 30-56 (27 pages)

Raul D. Rossignoli, Andres M. Kowalski and Evaldo M. F. Curado

Abstract

This introductory chapter provides a basic review of the Shannon entropy and of some important related quantities, such as the joint entropy, the conditional entropy, the mutual information, and the relative entropy. We also discuss the Fisher information, the fundamental property of concavity, the basic elements of the maximum entropy approach, and the definition of entropy in the quantum case. We close the chapter with the axioms that determine the Shannon entropy and a brief description of other information measures.
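
As a quick illustration of the quantities the abstract lists (not taken from the chapter itself), the following Python sketch computes the Shannon, joint, conditional, relative, and von Neumann entropies and the mutual information. The toy joint distribution, the reference distribution q, the toy density matrix, and all function names are this sketch's own choices, assumed purely for demonstration.

import numpy as np

def shannon_entropy(p):
    """H(p) = -sum_i p_i log2 p_i, ignoring zero-probability outcomes."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Toy joint distribution p(x, y) over two binary variables X and Y.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)        # marginal distribution of X
p_y = p_xy.sum(axis=0)        # marginal distribution of Y

H_xy = shannon_entropy(p_xy)  # joint entropy H(X, Y)
H_x = shannon_entropy(p_x)
H_y = shannon_entropy(p_y)

H_y_given_x = H_xy - H_x      # conditional entropy H(Y|X) = H(X,Y) - H(X)
I_xy = H_x + H_y - H_xy       # mutual information I(X;Y)

# Relative entropy (Kullback-Leibler divergence) of p_x from a reference q.
q = np.array([0.9, 0.1])
D_pq = np.sum(p_x * np.log2(p_x / q))

# von Neumann entropy S(rho) = -Tr(rho log2 rho) of a toy density matrix;
# it equals the Shannon entropy of rho's eigenvalue spectrum.
rho = np.array([[0.75, 0.25],
                [0.25, 0.25]])
S_rho = shannon_entropy(np.linalg.eigvalsh(rho))

print(f"H(X,Y)  = {H_xy:.4f} bits")
print(f"H(Y|X)  = {H_y_given_x:.4f} bits")
print(f"I(X;Y)  = {I_xy:.4f} bits")
print(f"D(p||q) = {D_pq:.4f} bits")
print(f"S(rho)  = {S_rho:.4f} bits")

Since mutual information satisfies I(X;Y) = H(X) + H(Y) - H(X,Y), the sketch derives the conditional entropy and mutual information from three entropy evaluations rather than computing them from their defining sums directly.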

Keywords:

Shannon Entropy, Mutual Information, Relative Entropy, Fisher Information, von Neumann Entropy.

Affiliation:

Departamento de Fisica-IFLP, Facultad de Ciencias Exactas, Universidad Nacional de La Plata, C.C. 67, 1900 La Plata, Argentina