A central problem in communication theory is determining the ultimate limits of data compression, and entropy is the quantity that measures those limits. While differential entropy may seem a straightforward extension of the discrete case to continuous random variables, it is a subtler measure that requires more careful treatment: unlike discrete entropy, for instance, it can be negative and it changes under a change of variables.
Handbook of Differential Entropy provides a comprehensive introduction to the subject for researchers and students in information theory. Unlike related books, this one brings together background material, derivations, and applications of differential entropy.
The handbook first reviews the probability theory that underpins entropy. The authors then carefully develop the concept of entropy, introducing both discrete and differential entropy. They present detailed derivations of differential entropy for numerous probability models, discuss the challenges of interpreting and deriving the measure, and show how differential entropy varies as a function of the model variance.
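For orientation, the standard definitions (a textbook summary, not text drawn from the handbook itself) are: for a discrete random variable $X$ with probability mass function $p(x)$, the entropy is
$$H(X) = -\sum_{x} p(x)\,\log p(x),$$
while for a continuous random variable with density $f(x)$, the differential entropy is
$$h(X) = -\int f(x)\,\log f(x)\,dx.$$
For a Gaussian with variance $\sigma^2$, this evaluates to $h(X) = \tfrac{1}{2}\log\!\left(2\pi e\,\sigma^2\right)$, which grows with the variance and is negative whenever $\sigma^2 < 1/(2\pi e)$, one illustration of why differential entropy demands more careful interpretation than its discrete counterpart.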
Turning to applications of differential entropy in several areas, the book describes common parametric and nonparametric estimators of differential entropy and their properties. It then uses the estimated differential entropy to estimate radar pulse delays when the corrupting noise is non-Gaussian and to develop measures of coupling between the components of dynamical systems.
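To give a concrete flavor of nonparametric estimation, the following is a minimal sketch of the Kozachenko-Leonenko k-nearest-neighbor estimator, one widely used estimator of differential entropy. It is offered only as an illustration and is not necessarily the estimator treated in the handbook; the function name kl_entropy_1d, the sample size, and the choice k = 3 are our own.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def kl_entropy_1d(samples, k=3):
    """Kozachenko-Leonenko k-NN estimate of differential entropy (in nats)
    for 1-D samples. A standard nonparametric estimator, shown here for
    illustration only; not necessarily the handbook's estimator."""
    x = np.asarray(samples, dtype=float).reshape(-1, 1)
    n = len(x)
    tree = cKDTree(x)
    # Distance from each point to its k-th nearest neighbor
    # (column 0 of the query result is the point itself, at distance 0).
    eps = tree.query(x, k=k + 1)[0][:, k]
    # KL estimator: h = psi(n) - psi(k) + log(V_d) + (d/n) * sum(log eps),
    # with dimension d = 1 and V_1 = 2 (the length of the unit "ball" in 1-D).
    return digamma(n) - digamma(k) + np.log(2.0) + np.mean(np.log(eps))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sigma = 2.0
    samples = rng.normal(0.0, sigma, size=20_000)
    estimate = kl_entropy_1d(samples)
    exact = 0.5 * np.log(2 * np.pi * np.e * sigma**2)  # closed form for a Gaussian
    print(f"estimated h = {estimate:.3f} nats, exact h = {exact:.3f} nats")
```

Because a Gaussian has the closed-form differential entropy noted earlier, the example in the main block can check the estimate against the exact value; for other noise models, such as the non-Gaussian sources arising in the radar application, no closed form may exist and the nonparametric estimate is used directly.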