Communications in Information and Systems

Volume 15 (2015)

Number 1

Facets of entropy

Pages: 87 – 117



Raymond W. Yeung (Institute of Network Coding, Department of Information Engineering, and Shenzhen Research Institute, Chinese University of Hong Kong, Shatin, N.T., Hong Kong)


Constraints on the entropy function are of fundamental importance in information theory. For a long time, the polymatroidal axioms, or equivalently the nonnegativity of the Shannon information measures, were the only known constraints. Inequalities implied by the nonnegativity of the Shannon information measures are categorically referred to as Shannon-type inequalities. If the number of random variables is fixed, a Shannon-type inequality can in principle be verified by a software package known as ITIP. A non-Shannon-type inequality is a constraint on the entropy function that is not implied by the nonnegativity of the Shannon information measures. In the late 1990s, the discovery of a few such inequalities revealed that Shannon-type inequalities alone do not constitute a complete set of constraints on the entropy function. In the past decade or so, connections between the entropy function and a number of subjects in information sciences, mathematics, and physics have been established. These subjects include probability theory, network coding, combinatorics, group theory, Kolmogorov complexity, matrix theory, and quantum mechanics. This expository work is an attempt to present a picture of the many facets of the entropy function.
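As a minimal illustration of the constraints discussed above (not part of the paper, and not the ITIP package itself), the following Python sketch checks one Shannon-type inequality, the submodularity of entropy H(X) + H(Y) ≥ H(X, Y), which is equivalent to the nonnegativity of the mutual information I(X; Y), for an arbitrarily chosen joint distribution:

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, coords):
    """Marginalize a joint distribution {tuple: probability} onto the given coordinates."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in coords)
        out[key] = out.get(key, 0.0) + p
    return out

# An illustrative joint distribution p(x, y) on {0, 1} x {0, 1}.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

h_xy = entropy(joint)
h_x = entropy(marginal(joint, [0]))
h_y = entropy(marginal(joint, [1]))

# Submodularity (one of the polymatroidal axioms): H(X) + H(Y) >= H(X, Y),
# i.e. the mutual information I(X; Y) is nonnegative.
mutual_info = h_x + h_y - h_xy
assert mutual_info >= 0
print(f"I(X;Y) = {mutual_info:.4f} bits")
```

A full Shannon-type verification, as performed by ITIP, instead checks whether an inequality holds for all entropy functions satisfying the polymatroidal axioms, via linear programming; the sketch above only tests a single distribution.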


entropy, polymatroid, non-Shannon-type inequalities, positive definite matrix, quasi-uniform array, Kolmogorov complexity, conditional independence, network coding, quantum information theory

Published 18 November 2015