Information theory

We define a quantity $I$ (the information) contained in a probability distribution by requiring that this quantity has the following properties (the Khinchine axioms):

  • The information depends only on the probability distribution.
  • The uniform distribution contains the minimum information.
  • If we extend the sample space with impossible events, the information does not increase.
  • Information is additive.

Starting from these axioms, it is possible to show that the information contained in a probability distribution with $N$ possible outcomes in the sample space $\Omega$, with probabilities given by the vector $\mathbf{p}$, is equal to:

$$
I(\mathbf{p}) = k \sum_{i=1}^N p_i \ln p_i
$$
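For the uniform distribution every outcome has probability $p_i = 1/N$, and substituting this into the formula above gives the explicit expression below. If the entropy is identified as the negative of the information, $S = -I$ (a common convention), this also recovers the familiar $k \ln N$ for the entropy of a uniform distribution:

$$
I(\mathbf{p}_{\mathrm{uniform}}) = k \sum_{i=1}^N \frac{1}{N} \ln \frac{1}{N} = -k \ln N
$$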

Syllabus Aims

  • You should be able to state the Khinchine axioms of information theory.
  • You should be able to give an expression for the information contained in a uniform distribution.
  • You should be able to give an expression for the information contained in a non-uniform distribution.
  • You should be able to explain how entropy and information are related.
  • You should be able to write an expression for the entropy of a uniform distribution.
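
The behaviour required by the axioms can be checked numerically. The following is a minimal Python sketch, assuming $k = 1$ and the convention $0 \ln 0 = 0$ (the function name `information` is chosen here for illustration); it evaluates $I(\mathbf{p})$, confirms that the uniform distribution gives the minimum information, and shows that impossible events leave the information unchanged:

```python
import numpy as np

def information(p, k=1.0):
    """Evaluate I(p) = k * sum_i p_i ln p_i for a probability vector p.

    Outcomes with p_i = 0 are skipped, using the convention 0 ln 0 = 0,
    so impossible events leave the information unchanged.
    """
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return k * np.sum(p[nz] * np.log(p[nz]))

# Uniform distribution over N = 4 outcomes: I = -k ln N
print(information([0.25, 0.25, 0.25, 0.25]))       # -1.3863 (= -ln 4)

# A non-uniform distribution over the same outcomes gives larger I
print(information([0.7, 0.1, 0.1, 0.1]))           # -0.9404

# Appending an impossible event does not change I (third axiom)
print(information([0.25, 0.25, 0.25, 0.25, 0.0]))  # -1.3863
```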

Contact Details

School of Mathematics and Physics,
Queen's University Belfast,
Belfast,
BT7 1NN

Email: g.tribello@qub.ac.uk
Website: mywebsite