Information theory

We define a quantity I (the information) contained in a probability distribution by requiring that this quantity has the following properties (Khinchine):

  • The information depends only on the probability distribution.
  • The uniform distribution contains the minimum information.
  • If we enhance the sample space with impossible events the information does not increase.
  • Information is additive.

It is possible to show, starting from these axioms, that the information contained in a probability distribution with N possible outcomes in the sample space, Ω, whose probabilities are given by the vector p, is equal to:

I(\mathbf{p}) = k \sum_{i=1}^{N} p_i \ln p_i

where k is an arbitrary positive constant; the axioms determine I only up to this multiplicative factor.
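To make the formula concrete, here is a minimal Python sketch that evaluates I(p) for a discrete distribution. The function name information, the choice k = 1, and the example distributions are illustrative assumptions, not part of the notes:

```python
import math

def information(p, k=1.0):
    """Return I(p) = k * sum_i p_i ln p_i for a discrete distribution p.

    Outcomes with p_i = 0 are skipped, since p ln p -> 0 as p -> 0;
    this matches the axiom that adding impossible events to the
    sample space does not change the information.
    """
    return k * sum(pi * math.log(pi) for pi in p if pi > 0.0)

# Uniform distribution over N outcomes: I = k ln(1/N) = -k ln N,
# the minimum allowed by the axioms.
N = 4
print(information([1.0 / N] * N))   # -1.3862943611198906
print(-math.log(N))                 # same value, -ln(4)

# A sharply peaked distribution carries more information (I closer to 0).
print(information([0.97, 0.01, 0.01, 0.01]))   # approximately -0.1677
```

Flipping the sign (and taking k to be Boltzmann's constant) turns this expression into the familiar entropy, S = -k \sum_i p_i \ln p_i, which is the relationship between entropy and information referred to in the syllabus aims below.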

Syllabus Aims

  • You should be able to state the Khinchine axioms of information theory.
  • You should be able to give an expression for the information contained in a uniform distribution.
  • You should be able to give an expression for the information contained in a non-uniform distribution.
  • You should be able to explain how entropy and information are related.
  • You should be able to write an expression for the entropy of a uniform distribution.

Contact Details

School of Mathematics and Physics,
Queen's University Belfast,
Belfast,
BT7 1NN

Email: g.tribello@qub.ac.uk
Website: mywebsite