The aims of this course are to introduce the principles and applications of information
theory. The course will study how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; how these are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error-correcting codes; how discrete channels and measures of information generalize to their continuous forms; the Fourier perspective;
and extensions to wavelets, complexity, compression, and efficient coding of audio-visual information.
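For orientation, the central quantities referred to above can be written in standard notation (the symbols below are the conventional textbook ones, not taken from the course materials themselves): the entropy and joint entropy of discrete random variables, the conditional entropy derived from them, and the channel capacity as mutual information maximized over input distributions.

\[
H(X) = -\sum_{x} p(x)\,\log_2 p(x), \qquad
H(X,Y) = -\sum_{x,y} p(x,y)\,\log_2 p(x,y),
\]
\[
H(X \mid Y) = H(X,Y) - H(Y), \qquad
I(X;Y) = H(X) - H(X \mid Y), \qquad
C = \max_{p(x)} I(X;Y).
\]

The capacity C is the quantity computed for channels "with and without noise" in the description above; for a noiseless channel H(X | Y) = 0, so the capacity reduces to the maximum attainable entropy of the input.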