**Course Details**

**Source Coding, Hypothesis Testing and Information Measures:** Typical sequences, the asymptotic equipartition property, and the source coding theorem. Connection between the source coding theorem and a simple hypothesis testing problem. Definitions and interpretations of entropy and mutual information; inequalities on information and divergence measures.
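As a small numerical illustration of the asymptotic equipartition property covered in this unit, the sketch below (the three-symbol distribution is an illustrative choice, not taken from the course) checks that the per-symbol log-probability of a long i.i.d. sample concentrates near the entropy:

```python
import math
import random

def entropy(p):
    """Shannon entropy in bits of a distribution given as {symbol: prob}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# Hypothetical source distribution for illustration.
p = {"a": 0.5, "b": 0.25, "c": 0.25}
H = entropy(p)  # 1.5 bits

random.seed(0)
n = 100_000
symbols, probs = zip(*p.items())
sample = random.choices(symbols, weights=probs, k=n)

# AEP: -(1/n) log2 p(X_1, ..., X_n) -> H(X) for an i.i.d. source.
per_symbol = -sum(math.log2(p[s]) for s in sample) / n
print(H, per_symbol)  # the two values should be close for large n
```

This is the sense in which "typical" sequences all have probability roughly 2^(-nH), which is what drives the source coding theorem.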

**Non-Block Coding and the Noiseless Coding Theorem:** The Kraft-McMillan inequality, uniquely decodable codes, and Shannon's noiseless coding theorem.
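The Kraft-McMillan inequality in this unit is easy to check numerically: a minimal sketch (the example codeword lengths are my own) computing the Kraft sum for a set of binary codeword lengths:

```python
def kraft_sum(lengths):
    """Kraft sum: sum_i 2^(-l_i). At most 1 is necessary for unique
    decodability, and sufficient for a prefix code with those lengths."""
    return sum(2.0 ** -l for l in lengths)

# Lengths of the prefix code {0, 10, 110, 111}: the sum is exactly 1,
# so the code is complete (no codeword can be shortened or added).
print(kraft_sum([1, 2, 3, 3]))

# Lengths 1, 1, 2 give a sum above 1, so no uniquely decodable
# binary code with these lengths exists.
print(kraft_sum([1, 1, 2]))
```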

**Method of Types, Composite Hypothesis Testing and Large Deviations:** Properties of types, Sanov's theorem, composite hypothesis tests, Stein's lemma, likelihood ratio tests, and error exponents.
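The large-deviations flavor of Sanov's theorem can be previewed in the simplest Bernoulli case: the probability that a fair coin's empirical frequency of heads exceeds q decays like 2^(-n D(q || 1/2)). A sketch (the numbers n = 50, q = 0.7 are my own choices) comparing the exact binomial tail with this Chernoff-style bound:

```python
import math

def kl_bernoulli(q, p):
    """KL divergence D(q || p) in bits between Bernoulli(q) and Bernoulli(p)."""
    return q * math.log2(q / p) + (1 - q) * math.log2((1 - q) / (1 - p))

n, q = 50, 0.7

# Exact tail probability P(at least n*q heads in n fair flips).
exact = sum(math.comb(n, k) * 0.5 ** n for k in range(math.ceil(n * q), n + 1))

# Large-deviations upper bound 2^(-n D(q || 1/2)).
bound = 2.0 ** (-n * kl_bernoulli(q, 0.5))

print(exact, bound)  # the exact tail never exceeds the bound
```

The exponent D(q || p) is exactly the error exponent that reappears in Stein's lemma and in the analysis of likelihood ratio tests.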

**Information Projection, Maximum Likelihood and the Geometry of Statistical Models:** Minimization problems involving information divergence, maximum likelihood estimation, exponential and linear families of probability distributions and their associated geometry. The Cramér-Rao bound and Fisher information. Applications to regression problems.
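A small simulation previewing the Cramér-Rao bound from this unit (the Bernoulli setting and all parameter values are my own illustrative choices): for Bernoulli(p), the Fisher information is I(p) = 1/(p(1-p)), so the bound on an unbiased estimator's variance is p(1-p)/n, which the sample-mean MLE attains:

```python
import random

random.seed(1)
p, n, trials = 0.3, 200, 2000

# Cramér-Rao bound 1/(n I(p)) = p(1-p)/n for Bernoulli(p).
cr_bound = p * (1 - p) / n

estimates = []
for _ in range(trials):
    xs = [1 if random.random() < p else 0 for _ in range(n)]
    estimates.append(sum(xs) / n)  # maximum likelihood estimate of p

mean = sum(estimates) / trials
var = sum((e - mean) ** 2 for e in estimates) / trials
print(cr_bound, var)  # empirical MLE variance sits close to the CR bound
```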

**Text/Reference Books:**

- I. Csiszár and J. Körner, "Information Theory: Coding Theorems for Discrete Memoryless Systems", Cambridge University Press, ISBN 978-1-107-56504-3
- I. Csiszár and P. C. Shields, "Information Theory and Statistics: A Tutorial", NOW Publishers, ISBN 978-1-933019-05-5
- T. M. Cover and J. A. Thomas, "Elements of Information Theory", John Wiley & Sons, ISBN 978-0-471-24195-9
- S. Kullback, "Information Theory and Statistics", Dover Publications, ISBN 978-0-486-69684-3