Information theory 1000-2N03TI
1. From the game of 20 questions to the concept of entropy. Kraft's inequality. Huffman and Shannon-Fano codes (a short illustrative sketch follows this list).
2. Conditional entropy and mutual information.
3. Shannon's first theorem on optimal encoding.
4. Channels, information loss, improving efficiency, channel capacity.
5. Shannon's theorem on transmitting information through a noisy channel.
6. Kolmogorov complexity. Chaitin's number.
7. Kolmogorov complexity vs. Shannon entropy; the universal Martin-Löf test.
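The following is a minimal, self-contained Python sketch (not part of the syllabus) illustrating the concepts named in topic 1: Shannon entropy, a Huffman prefix code, and the Kraft inequality. The example distribution and all function names are illustrative assumptions.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_code(probs):
    """Build a binary prefix code {symbol: bitstring} from {symbol: probability}."""
    # Heap entries are (probability, tiebreak, partial code table);
    # the integer tiebreak keeps the dicts from ever being compared.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        # Prepend a bit distinguishing the two merged subtrees.
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, counter, merged))
        counter += 1
    return heap[0][2]

# Illustrative (dyadic) distribution -- an assumption, not course data.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(code[s]) for s in probs)
kraft_sum = sum(2.0 ** -len(w) for w in code.values())

print("code      :", code)
print("H(X)      :", entropy(probs.values()))  # 1.75 bits
print("avg length:", avg_len)                  # 1.75 bits here
print("Kraft sum :", kraft_sum)                # <= 1 for any prefix code
```

For this dyadic distribution the average codeword length equals the entropy exactly; Shannon's first theorem (topic 3) guarantees that in general an optimal prefix code satisfies H(X) <= L < H(X) + 1.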
The course will be given in Polish if no non-Polish-speaking students register for it.
Learning outcomes
Knowledge
The student
-- understands basic concepts of information theory: entropy, mutual information, communication channel, channel capacity, Kolmogorov complexity,
-- understands theoretical barriers to the efficiency of information encoding and reliable communication,
-- understands the background of algorithms for text compression and error correction.
Skills
The student is able
-- to apply the concept of entropy in data analysis,
-- to compute the capacity of an information channel (see the sketch after this list),
-- to apply compression and error-correction algorithms in practice.
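As a concrete instance of the capacity computation mentioned above, here is a hedged Python sketch for one standard channel, the binary symmetric channel with crossover probability p, whose capacity is C = 1 - H(p). The channel choice and the numeric test values are illustrative assumptions, not prescribed course exercises.

```python
from math import log2

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel: C = 1 - H(p) bits per use."""
    return 1.0 - binary_entropy(p)

# Illustrative crossover probabilities -- assumed values, not exercises.
for p in (0.0, 0.11, 0.5):
    print(f"p = {p}:  C = {bsc_capacity(p):.3f} bits/use")
```

The endpoints behave as expected: a noiseless channel (p = 0) has capacity 1 bit per use, while a channel that flips each bit with probability 1/2 carries no information at all.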
Competence
The student understands the mathematical aspects of the concept of information and is able to use them in systems engineering, in particular to design networks and distributed systems and to manage memory, as well as in applications of computer science, in particular in data analysis, cryptography, and bioinformatics.
Assessment criteria
The final grade is based on systematic work during the term, the results of a mid-term class test, and the results of the written exam.
Bibliography
1. "Information and Coding Theory" by Gareth A. Jones and J. Mary Jones, Springer, 2000.
2. "Elements of Information Theory" by Thomas M. Cover and Joy A. Thomas,
Wiley Series in Telecommunications, 1991.