CSE 392 : Information Theory & Communication

Instructor: Ritwik Banerjee
Lecture Schedule: Monday and Wednesday 4:00 - 5:20 PM @ Earth & Space Sciences 079
Office Hours: Monday 11:30 AM - 12:30 PM, and Friday 2:30 PM - 3:30 PM @ New CS 206

Information Theory is one of the very few fields in computer science that have a well-defined "beginning", originating with Claude Shannon's famous paper, A Mathematical Theory of Communication, in 1948. For a few decades, this field remained in relative obscurity, until results from information theory started to play a fundamentally important role in artificial intelligence, cryptography, computer security, data processing and storage, etc.

This course will introduce the basic concepts of information theory, and then dive into some important applications of the theory. Towards the end of the course, students will select a particular topic from a given list and participate in seminar-style presentations.

There are no exams in this course.

There are no formal prerequisites, but students are expected to be comfortable with probability theory, algebra, and discrete mathematics.

Reference Book

Elements of Information Theory, 2nd Edition
Authors: T. M. Cover, J. A. Thomas
Publisher: Wiley
ISBN: 0-471-24195-4

Additional or supplementary reading material will be provided from related research papers and surveys.

Topics Covered

The following is a tentative (and potentially non-exhaustive) list of topics that will be covered in this course:

  • Entropy, Conditional Entropy, Relative Entropy
  • Mutual Information
  • Compression
  • Entropy and Counting (e.g., sums of binomial coefficients)
  • Communication Complexity
  • Amortized Complexity
  • Data Structure Bounds
  • Streaming Algorithm Bounds
  • Graph Entropy and Sorting
  • Applications in AI
  • Pseudorandom generation
  • Data processing and communication in noisy channels
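As a small taste of the first few topics, the basic quantities above (entropy, conditional entropy, mutual information) can be computed directly from a joint distribution. The following Python sketch uses a toy distribution chosen purely for illustration; it is not part of the course material:

```python
from math import log2

# Toy joint distribution p(x, y) over X, Y in {0, 1} (illustrative numbers only).
p_xy = {(0, 0): 1/4, (0, 1): 1/4, (1, 0): 1/2, (1, 1): 0.0}

def H(dist):
    """Shannon entropy (in bits) of a distribution given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal distributions of X and Y.
p_x = {x: sum(p for (a, _), p in p_xy.items() if a == x) for x in (0, 1)}
p_y = {y: sum(p for (_, b), p in p_xy.items() if b == y) for y in (0, 1)}

H_xy = H(p_xy)               # joint entropy H(X, Y)
H_x, H_y = H(p_x), H(p_y)
H_y_given_x = H_xy - H_x     # chain rule: H(Y|X) = H(X, Y) - H(X)
I_xy = H_x + H_y - H_xy      # mutual information I(X; Y)

print(f"H(X) = {H_x:.4f}, H(Y) = {H_y:.4f}, H(X,Y) = {H_xy:.4f}")
print(f"H(Y|X) = {H_y_given_x:.4f}, I(X;Y) = {I_xy:.4f}")
```

Note that I(X; Y) comes out nonnegative, and conditioning reduces entropy (H(Y|X) ≤ H(Y)); both facts are proved in the first weeks of the course.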
Homework Assignments

There will be a total of four assignments, two theory and two programming.

Each student will get a total of 4 days throughout the semester for late submissions. Once this 'delay quota' is exhausted, however, no late submissions will be accepted.

Grading Scheme
4 Homeworks: 60% (15% each)
Scribing Lectures: 25%
Presentation: 15%

The final grade you receive in this class will reflect, as far as possible, how well your assignments, scribed notes, and presentation demonstrate your mastery of the concepts and their applications. How much someone needs a particular grade, or how close they are to the next higher grade, will have no effect on the grade assigned. As the instructor, I want everyone to do well in this course, and I will make every reasonable effort to help you understand the material taught. However, the grades given at the end of the semester are final, except in rare situations involving grading errors. They will not be altered for any other reason, so please do not ask me to do so.

You may discuss the homework in this course with anyone you like; however, each student's submission, including written material and code, must be his or her own work, and only his or her own work. Any evidence that written homework submissions or source code have been copied, shared, or transmitted in any way between students (this includes using source code downloaded from the Internet or written by others in previous semesters!) will be regarded as evidence of academic dishonesty. Likewise, any evidence of sharing or using unauthorized information during an examination will be regarded as evidence of academic dishonesty.

Each student must pursue his or her academic goals honestly and be personally accountable for all submitted work. Representing another person's work as your own is always wrong. Faculty are required to report any suspected instances of academic dishonesty to the Academic Judiciary. Faculty in the Health Sciences Center (School of Health Technology & Management, Nursing, Social Welfare, Dental Medicine) and School of Medicine are required to follow their school-specific procedures. For more comprehensive information on academic integrity, including categories of academic dishonesty, please refer to the academic judiciary website at http://www.stonybrook.edu/commcms/academic_integrity/index.html

If you have a physical, psychological, medical or learning disability that may impact your course work, please contact Disability Support Services, ECC (Educational Communications Center) Building, room 128, (631) 632-6748. They will determine with you what accommodations, if any, are necessary and appropriate. All information and documentation is confidential. Students who require assistance during emergency evacuation are encouraged to discuss their needs with their professors and Disability Support Services. For procedures and information go to the following website: http://www.stonybrook.edu/ehs/fire/disabilities

Stony Brook University expects students to respect the rights, privileges, and property of other people. Faculty are required to report to the Office of Judicial Affairs any disruptive behavior that interrupts their ability to teach, compromises the safety of the learning environment, or inhibits students' ability to learn. Faculty in the HSC Schools and the School of Medicine are required to follow their school-specific procedures.

All reading material references "Elements of Information Theory" by Cover and Thomas, unless specified otherwise.

Lecture 0: Syllabus and Introduction (no slides)
Lecture 1: Entropy (reading: Ch. 1 introduction, §2.1, §2.2; lecture notes: (.pdf) and (.tex))
Lecture 2: Chain Rules, Divergence, and Jensen's Inequality (reading: §2.3 – §2.6)
Lecture 3: Log Sum Inequality and its Consequences (reading: §2.3 – §2.7)
Lecture 4: Error and Conditional Entropy (reading: §2.8 – §2.10)
Lecture 5: Connections to AI and Machine Learning; Decision Trees
Lecture 6: Decision Trees (ctd.)
Lecture 7: Asymptotic Equipartition Property (reading: §3.1)
Lecture 8: Data Compression Overview (reading: §3.2 and §3.3)
Lecture 9: Entropy and Counting - I
Lecture 10: Entropy and Counting - II
Lecture 11: Stochastic Processes - I (reading: §4.1 – §4.2)
Lecture 12: Stochastic Processes - II (reading: §4.3 – §4.5)
Lecture 13: Data Compression - II (reading: §5.1 – §5.2)
Lecture 14: Data Compression - III (reading: §5.3 – §5.4)
Lecture 15: Huffman Codes (reading: §5.5 – §5.8)
Lecture 16: Shannon-Fano-Elias Code and Arithmetic Code (reading: §5.9, §5.10, §13.3)
Lecture 17: Channel Capacity: Types of Channels (reading: §7.1)
Lecture 18: Introduction to Shannon's 2nd Theorem (reading: §7.2 – §7.5)
Lecture 19: Joint Typicality (reading: §7.6)
Lecture 20: Channel Coding Theorem (reading: §7.7)
Lecture 21: Zero-Error Codes and Hamming Codes
Submission Deadlines
Homework 1: Friday, Oct 6 (11:59 PM)
  • IV. (a): The d should be a
  • VI. (c): for all n
  • VI. (d): for sufficiently large n
Homework 2: Friday, Oct 27 (11:59 PM)
Homework 3: Friday, Nov 24 (11:59 PM)
Homework 4: Sunday, Dec 10 (11:59 PM)