Overview: Information and Entropy 
by MIT
Video Lecture 1 of 24
Copyright Information: Hari Balakrishnan, and George Verghese. 6.02 Introduction to EECS II: Digital Communication Systems, Fall 2012. (Massachusetts Institute of Technology: MIT OpenCourseWare), http://ocw.mit.edu (Accessed 2 Mar, 2015). License: Creative Commons BY-NC-SA
Date Added: March 2, 2015

Lecture Description

This lecture covers some history of digital communication, with a focus on Samuel Morse and Claude Shannon; defining and measuring information; the significance of entropy for encodings; and Huffman's coding algorithm.
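The two quantities the lecture connects, the entropy of a source and the average length of a Huffman code for it, can be sketched in a few lines of Python. This is not code from the course; it is a minimal illustration using a made-up example string, with symbol frequencies standing in for probabilities.

```python
import heapq
from collections import Counter
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_code(freqs):
    """Build a Huffman code from {symbol: frequency}.

    Repeatedly merges the two lowest-frequency subtrees, prefixing
    their codewords with "0" and "1". Returns {symbol: bitstring}.
    A counter breaks ties so heapq never compares dicts.
    """
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

text = "abracadabra"           # example string, not from the lecture
freqs = Counter(text)
code = huffman_code(freqs)
n = len(text)
avg_len = sum(freqs[s] * len(code[s]) for s in freqs) / n
H = entropy([f / n for f in freqs.values()])
print(code)
print(f"entropy = {H:.3f} bits/symbol, Huffman average = {avg_len:.3f} bits/symbol")
```

As the lecture's entropy bound predicts, the average Huffman codeword length is at least the source entropy and stays within one bit of it.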

Course Description

An introduction to several fundamental ideas in electrical engineering and computer science, using digital communication systems as the vehicle. The three parts of the course—bits, signals, and packets—cover three corresponding layers of abstraction that form the basis of communication systems like the Internet.

The course, taught by Prof. George Verghese, teaches ideas that are useful in other parts of EECS: abstraction, probabilistic analysis, superposition, time and frequency-domain representations, system design principles and trade-offs, and centralized and distributed algorithms. The course emphasizes connections between theoretical concepts and practice using programming tasks and some experiments with real-world communication channels.
