Lecture Description
This lecture covers some history of digital communication, focusing on Samuel Morse and Claude Shannon; defining and measuring information; the significance of entropy for encodings; and Huffman's coding algorithm.
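As a concrete companion to the lecture's topics, here is a minimal sketch of Huffman's greedy algorithm in Python (not code from the course): repeatedly merge the two least-frequent subtrees, prepending a bit to each symbol's codeword, until one tree remains. The function name and representation are illustrative choices, not anything specified by the lecture.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a prefix-free binary code from symbol frequencies (Huffman's algorithm)."""
    freq = Counter(text)
    if len(freq) == 1:  # degenerate case: one distinct symbol
        return {next(iter(freq)): "0"}
    # Heap entries: (total frequency, tiebreaker, {symbol: codeword-so-far}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)  # two least-frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
# The most frequent symbol ('a') gets the shortest codeword,
# and no codeword is a prefix of another.
```

The greedy merge yields a code whose expected length is within one bit per symbol of the source entropy, which is the connection between entropy and encodings the lecture develops.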
Course Index
- Overview: Information and Entropy
- Compression: Huffman and LZW
- Errors, channel codes
- Linear block codes, parity relations
- Error correction, syndrome decoding
- Convolutional codes
- Viterbi decoding
- Noise
- Transmitting on a physical channel
- Linear time-invariant (LTI) systems
- LTI channel and intersymbol interference
- Filters and composition
- Frequency response of LTI systems
- Spectral representation of signals
- Modulation/demodulation
- More on modulation/demodulation
- Packet switching
- MAC protocols
- Network routing (without failures)
- Network routing (with failures)
- Reliable transport
- Sliding window analysis, Little's law
- A brief history of the Internet
- History of the Internet cont'd, course summary
Course Description
An introduction to several fundamental ideas in electrical engineering and computer science, using digital communication systems as the vehicle. The three parts of the course—bits, signals, and packets—cover three corresponding layers of abstraction that form the basis of communication systems like the Internet.
The course, taught by Prof. George Verghese, develops ideas that are useful in other parts of EECS: abstraction, probabilistic analysis, superposition, time- and frequency-domain representations, system design principles and trade-offs, and centralized and distributed algorithms. It emphasizes the connection between theoretical concepts and practice through programming tasks and some experiments with real-world communication channels.