Statistics 110: Probability

Video Lectures

All 35 video lectures are listed below.
Lecture 1
Probability and Counting
We introduce sample spaces and the naive definition of probability (we'll get to the non-naive definition later). To apply the naive definition, we need to be able to count. So we introduce the multiplication rule, binomial coefficients, and the sampling table (for sampling with/without replacement when order does/doesn't matter).
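The sampling table mentioned above can be computed directly with Python's standard `math` module (a minimal illustrative sketch, not course material; the function name `sampling_table` is mine):

```python
from math import comb, perm

def sampling_table(n, k):
    """Number of ways to sample k items from n, for each combination of
    order mattering / not and sampling with / without replacement."""
    return {
        "ordered, with replacement": n ** k,
        "ordered, without replacement": perm(n, k),
        "unordered, without replacement": comb(n, k),
        "unordered, with replacement (Bose-Einstein)": comb(n + k - 1, k),
    }

print(sampling_table(5, 3))
```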
Lecture 2
Story Proofs, Axioms of Probability
We fill in the "Bose-Einstein" entry of the sampling table, and discuss story proofs. For example, proving Vandermonde's identity with a story is easier and more insightful than going through a tedious algebraic derivation. We then introduce the axioms of probability.
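Vandermonde's identity, proved by story in the lecture, is easy to check numerically (a quick sketch of my own, not from the course; `vandermonde_lhs` is a hypothetical helper name):

```python
from math import comb

def vandermonde_lhs(m, n, r):
    """Left side of Vandermonde's identity:
    sum over k of C(m, k) * C(n, r - k).
    Story: pick r people from m men and n women, split by number of men."""
    return sum(comb(m, k) * comb(n, r - k) for k in range(r + 1))

print(vandermonde_lhs(5, 7, 4), comb(12, 4))  # both sides equal
```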
Lecture 3
Birthday Problem, Properties of Probability
We discuss the birthday problem (how many people do you need to have a 50% chance of there being 2 with the same birthday?), the matching problem (de Montmort), inclusion-exclusion, and properties of probability.
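The birthday-problem calculation described above can be done in a few lines (an illustrative sketch assuming 365 equally likely, independent birthdays; not code from the course):

```python
from math import prod

def p_birthday_match(n, days=365):
    """P(at least two of n people share a birthday), assuming
    days equally likely birthdays and independence."""
    if n > days:
        return 1.0
    p_no_match = prod((days - i) / days for i in range(n))
    return 1 - p_no_match

n = 1
while p_birthday_match(n) < 0.5:
    n += 1
print(n)  # 23 people suffice for a 50% chance of a match
```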
Lecture 4
Conditional Probability
We introduce conditional probability, independence of events, and Bayes' rule.
Lecture 5
Conditioning Continued, Law of Total Probability
We continue with conditional probability, and discuss the law of total probability, the so-called prosecutor's fallacy, a disease-testing example, and the crucial distinction between independence and conditional independence.
Lecture 6
Monty Hall, Simpson's Paradox
We show how conditional probability sheds light on two of the most famous puzzles in statistics, both of which are often counterintuitive (at first): the Monty Hall problem and Simpson's paradox.
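The Monty Hall result is easy to confirm by simulation (my own sketch, not from the lecture; it uses the fact that switching wins exactly when the first pick was wrong):

```python
import random

def monty_hall(switch, trials=100_000, seed=0):
    """Estimate the win probability when the contestant always
    stays (switch=False) or always switches (switch=True)."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)
        pick = rng.randrange(3)
        # Monty opens a goat door; switching then wins iff pick != car.
        wins += (pick != car) if switch else (pick == car)
    return wins / trials

print(monty_hall(switch=True))   # ≈ 2/3
print(monty_hall(switch=False))  # ≈ 1/3
```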
Lecture 7
Gambler's Ruin and Random Variables
We analyze the gambler's ruin problem, in which two gamblers bet with each other until one goes broke. We then introduce random variables, which are essential in statistics and for the rest of the course, and start on the Bernoulli and Binomial distributions.
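The gambler's ruin win probability has a closed form, sketched below under the standard setup of one-dollar bets with win probability p per round (my own illustration; the function name is hypothetical):

```python
def ruin_win_prob(i, N, p):
    """P(a gambler starting with i dollars reaches N before going broke),
    betting 1 dollar per round with win probability p."""
    if p == 0.5:
        return i / N          # fair game: probability is proportional to stake
    r = (1 - p) / p           # ratio of loss to win probability
    return (1 - r ** i) / (1 - r ** N)

print(ruin_win_prob(5, 10, 0.5))  # 0.5
print(ruin_win_prob(5, 10, 0.6))  # a small edge per bet helps a lot
```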
Lecture 8
Random Variables and Their Distributions
Much of this course is about random variables and their distributions. The relationship between a random variable and its distribution can seem subtle but it is essential! We discuss distributions, cumulative distribution functions (CDFs), probability mass functions (PMFs), and the Hypergeometric distribution.
Lecture 9
Expectation, Indicator Random Variables, Linearity
We discuss expected values and the meaning of means, and introduce some very useful tools for finding expected values: indicator r.v.s, linearity, and symmetry. The fundamental bridge connects probability and expectation. We also introduce the Geometric distribution.
Lecture 10
Expectation (Continued)
We prove linearity of expectation, solve a Putnam problem, introduce the Negative Binomial distribution, and consider the St. Petersburg Paradox.
Lecture 11
The Poisson distribution
We introduce the Poisson distribution, which is arguably the most important discrete distribution in all of statistics. We explore its uses as an approximate distribution and its connections with the Binomial.
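The Binomial-to-Poisson approximation mentioned above is easy to see numerically: for large n and small p, Bin(n, p) is close to Pois(np). A minimal sketch (mine, not from the course):

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# n = 1000 trials, success probability p = 0.002, so lambda = np = 2.
n, p = 1000, 0.002
for k in range(5):
    print(k, binom_pmf(k, n, p), poisson_pmf(k, n * p))
```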
Lecture 12
Discrete vs. Continuous, the Uniform
We compare discrete vs. continuous distributions, and discuss probability density functions (PDFs), variance, standard deviation, and the Uniform distribution.
Lecture 13
Normal Distribution
We introduce the Normal distribution, which is the most famous, important, and widely-used distribution in all of statistics.
Lecture 14
Location, Scale, and LOTUS
We discuss location and scale, and standardization. We also make a conscious effort to describe the Law of the Unconscious Statistician (LOTUS), and use it to obtain the variance of a Poisson.
Lecture 15
Midterm Review
We work through some extra examples, such as the coupon collector problem, an example of Universality of the Uniform, an example of LOTUS, and a Poisson process example.
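The coupon collector example has the well-known answer n(1 + 1/2 + ... + 1/n) expected draws to collect all n types, which a quick simulation confirms (an illustrative sketch of mine, not course code):

```python
import random

def coupon_collector_expected(n):
    """Expected number of draws to collect all n coupon types."""
    return n * sum(1 / k for k in range(1, n + 1))

def simulate(n, trials=20_000, seed=0):
    """Monte Carlo estimate of the same expectation."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        seen, draws = set(), 0
        while len(seen) < n:
            seen.add(rng.randrange(n))
            draws += 1
        total += draws
    return total / trials

print(coupon_collector_expected(10))  # ≈ 29.29
print(simulate(10))                   # close to the exact value
```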
Lecture 16
Exponential Distribution
We introduce the Exponential distribution, which is characterized by the memoryless property.
Lecture 17
Moment Generating Functions
We introduce moment generating functions (MGFs), which have many uses in probability. We also discuss Laplace's rule of succession and the "hybrid" version of Bayes' rule.
Lecture 18
MGFs (Continued)
We use MGFs to get moments of Exponential and Normal distributions, and to get the distribution of a sum of Poissons. We also start on joint distributions.
Lecture 19
Joint, Conditional, and Marginal Distributions
We discuss joint, conditional, and marginal distributions (continuing from Lecture 18), the 2-D LOTUS, the fact that E(XY)=E(X)E(Y) if X and Y are independent, the expected distance between 2 random points, and the chicken-egg problem.
Lecture 20
Multinomial and Cauchy
We introduce the Multinomial distribution, which is arguably the most important multivariate discrete distribution, and discuss its story and some of its nice properties, such as being able to "lump" categories together. We also do an example with the Cauchy distribution.
Lecture 21
Covariance and Correlation
We introduce covariance and correlation, and show how to obtain the variance of a sum, including the variance of a Hypergeometric random variable.
Lecture 22
Transformations and Convolutions
We discuss transformations of r.v.s (change of variables), the LogNormal distribution, and convolutions (sums). As a bonus, we show how in certain problems one can use probability to prove existence.
Lecture 23
Beta distribution
We introduce the Beta distribution and show how it is the conjugate prior for the Binomial, and discuss Bayes' billiards. Stephen Blyth then gives examples of how probability is used in finance.
Lecture 24
Gamma distribution and Poisson process
We introduce the Gamma distribution and discuss the connection between the Gamma distribution and Poisson processes.
Lecture 25
Order Statistics and Conditional Expectation
We show how Beta and Gamma are connected (via the bank-post office story), and introduce order statistics. We then start on conditional expectation, with a peek inside the Two Envelope Paradox.
Lecture 26
Conditional Expectation (Continued)
We peek further into the Two Envelope Paradox, and continue to explore conditional expectation, considering the expected waiting time for HT vs. HH in flips of a fair coin.
Lecture 27
Conditional Expectation given an R.V.
We show how to think about a conditional expectation E(Y|X) of one r.v. given another r.v., and discuss key properties such as taking out what's known, Adam's Law, and Eve's Law, with examples.
Lecture 28
Inequalities
We consider the sum of a random number of random variables (e.g., total purchases by a random number of customers in a store). We then introduce 4 useful inequalities: Cauchy-Schwarz, Jensen, Markov, and Chebyshev.
Lecture 29
Law of Large Numbers and Central Limit Theorem
We introduce and prove versions of the Law of Large Numbers and Central Limit Theorem, which are two of the most famous and important theorems in all of statistics.
Lecture 30
Chi-Square, Student-t, Multivariate Normal
We introduce several important offshoots of the Normal: the Chi-Square, Student-t, and Multivariate Normal distributions.
Lecture 31
Markov Chains
We introduce Markov chains -- a very beautiful and very useful kind of stochastic process -- and discuss the Markov property, transition matrices, and stationary distributions.
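A stationary distribution can be approximated by repeatedly applying the transition matrix, as in this small sketch (my own illustration with a made-up 2-state chain, not from the lecture):

```python
def stationary(P, iters=500):
    """Approximate the stationary distribution of transition matrix P
    (a list of rows) by iterating s <- sP from the uniform distribution."""
    n = len(P)
    s = [1 / n] * n
    for _ in range(iters):
        s = [sum(s[i] * P[i][j] for i in range(n)) for j in range(n)]
    return s

# A 2-state chain: from state 0 stay with prob 0.9; from state 1 stay with 0.5.
P = [[0.9, 0.1],
     [0.5, 0.5]]
print(stationary(P))  # ≈ [5/6, 1/6], the solution of sP = s
```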
Lecture 32
Markov Chains (Continued)
We continue to explore Markov chains, and discuss irreducibility, recurrence and transience, reversibility, and random walk on an undirected network.
Lecture 33
Markov Chains Continued Further
We continue to explore Markov chains, and show how Google PageRank can be understood in terms of a natural Markov chain on the web.
Lecture 34
Course Overview: A Look Ahead
We look ahead to possible future courses in statistics, discussing a few out of a very large number of connections between Stat 110 and other statistics ideas and courses.
Lecture 35
The Soul of Statistics
Joe Blitzstein teaches the popular statistics class Stat 110, which provides a comprehensive introduction to probability as a medium for understanding statistics, science, risk, and randomness. The course has grown to over 300 students per year at Harvard and over 200,000 subscribers on iTunes U. His main research interests are in statistical inference for complex networks, with applications to social science and public health. He enjoys playing chess, is ranked in the Expert range by the US Chess Federation (the 98th percentile of all tournament players), and is the faculty adviser for the Harvard Chess Club.