Introduction to Machine Learning
Video Lectures
Displaying all 18 video lectures.
Lecture 1: The Learning Problem (Analysis)
Topics: Introduction; supervised, unsupervised, and reinforcement learning. Components of the learning problem. This lecture was recorded on April 3, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
Lecture 2: Is Learning Feasible? (Theory)
Topics: Can we generalize from a limited sample to the entire space? Relationship between in-sample and out-of-sample. This lecture was recorded on April 5, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
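The feasibility question in Lecture 2 rests on Hoeffding's inequality: the in-sample frequency nu of an event tracks its out-of-sample probability mu. A minimal simulation of that idea (the sample size, tolerance, and trial count below are arbitrary illustrative choices, not values from the lecture):

```python
# Hoeffding's inequality: P[|nu - mu| > eps] <= 2*exp(-2*eps**2*N).
# We simulate many "experiments" of N coin flips and count how often
# the sample frequency nu strays more than eps from the true mu.
import math
import random

random.seed(0)
mu, N, eps, trials = 0.5, 1000, 0.05, 2000

bad = 0  # experiments where |nu - mu| > eps
for _ in range(trials):
    nu = sum(random.random() < mu for _ in range(N)) / N
    if abs(nu - mu) > eps:
        bad += 1

empirical = bad / trials
bound = 2 * math.exp(-2 * eps**2 * N)  # the Hoeffding bound
print(empirical, bound)
assert empirical <= bound  # the bound holds, as the theory promises
```

The bound is loose (here about 0.0135, while the empirical rate is far smaller), which is exactly what makes it useful: it holds for any mu without knowing mu.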
Lecture 3: The Linear Model (Technique)
Topics: Linear classification and linear regression. Extending linear models through nonlinear transforms. This lecture was recorded on April 10, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
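Linear regression learns its weights in one step. As a minimal sketch (not the lecture's code), here is the one-dimensional case with an intercept, solving the 2x2 normal equations by hand; the general case in the lecture uses the pseudo-inverse w = (XᵀX)⁻¹Xᵀy:

```python
# Least-squares fit of y = w*x + b by solving the normal equations.
def fit_line(xs, ys):
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    det = n * sxx - sx * sx            # determinant of the 2x2 system
    w = (n * sxy - sx * sy) / det      # slope
    b = (sy * sxx - sx * sxy) / det    # intercept
    return w, b

w, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])  # data on the line y = 2x + 1
print(w, b)  # 2.0 1.0
```

The nonlinear transforms mentioned in the topics reuse exactly this machinery: replace x with features like x² before fitting, and the algorithm is unchanged.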
Lecture 4: Error Measures and Noise (Analysis)
Topics: The principled choice of error measures. What happens when the target we want to learn is noisy. This lecture was recorded on April 12, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
Lecture 5: Training versus Testing (Theory)
Topics: The difference between training and testing in mathematical terms. What makes a learning model able to generalize? This lecture was recorded on April 17, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
Lecture 6: Theory of Generalization
Topics: How an infinite model can learn from a finite sample. The most important theoretical result in machine learning. Lecture 6 of 18 of Caltech's Machine Learning Course, CS 156, by Professor Yaser Abu-Mostafa. This lecture was recorded on April 19, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
Lecture 7: The VC Dimension (Theory)
Topics: A measure of what it takes a model to learn. Relationship to the number of parameters and degrees of freedom. This lecture was recorded on April 24, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
Lecture 8: Bias-Variance Tradeoff (Theory)
Topics: Breaking down the learning performance into competing quantities. The learning curves. This lecture was recorded on April 26, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
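The bias-variance decomposition can be estimated numerically. A well-known instance of this analysis (fitting a constant hypothesis h(x) = b to two random samples of f(x) = sin(pi*x), for which theory gives bias = 0.50 and variance = 0.25) is sketched below by Monte Carlo; the trial counts are arbitrary:

```python
# Monte Carlo estimate of bias and variance for a constant model
# fit to 2-point data sets drawn from f(x) = sin(pi*x), x in [-1, 1].
import math
import random

random.seed(1)
trials = 20000

bs = []  # best constant fit for each random 2-point data set
for _ in range(trials):
    y1 = math.sin(math.pi * random.uniform(-1, 1))
    y2 = math.sin(math.pi * random.uniform(-1, 1))
    bs.append((y1 + y2) / 2)  # least-squares constant = mean of the ys

g_bar = sum(bs) / trials  # the average hypothesis (close to 0 here)
var = sum((b - g_bar) ** 2 for b in bs) / trials

# bias = E_x[(g_bar(x) - f(x))^2], estimated over random test points
xs = [random.uniform(-1, 1) for _ in range(20000)]
bias = sum((g_bar - math.sin(math.pi * x)) ** 2 for x in xs) / len(xs)

print(bias, var)  # close to the theoretical 0.5 and 0.25
```

Repeating the experiment with a linear model instead of a constant lowers the bias but raises the variance, which is the tradeoff the lecture title refers to.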
Lecture 9: The Linear Model II (Technique)
Topics: More about linear models. Logistic regression, maximum likelihood, and gradient descent. This lecture was recorded on May 1, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
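Logistic regression has no closed-form solution, so its cross-entropy error is minimized by gradient descent. A tiny one-dimensional sketch (the data, learning rate, and iteration count are illustrative choices, not from the lecture):

```python
# Gradient descent on the cross-entropy error
# E(w) = (1/N) * sum ln(1 + exp(-y*w*x)), with labels y in {-1, +1}.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

data = [(-2.0, -1), (-1.0, -1), (1.0, 1), (2.0, 1)]  # separable toy set
w, eta = 0.0, 0.5

for _ in range(200):
    # dE/dw = (1/N) * sum of -y*x*sigmoid(-y*w*x)
    grad = sum(-y * x * sigmoid(-y * w * x) for x, y in data) / len(data)
    w -= eta * grad  # step against the gradient

# after training, the model assigns high probability to each label
assert all(sigmoid(y * w * x) > 0.9 for x, y in data)
```

On separable data like this, the weight keeps growing (slowly) forever; in practice one stops after a fixed budget or when the gradient is small, as done here.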
Lecture 10: Neural Networks (Technique)
Topics: A biologically inspired model. The efficient backpropagation learning algorithm. Hidden layers. This lecture was recorded on May 3, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
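Backpropagation is the chain rule applied layer by layer. A minimal sketch on a one-hidden-unit network, with the analytic gradient checked against a numerical one (the network shape and values are illustrative assumptions, not the lecture's example):

```python
# Backprop through a tiny network: x -> tanh(w1*x) -> w2*h -> output,
# with squared error E = 0.5*(output - y)^2. The gradient from
# backpropagation is verified against finite differences.
import math

def forward(w1, w2, x):
    h = math.tanh(w1 * x)  # hidden activation
    return w2 * h          # linear output unit

def loss(w1, w2, x, y):
    return 0.5 * (forward(w1, w2, x) - y) ** 2

def backprop(w1, w2, x, y):
    h = math.tanh(w1 * x)
    delta = w2 * h - y                 # dE/d(output)
    g2 = delta * h                     # dE/dw2
    g1 = delta * w2 * (1 - h * h) * x  # chain rule through tanh
    return g1, g2

w1, w2, x, y = 0.3, -0.7, 1.5, 1.0
g1, g2 = backprop(w1, w2, x, y)

eps = 1e-6  # numerical check of both partial derivatives
n1 = (loss(w1 + eps, w2, x, y) - loss(w1 - eps, w2, x, y)) / (2 * eps)
n2 = (loss(w1, w2 + eps, x, y) - loss(w1, w2 - eps, x, y)) / (2 * eps)
assert abs(g1 - n1) < 1e-6 and abs(g2 - n2) < 1e-6
```

The "efficiency" the topics mention is that backprop reuses the forward pass's intermediate values, so all partial derivatives cost about one extra pass rather than one pass per weight.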
Lecture 11: Overfitting (Analysis)
Topics: Fitting the data too well; fitting the noise. Deterministic noise versus stochastic noise. This lecture was recorded on May 8, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
Lecture 12: Regularization (Technique)
Topics: Putting the brakes on fitting the noise. Hard and soft constraints. Augmented error and weight decay. This lecture was recorded on May 10, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
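Weight decay adds a penalty on the weights to the error being minimized. In its simplest setting (one-dimensional regression through the origin, an illustrative reduction rather than the lecture's general matrix form), the augmented error E_aug(w) = Σ(w·x − y)² + λw² has the closed-form minimizer w = Σxy / (Σx² + λ):

```python
# Weight decay in 1-D: larger lambda shrinks the learned slope toward 0.
def ridge_1d(xs, ys, lam):
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)  # minimizer of sum (w*x - y)^2 + lam*w^2

xs, ys = [1.0, 2.0, 3.0], [2.1, 3.9, 6.2]
print(ridge_1d(xs, ys, 0.0))   # unregularized least-squares slope
print(ridge_1d(xs, ys, 10.0))  # shrunk toward zero by weight decay
assert abs(ridge_1d(xs, ys, 10.0)) < abs(ridge_1d(xs, ys, 0.0))
```

Setting λ = 0 recovers plain least squares, which is why regularization is described as "putting the brakes on" the fit rather than replacing it.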
Lecture 13: Validation (Technique)
Topics: Taking a peek out of sample. Model selection and data contamination. Cross validation. This lecture was recorded on May 15, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
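Cross validation in its extreme form, leave-one-out, trains on N−1 points and tests on the single held-out point, averaging over all N choices. A minimal sketch where the "model" is just the sample mean (an illustrative stand-in, not the lecture's example):

```python
# Leave-one-out cross validation of a mean predictor.
def loo_cv_error(ys):
    n = len(ys)
    total = 0.0
    for i in range(n):
        train = ys[:i] + ys[i + 1:]     # hold out point i
        pred = sum(train) / len(train)  # "train": fit the mean
        total += (pred - ys[i]) ** 2    # out-of-sample error on point i
    return total / n

print(loo_cv_error([1.0, 2.0, 3.0]))  # 1.5
```

Because every held-out point is untouched during its training run, the averaged error is an honest out-of-sample estimate; reusing it to pick among many models reintroduces the data contamination the topics warn about.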
Lecture 14: Support Vector Machines (Theory and Technique)
Topics: One of the most successful learning algorithms; getting a complex model at the price of a simple one. This lecture was recorded on May 17, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
Lecture 15: Kernel Methods (Theory and Technique)
Topics: Extending SVM to infinite-dimensional spaces using the kernel trick, and to non-separable data using soft margins. This lecture was recorded on May 22, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
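The kernel trick rests on an identity: a kernel evaluation in the input space equals an inner product in a (possibly huge) feature space that is never visited. A numerical check for the second-order polynomial kernel on 2-D inputs (the specific kernel and test points are illustrative choices):

```python
# K(a, b) = (1 + a.b)^2 equals <phi(a), phi(b)> for an explicit
# 6-dimensional feature map phi, verified numerically.
import math

def phi(v):
    # explicit feature map for the 2nd-order polynomial kernel in 2-D
    x1, x2 = v
    return [1.0, math.sqrt(2) * x1, math.sqrt(2) * x2,
            x1 * x1, x2 * x2, math.sqrt(2) * x1 * x2]

def kernel(a, b):
    return (1.0 + a[0] * b[0] + a[1] * b[1]) ** 2

a, b = (1.0, 2.0), (3.0, -1.0)
lhs = kernel(a, b)                                  # 2 multiplies, 1 square
rhs = sum(p * q for p, q in zip(phi(a), phi(b)))    # 6-D inner product
assert abs(lhs - rhs) < 1e-9  # same value, without visiting the 6-D space
```

For the Gaussian (RBF) kernel the corresponding feature space is infinite-dimensional, yet the kernel evaluation stays one line, which is what makes the extension in this lecture tractable.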
Lecture 16: Radial Basis Functions (Technique)
Topics: An important learning model that connects several machine learning models and techniques. This lecture was recorded on May 24, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
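A radial basis function model outputs a weighted sum of Gaussian bumps centered on chosen points. A minimal sketch in one dimension (the centers, weights, and gamma below are arbitrary illustrative values, not learned as in the lecture):

```python
# RBF model: h(x) = sum_k w_k * exp(-gamma * (x - c_k)^2).
import math

def rbf_predict(x, centers, weights, gamma):
    return sum(w * math.exp(-gamma * (x - c) ** 2)
               for c, w in zip(centers, weights))

centers, weights, gamma = [0.0, 1.0], [1.0, -1.0], 2.0
print(rbf_predict(0.0, centers, weights, gamma))  # dominated by the +1 bump
print(rbf_predict(1.0, centers, weights, gamma))  # dominated by the -1 bump
```

The model's influence is local: each center mainly shapes the output near itself, with gamma controlling the width of that influence. Fitting the weights to data reduces to a linear system, which is one of the connections to other models that the lecture draws out.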
Lecture 17: Three Learning Principles (Analysis)
Topics: Major pitfalls for machine learning practitioners: Occam's razor, sampling bias, and data snooping. This lecture was recorded on May 29, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.
Lecture 18: Epilogue: The Map of Machine Learning (Analysis)
Topics: The map of machine learning. Brief views of Bayesian learning and aggregation methods. This lecture was recorded on May 31, 2012, in Hameetman Auditorium at Caltech, Pasadena, CA, USA.