Euclidean Distance
by Harrison Kinsley
Video Lecture 15 of 42
Date Added: August 11, 2016

Lecture Description

In the previous tutorial, we covered how to use the K Nearest Neighbors algorithm via Scikit-Learn to achieve 95% accuracy in predicting benign vs. malignant tumors based on tumor attributes. Now we're going to dig into how K Nearest Neighbors actually works, so we have a full understanding of the algorithm itself and a better sense of when it will and won't work for us.
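For reference, a minimal sketch of that Scikit-Learn approach might look like the following. This is an assumption, not the exact code from the previous tutorial; it uses Scikit-Learn's bundled breast cancer dataset so it runs self-contained, whereas the series loads the UCI data file directly.

# Minimal sketch (assumed, not the tutorial's exact code) of classifying
# breast cancer data with Scikit-Learn's K Nearest Neighbors.
from sklearn import neighbors, model_selection, datasets

# Bundled dataset used here for a self-contained example.
data = datasets.load_breast_cancer()
X, y = data.data, data.target

X_train, X_test, y_train, y_test = model_selection.train_test_split(
    X, y, test_size=0.2)

clf = neighbors.KNeighborsClassifier()
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))  # accuracy, typically around the mid-90% range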
We will come back to our breast cancer dataset, running it through our custom-built K Nearest Neighbors algorithm and comparing the results to Scikit-Learn's, but we're going to start off with some very simple data first. K Nearest Neighbors boils down to proximity: not proximity to a group, but to individual points. All the algorithm actually does is compute the distance from a new point to every known point, then pick the most common class among the K nearest of them. There are various ways to measure distance on a plane, many of which would work here, but the most widely used is Euclidean distance, named after Euclid, the famous mathematician popularly referred to as the father of geometry; he quite literally wrote the book on it (The Elements).
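As a quick illustration, the Euclidean distance between two points p and q is the square root of the sum of the squared differences of their coordinates. A minimal Python sketch, with arbitrary example points:

from math import sqrt

# Euclidean distance between two 2D points: sqrt((x2-x1)^2 + (y2-y1)^2)
p = [1, 3]   # arbitrary example points
q = [2, 5]

euclidean_distance = sqrt(sum((qi - pi) ** 2 for pi, qi in zip(p, q)))
print(euclidean_distance)  # sqrt(1 + 4) ≈ 2.236

The same expression generalizes to any number of dimensions, which is exactly how we'll use it when every tumor attribute becomes one coordinate of a point.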

pythonprogramming.net
twitter.com/sentdex
www.facebook.com/pythonprogramming.net/
plus.google.com/+sentdex

Course Description

The objective of this course is to give you a holistic understanding of machine learning, covering theory, application, and inner workings of supervised, unsupervised, and deep learning algorithms.

In this series, we'll be covering linear regression, K Nearest Neighbors, Support Vector Machines (SVM), flat clustering, hierarchical clustering, and neural networks.
