10-601 K-NN And Trees - Lecture from Fall 2013
This is a lecture used in the Syllabus for Machine Learning 10-601 in Fall 2014.
Slides
Readings
- Mitchell, Chapter 3.
What You Should Know Afterward
- What the goal of classification is.
- What the Bayes decision boundary for classification is (see the note after this list).
- Whether there is an optimal classifier.
- What the K-NN algorithm is (a minimal sketch follows this list).
- What the computational properties of eager vs. lazy learning are in general, and of K-NN in particular.
- What decision boundary is defined by K-NN, and how it compares to decision boundaries of linear classifiers.
- How the value of K affects the tendency of K-NN to overfit or underfit data.
- (optional) The probabilistic interpretation of K-NN decisions.
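As a pointer for the Bayes items above: the Bayes-optimal classifier predicts the most probable class under the true posterior,

 h^{*}(x) = \arg\max_{y} P(Y = y \mid X = x),

and its decision boundary is the set of inputs where the top posteriors tie (P(Y = 1 | X = x) = 1/2 in the binary case). No classifier can achieve lower expected error than h^{*}, so an optimal classifier exists whenever the true distribution is known, even though that distribution is rarely available in practice.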
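The sketch below is a minimal Python/NumPy illustration of the K-NN classifier, assuming Euclidean distance and majority voting; the function knn_predict and the toy data are invented for this example and are not part of the course materials.

 import numpy as np
 
 def knn_predict(X_train, y_train, x, k=3):
     """Classify x by majority vote among its k nearest training points."""
     # Lazy learning: there is no training step; all work happens at
     # prediction time, which makes K-NN cheap to "train" but
     # potentially expensive to query.
     dists = np.linalg.norm(X_train - x, axis=1)   # Euclidean distance to each training point
     nearest = np.argsort(dists)[:k]               # indices of the k closest points
     labels, counts = np.unique(y_train[nearest], return_counts=True)
     # The vote fractions counts / k can be read as a rough posterior
     # estimate of P(y | x), the probabilistic interpretation in the
     # optional item above.
     return labels[np.argmax(counts)]
 
 # Toy usage: two clusters. Small k yields a jagged, overfit-prone
 # boundary; large k yields a smoother, underfit-prone one.
 X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
 y = np.array([0, 0, 1, 1])
 print(knn_predict(X, y, np.array([0.2, 0.1]), k=3))  # -> 0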