10-601 K-NN And Trees - Lecture from Fall 2013
From Cohen Courses
This is a lecture used in the Syllabus for Machine Learning 10-601 in Fall 2014
Slides
Readings
- Mitchell, Chapter 3.
What You Should Know Afterward
- What a decision tree is, and how to use a tree to classify an example.
- What decision boundaries are defined by a decision tree, and how they compare to the decision boundaries of linear classifiers.
- Algorithmically, how decision trees are built using a divide-and-conquer method.
- What entropy is, what information gain is, and why they are useful in decision tree learning.
- What decision tree pruning is, and how it reduces overfitting.
- What the K-NN algorithm is.
- What the computational properties of eager vs. lazy learning are in general, and of K-NN in particular.
- What decision boundaries are defined by K-NN, and how they compare to the decision boundaries of linear classifiers.
- How the value of K affects the tendency of K-NN to overfit or underfit data.
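As a concrete illustration of the entropy and information-gain points above, here is a minimal sketch (the helper names `entropy` and `information_gain` are my own; the example data is the Wind attribute of the PlayTennis table from Mitchell, Chapter 3):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a sequence of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, split_values):
    """Reduction in entropy from splitting the examples on one attribute.

    `labels` and `split_values` are parallel lists: labels[i] is the class
    of example i, split_values[i] is its value for the candidate attribute.
    """
    n = len(labels)
    remainder = 0.0
    for v in set(split_values):
        subset = [y for y, x in zip(labels, split_values) if x == v]
        remainder += (len(subset) / n) * entropy(subset)
    return entropy(labels) - remainder

# PlayTennis labels (9 yes, 5 no) and the Wind attribute, from Mitchell Ch. 3.
labels = ['no', 'no', 'yes', 'yes', 'yes', 'no', 'yes',
          'no', 'yes', 'yes', 'yes', 'yes', 'yes', 'no']
wind   = ['weak', 'strong', 'weak', 'weak', 'weak', 'strong', 'strong',
          'weak', 'weak', 'weak', 'strong', 'strong', 'weak', 'strong']
print(round(information_gain(labels, wind), 3))  # → 0.048
```

The divide-and-conquer tree builder greedily picks the attribute with the highest information gain at each node, splits the data on it, and recurses on each subset.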
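The K-NN points above can likewise be sketched in a few lines; this is an illustrative implementation (the function name and the tiny 2-D dataset are my own, not from the lecture):

```python
from collections import Counter
from math import dist  # Euclidean distance (Python 3.8+)

def knn_classify(train, query, k):
    """Classify `query` by majority vote among its k nearest training points.

    K-NN is a lazy learner: "training" is just storing the examples, and all
    of the distance computation happens at query time.
    """
    neighbors = sorted(train, key=lambda ex: dist(ex[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Tiny illustrative 2-D dataset: two well-separated clusters.
train = [((1, 1), 'pos'), ((2, 1), 'pos'), ((1, 2), 'pos'),
         ((6, 5), 'neg'), ((7, 6), 'neg'), ((6, 7), 'neg')]
print(knn_classify(train, (2, 2), 3))  # → pos
print(knn_classify(train, (6, 6), 3))  # → neg
```

Note how K controls the bias-variance trade-off: a small K (e.g. K=1) yields a jagged, flexible decision boundary that can overfit noise, while a large K averages over many neighbors, smoothing the boundary and eventually underfitting.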