10-601 K-NN And Trees - Lecture from Fall 2013
From Cohen Courses
Revision as of 17:01, 25 September 2013 by Wcohen
This is a lecture used in the Syllabus for Machine Learning 10-601
Slides
Readings
- Mitchell, Chapter 3.
What You Should Know Afterward
- What a decision tree is, and how to use a tree to classify an example.
- What decision boundary a decision tree defines, and how it compares to the decision boundaries of linear classifiers.
- Algorithmically, how decision trees are built using a divide-and-conquer method.
- What entropy is, what information gain is, and why they are useful in decision tree learning.
- What decision tree pruning is, and how it helps control overfitting.
- What the K-NN algorithm is.
- What the computational properties of eager vs. lazy learning are in general, and of K-NN in particular.
- What decision boundary K-NN defines, and how it compares to the decision boundaries of linear classifiers.
- How the value of K affects the tendency of K-NN to overfit or underfit data.
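To make the entropy and information-gain objectives above concrete, here is a minimal sketch (not taken from the lecture slides) of how a decision-tree learner scores a candidate split: compute the entropy of the label set, then subtract the weighted entropy of each branch produced by splitting on a feature. The function names, the toy PlayTennis-style data, and the single `wind` feature are illustrative choices, not material from the course.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(examples, labels, feature):
    """Entropy reduction from splitting the examples on one feature."""
    n = len(labels)
    # Partition the labels by the feature's value.
    branches = {}
    for x, y in zip(examples, labels):
        branches.setdefault(x[feature], []).append(y)
    # Weighted average entropy of the branches after the split.
    remainder = sum(len(b) / n * entropy(b) for b in branches.values())
    return entropy(labels) - remainder

# Toy data: 3 "yes" and 1 "no"; splitting on "wind" isolates the "no".
X = [{"wind": "weak"}, {"wind": "weak"}, {"wind": "strong"}, {"wind": "strong"}]
y = ["yes", "yes", "no", "yes"]
print(entropy(y))                       # ≈ 0.811 bits
print(information_gain(X, y, "wind"))   # ≈ 0.311 bits
```

A divide-and-conquer tree builder would evaluate `information_gain` for every feature, split on the best one, and recurse on each branch.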
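The K-NN objectives can likewise be sketched in a few lines. This is an illustrative implementation, not code from the lecture; it assumes Euclidean distance and majority voting, and the toy two-cluster data is invented. Note the "lazy" character of the learner: there is no training step, and all work happens at query time.

```python
import math
from collections import Counter

def knn_predict(train, query, k):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (point, label) pairs; points are numeric tuples.
    """
    nearest = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy 2-D data: two well-separated clusters.
train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
print(knn_predict(train, (1, 1), 3))   # → "a"
print(knn_predict(train, (5, 4), 3))   # → "b"
```

Varying `k` shows the overfitting/underfitting trade-off from the last objective: `k=1` yields a jagged boundary that tracks individual (possibly noisy) points, while a `k` near the size of the training set predicts the majority class everywhere.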