10-601 K-NN And Trees - Lecture from Fall 2013

This is a lecture used in the [[Syllabus for Machine Learning 10-601 in Fall 2013]]

=== Slides ===

* [http://www.cs.cmu.edu/~wcohen/10-601/classification-and-decision-trees.pptx Slides in PowerPoint].

=== Readings ===

* Mitchell, Chapter 3.

=== What You Should Know Afterward ===

* What the goal of classification is.
* What the Bayes decision boundary for classification is.
* Whether there is an optimal classifier.
* What the K-NN algorithm is (see the sketch after this list).
* What the computational properties of eager vs. lazy learning are in general, and of K-NN in particular.
* What decision boundary is defined by K-NN, and how it compares to the decision boundaries of linear classifiers.
* How the value of K affects the tendency of K-NN to overfit or underfit the data.
* (Optional) The probabilistic interpretation of K-NN decisions.
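
The following is a minimal sketch of the K-NN classifier named in the list above, written in Python. It is not the lecture's code: the NumPy implementation, the knn_predict function name, and the toy two-blob data are illustrative assumptions.

<syntaxhighlight lang="python">
import numpy as np

def knn_predict(X_train, y_train, x, k):
    """Predict the label of query point x by majority vote
    among its k nearest training points (Euclidean distance)."""
    # Distance from x to every training point. This is lazy learning:
    # all the work happens at prediction time, none at training time.
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k nearest training points.
    nearest = np.argsort(dists)[:k]
    # Majority vote among the neighbors' labels.
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy data (an assumption, not from the lecture): two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, size=(20, 2)),
               rng.normal(3.0, 1.0, size=(20, 2))])
y = np.array([0] * 20 + [1] * 20)

print(knn_predict(X, y, np.array([1.5, 1.5]), k=3))
</syntaxhighlight>

Re-running the last line with k=1 versus a large value such as k=39 illustrates the overfitting/underfitting trade-off in the list above: a small K gives a jagged decision boundary that tracks individual training points, while a very large K smooths the boundary toward a constant majority-class prediction.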