10-601 Classification and K-NN

This is a lecture used in the Syllabus for Machine Learning 10-601 in Fall 2014.

Slides

Readings

  • Mitchell, Chapters 1, 2, and 8.

What You Should Know Afterward

  • What the goal of classification is.
  • Bayes decision boundary for classification
  • Is there an optimal classifier?
  • What the K-NN algorithm is (a minimal code sketch follows this list).
  • What the computational properties of eager vs. lazy learning are in general, and of K-NN specifically.
  • What decision boundary is defined by K-NN, and how it compares to the decision boundaries of linear classifiers (see the grid demo after the list).
    • Ziv - shouldn't we move these till after we've introduced an eager learner and a linear classifier? --Wcohen (talk) 13:35, 15 August 2014 (EDT)
  • How the value of K affects the tendency of K-NN to overfit or underfit data.
  • (optional) The probabilistic interpretation of K-NN decisions (the sketch below estimates these as neighbor vote fractions).
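
To make several of these objectives concrete, here is a minimal Python sketch of K-NN classification. It is not part of the lecture materials: the helper names knn_predict and knn_proba and the toy data are invented for illustration, and it assumes Euclidean distance with majority voting. The comments flag the per-query cost of lazy learning and the probabilistic reading of the neighbor vote.

  import math
  from collections import Counter

  def knn_predict(train, query, k=3):
      """Classify query by majority vote among its k nearest training points.
      train is a list of (feature_vector, label) pairs; distance is Euclidean."""
      # Lazy learning: there is no training step beyond storing the data, so
      # each prediction costs O(n * d) for n training points in d dimensions.
      neighbors = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
      votes = Counter(label for _, label in neighbors)
      return votes.most_common(1)[0][0]

  def knn_proba(train, query, k=3):
      """Probabilistic reading of K-NN: estimate P(class | query) as the
      fraction of the k nearest neighbors carrying each class label."""
      neighbors = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
      votes = Counter(label for _, label in neighbors)
      return {label: count / k for label, count in votes.items()}

  # Toy data: two well-separated 2-D classes.
  train = [((0, 0), 'a'), ((0, 1), 'a'), ((1, 0), 'a'),
           ((5, 5), 'b'), ((5, 6), 'b'), ((6, 5), 'b')]
  print(knn_predict(train, (1, 1), k=3))  # -> 'a'
  print(knn_proba(train, (3, 3), k=3))    # -> {'b': 0.33..., 'a': 0.66...}

On the effect of K: a small K lets single (possibly noisy) training points decide whole regions, which is the overfitting tendency; a large K averages over wide neighborhoods and drifts toward the majority class, which is the underfitting tendency.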
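
On the decision-boundary objective: K-NN induces a piecewise-linear boundary (for K=1 it follows edges of the Voronoi diagram of the training set), whereas a linear classifier draws a single hyperplane. A rough way to see this, reusing knn_predict and train from the sketch above, is to label a grid of points and look at where the predicted class flips:

  # Label a 7x7 grid with 1-NN; the 'a'/'b' transition traces the piecewise-
  # linear (Voronoi-edge) boundary. Rerun with k=5 to watch it smooth out.
  for y in range(6, -1, -1):  # print the top row first
      print(''.join(knn_predict(train, (x, y), k=1) for x in range(7)))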