10-601 Classification and K-NN

From Cohen Courses
Latest revision as of 13:44, 17 September 2014

This is a lecture used in the Syllabus for Machine Learning 10-601 in Fall 2014.

Slides

  • Ziv's lecture: Slides in pdf (http://www.cs.cmu.edu/~zivbj/classF14/classification.pdf).
  • William's lecture: Slides in Powerpoint (http://www.cs.cmu.edu/~wcohen/10-601/classification-and-knn.pptx), in PDF (http://www.cs.cmu.edu/~wcohen/10-601/classification-and-knn.pdf).

Readings

  • Mitchell, Chapters 1, 2, and 8.

What You Should Know Afterward

  • What the goal of classification is.
  • What sorts of problems can be solved by reducing them to classification.
  • What the K-NN algorithm is.
  • How the value of K affects the tendency of K-NN to overfit or underfit data.
  • Optional:
    • Bayes decision boundary for classification
    • Is there an optimal classifier?
    • What decision boundary is defined by K-NN.
    • Probabilistic interpretation of K-NN decisions.
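The K-NN algorithm named above is short enough to sketch directly. The following is an illustrative sketch (not code from the lecture slides): it classifies a query point by majority vote among its k nearest training points under Euclidean distance, using only the Python standard library.

```python
from collections import Counter
from math import dist  # Euclidean distance (Python 3.8+)

def knn_predict(train_points, train_labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    # K-NN is a lazy learner: there is no training step at all;
    # every distance computation happens at prediction time.
    nearest = sorted(range(len(train_points)),
                     key=lambda i: dist(train_points[i], query))[:k]
    votes = Counter(train_labels[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Toy 2-D data set with two well-separated classes.
points = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]
labels = ["a", "a", "a", "b", "b", "b"]

print(knn_predict(points, labels, (1, 1), k=3))      # near the "a" cluster
print(knn_predict(points, labels, (5.5, 5.5), k=3))  # near the "b" cluster
```

Varying k in this sketch illustrates the overfitting/underfitting bullet: with k=1 the prediction tracks every individual training point (including noise), while a very large k (here, k=6, the whole data set) always returns the most common label regardless of the query.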