10-601 Classification and K-NN
This is a lecture used in the Syllabus for Machine Learning 10-601 in Fall 2014.
Slides
- Ziv's lecture: Slides in PDF (http://www.cs.cmu.edu/~zivbj/classF14/classification.pdf).
- William's lecture: Slides in PowerPoint (http://www.cs.cmu.edu/~wcohen/10-601/classification-and-knn.pptx), in PDF (http://www.cs.cmu.edu/~wcohen/10-601/classification-and-knn.pdf).
Readings
- Mitchell, Chapters 1, 2, and 8.
What You Should Know Afterward
- What the goal of classification is.
- What sorts of problems can be solved by reducing them to classification.
- What the K-NN algorithm is (see the sketch after this list).
- How the value of K affects the tendency of K-NN to overfit or underfit data.
- Optional:
- Bayes decision boundary for classification
- Is there an optimal classifier?
- What decision boundary is defined by K-NN.
- Probabilistic interpretation of K-NN decisions
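
The K-NN algorithm named in the list above can be stated in a few lines: classify a query point by a majority vote over the labels of its K closest training points. The sketch below is a minimal illustration in Python, not code from the lecture or readings; the function name knn_predict, the toy data, and the choice of Euclidean distance are assumptions made for the example.

import math
from collections import Counter

def knn_predict(train_points, train_labels, query, k=3):
    # Illustrative K-NN sketch (not course-provided code).
    # Distance from the query to every training point (Euclidean, an assumed choice).
    distances = [(math.dist(query, p), label)
                 for p, label in zip(train_points, train_labels)]
    # Keep the k closest neighbors.
    nearest = sorted(distances, key=lambda d: d[0])[:k]
    # Majority vote over the neighbors' labels decides the prediction.
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy usage with two made-up 2-D clusters.
X = [(1.0, 1.0), (1.2, 0.8), (0.9, 1.1), (5.0, 5.0), (5.1, 4.9), (4.8, 5.2)]
y = ["a", "a", "a", "b", "b", "b"]
print(knn_predict(X, y, (1.1, 0.9), k=1))  # -> a
print(knn_predict(X, y, (4.9, 5.0), k=3))  # -> b

The value of K in this sketch is also where the overfitting/underfitting trade-off listed above shows up: K=1 follows the training data's local quirks closely, while a larger K averages over more neighbors and yields a smoother decision boundary.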