10-601 DR
From Cohen Courses
This is a lecture used in the Syllabus for Machine Learning 10-601 in Fall 2014.
Slides

Slides in PDF: http://curtis.ml.cmu.edu/w/courses/images/a/a7/Lecture14-clustering.pdf
Readings
Bishop's Chapter 9
Mitchell 6.12 also has a nice description of EM and k-means.
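For quick reference, below is a minimal k-means sketch in Python (NumPy only). It is an illustrative sketch of the algorithm the readings describe, not code from the course; the function name and parameters are made up for this example.

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Toy k-means sketch: X is an (n_points, n_features) array."""
    rng = np.random.default_rng(seed)
    # Initialize centers by picking k distinct data points at random.
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: each point joins its nearest center.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each center moves to the mean of its assigned points
        # (keep the old center if a cluster ends up empty).
        new_centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers
```

k-means can be read as a hard-assignment special case of EM for a Gaussian mixture, which is the connection between the two algorithms that the readings above describe.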
What You Should Know Afterward
- Linkage methods
- Bottom-up vs. top-down clustering (a bottom-up sketch appears after this list)
- Run time of different clustering methods
- Assumptions, advantages, and disadvantages of each method
- (optional) Optimal leaf ordering
- (optional) Spectral clustering
- (optional) Self-organizing maps
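To make the first two items above concrete, here is a toy bottom-up (agglomerative) clustering sketch in Python showing how the linkage choice (single vs. complete) defines the cluster-to-cluster distance. It is a naive, roughly O(n^3) illustration with made-up names, not the implementations discussed in the lecture.

```python
import numpy as np

def agglomerate(X, n_clusters, linkage="single"):
    """Toy bottom-up clustering: start with one cluster per point, merge greedily."""
    clusters = [[i] for i in range(len(X))]
    # Precompute all pairwise point distances.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)

    def cluster_dist(a, b):
        pair = D[np.ix_(a, b)]
        # The linkage rule defines the distance between two clusters:
        # single linkage = closest pair of points, complete linkage = farthest pair.
        return pair.min() if linkage == "single" else pair.max()

    while len(clusters) > n_clusters:
        # Merge the two closest clusters under the chosen linkage.
        i, j = min(
            ((a, b) for a in range(len(clusters)) for b in range(a + 1, len(clusters))),
            key=lambda ab: cluster_dist(clusters[ab[0]], clusters[ab[1]]),
        )
        clusters[i].extend(clusters[j])
        del clusters[j]
    return clusters
```

Top-down (divisive) clustering instead starts with all points in one cluster and recursively splits it; the run-time and assumption trade-offs between these families are what the list above refers to.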