10-601B Clustering

From Cohen Courses

This is a pair of lectures used in the Syllabus for Machine Learning 10-601B in Spring 2016.

=== Slides ===

* [http://curtis.ml.cmu.edu/w/courses/images/5/5d/Clustering.pdf Slides in pdf]
* [http://curtis.ml.cmu.edu/w/courses/images/f/fb/Clustering.pptx Slides in ppt]

=== Readings ===

* Murphy 25.5

=== What You Should Know Afterward ===

* Partitional Clustering: k-means and k-means++ (a minimal Lloyd's/k-means++ sketch follows this list)
** Lloyd's method
** Initialization techniques (random, furthest traversal, k-means++)
* Hierarchical Clustering (a minimal single/complete-linkage sketch follows this list)
** Single linkage, Complete linkage
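The partitional items above can be made concrete in code. The sketch below is not from the lecture materials; it is a minimal NumPy illustration of Lloyd's method seeded with k-means++, assuming the data arrive as an array X of shape (n_points, n_features), with all function names invented for illustration.

<pre>
# Minimal sketch (not the course's reference code): Lloyd's method with
# k-means++ seeding. Assumes X is a NumPy array of shape (n_points, n_features).
import numpy as np

def kmeans_pp_init(X, k, rng):
    """k-means++ seeding: each new center is drawn with probability
    proportional to its squared distance from the nearest chosen center."""
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        d2 = np.min(((X[:, None, :] - np.array(centers)[None, :, :]) ** 2).sum(-1), axis=1)
        centers.append(X[rng.choice(len(X), p=d2 / d2.sum())])
    return np.array(centers)

def lloyd(X, k, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = kmeans_pp_init(X, k, rng)
    for _ in range(n_iter):
        # Assignment step: each point joins its nearest center.
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(axis=1)
        # Update step: each center moves to the mean of its assigned points
        # (empty clusters keep their old center in this toy version).
        new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels
</pre>

Random initialization would replace the seeding step with k points drawn uniformly from X; furthest traversal would instead pick each new center as the point farthest from the centers chosen so far.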

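Likewise, a minimal sketch (again not from the slides; names are hypothetical) of naive bottom-up agglomerative clustering, where single and complete linkage differ only in whether the distance between two clusters is the minimum or the maximum pairwise point distance:

<pre>
# Minimal sketch of naive agglomerative clustering with single or complete
# linkage. Assumes X is a small NumPy array of shape (n_points, n_features).
import numpy as np

def agglomerative(X, n_clusters, linkage="single"):
    # Start with every point in its own cluster (lists of point indices).
    clusters = [[i] for i in range(len(X))]
    # Pairwise squared Euclidean distances between individual points.
    D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)

    def cluster_dist(a, b):
        block = D[np.ix_(a, b)]
        # Single linkage: closest pair of points across the two clusters;
        # complete linkage: farthest pair.
        return block.min() if linkage == "single" else block.max()

    while len(clusters) > n_clusters:
        # Merge the two clusters that are closest under the chosen linkage.
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: cluster_dist(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters
</pre>

This version recomputes linkages from scratch at every merge, so it is cubic (or worse) in the number of points; it is only meant to show how the two linkage criteria differ.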