10-601 DR

This is a lecture used in the [[Syllabus for Machine Learning 10-601 in Fall 2014]].
 
=== Slides ===
  
[http://curtis.ml.cmu.edu/w/courses/images/a/a7/Lecture14-clustering.pdf Slides in PDF]
 
=== Readings ===
 
Bishop's Chapter 9
  
Mitchell 6.12 also has a nice description of EM and k-means.
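
For the k-means side of those readings, here is a minimal sketch of the usual alternation between an assignment step and a mean-update step; the function name, data, and settings below are illustrative, not taken from Bishop or Mitchell.

<pre>
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Toy k-means: alternate between assigning each point to its nearest
    centroid and moving each centroid to the mean of its assigned points."""
    rng = np.random.default_rng(seed)
    # Initialize centroids by picking k distinct data points at random.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: label each point with the index of its closest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: recompute each centroid as the mean of its cluster
        # (keep the old centroid if a cluster ends up empty).
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):  # converged
            break
        centroids = new_centroids
    return centroids, labels

# Usage: two well-separated Gaussian blobs should come back as two clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(8, 1, (50, 2))])
centroids, labels = kmeans(X, k=2)
print(centroids)
</pre>

EM for a Gaussian mixture (Bishop's Chapter 9) follows the same alternation, but with soft responsibilities in place of hard assignments, and with covariances and mixing weights updated alongside the means.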
  
=== What You Should Know Afterward ===
  
* Linkage methods (illustrated in the sketch after this list)
* Bottom-up vs. top-down clustering
* Run time of different clustering methods
* Assumptions, advantages, and disadvantages of each method
* (optional) Optimal leaf ordering
* (optional) Spectral clustering
* (optional) Self-organizing maps
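
To make the first two bullets concrete, here is a minimal sketch of naive bottom-up (agglomerative) clustering with a pluggable linkage function; the naive pair search also makes the run-time point visible, since it is roughly cubic in the number of points. The function name, data, and parameter choices are illustrative, not part of the lecture materials.

<pre>
import numpy as np

def agglomerative(X, k, linkage="single"):
    """Naive bottom-up clustering: start with one cluster per point and
    repeatedly merge the closest pair of clusters until k remain.

    The linkage choice defines the distance between two clusters:
      - single:   min pairwise distance (tends to produce chains)
      - complete: max pairwise distance (tends to produce compact clusters)
      - average:  mean pairwise distance
    Recomputing cluster distances every round makes this roughly O(n^3);
    smarter implementations (e.g. with priority queues) do better.
    """
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # point-to-point distances
    clusters = [[i] for i in range(len(X))]

    def cluster_dist(a, b):
        sub = D[np.ix_(a, b)]
        if linkage == "single":
            return sub.min()
        if linkage == "complete":
            return sub.max()
        return sub.mean()  # average linkage

    while len(clusters) > k:
        # Find and merge the closest pair of clusters under the chosen linkage.
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda p: cluster_dist(clusters[p[0]], clusters[p[1]]))
        clusters[i].extend(clusters[j])
        del clusters[j]
    return clusters

# Usage: two obvious groups of 2-D points come out as two clusters of indices.
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [5.0, 5.0], [5.1, 5.2], [5.2, 5.1]])
print(agglomerative(X, k=2, linkage="complete"))
</pre>

Top-down (divisive) clustering works the other way around, starting from a single cluster containing everything and recursively splitting it.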
