10-601 DR

From Cohen Courses
This is a lecture used in the [[Syllabus for Machine Learning 10-601 in Fall 2014]]

=== Slides ===

[http://curtis.ml.cmu.edu/w/courses/images/6/6f/Lecture15-DR.pdf Slides in PDF (dimensionality reduction)]

[http://curtis.ml.cmu.edu/w/courses/images/a/a7/Lecture14-clustering.pdf Slides in PDF (clustering)]
  
 
=== Readings ===

* PCA
** Bishop's 12.1
* Manifold learning
** ISOMAP [http://web.mit.edu/cocosci/Papers/sci_reprint.pdf A Global Geometric Framework for Nonlinear Dimensionality Reduction]
** LLE [http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=C34DA2B1646F1DE3B4B4CE4DCB21E317?doi=10.1.1.111.3313&rep=rep1&type=pdf Nonlinear Dimensionality Reduction by Locally Linear Embedding]
* Clustering
** Bishop's Chapter 9
** Mitchell 6.12 also has a nice description of EM and k-means.
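To make the PCA reading concrete, here is a minimal sketch (not course code; the function name and NumPy usage are illustrative choices) of the standard recipe from Bishop 12.1: center the data, take an SVD, and use the top right-singular vectors as the principal directions.

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the top-k principal components.

    X: (n, d) data matrix. Returns (scores, components), where
    scores is (n, k) and components is (k, d)."""
    Xc = X - X.mean(axis=0)                 # center each feature
    # SVD of the centered data: rows of Vt are the principal axes,
    # ordered by decreasing singular value (i.e., explained variance)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                     # (k, d) top-k directions
    scores = Xc @ components.T              # (n, k) low-dimensional coordinates
    return scores, components
```

For example, applying `pca(X, 2)` to 3-D points that lie near a plane recovers 2-D coordinates within that plane; the returned directions are orthonormal by construction.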
=== What You Should Know Afterward ===

* Linkage methods
* Bottom up vs. top down clustering
* Run time of different clustering methods
* Assumptions, advantages and disadvantages of each method
* (optional) Optimal leaf ordering
* (optional) Spectral clustering
* (optional) Self organizing maps

Latest revision as of 10:08, 12 August 2014
