10-601 Matrix Factorization

From Cohen Courses

This is a lecture used in the Syllabus for Machine Learning 10-601B in Spring 2016.
 
=== Slides ===

* [http://www.cs.cmu.edu/~wcohen/10-601/pca+mf.ppt Slides in PowerPoint], [http://www.cs.cmu.edu/~wcohen/10-601/pca+mf.pdf Slides in PDF].
  
 
=== Readings ===


=== Summary ===

You should know:

* What PCA is, and how it relates to matrix factorization (see the first sketch below).
* What loss function and constraints are associated with PCA - i.e., what the "PCA Problem" is.
* How to interpret the low-dimensional embedding of instances, and the "prototypes" produced by PCA and MF techniques.
** How to interpret the prototypes in the case of dimension reduction for images.
** How to interpret the prototypes in the case of collaborative filtering, and completion of a ratings matrix (see the second sketch below).
* How PCA and MF relate to k-means and EM.
* The differences/similarities between PCA and SVD.
* The connection between SVD and LSI.
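
To make the PCA-as-matrix-factorization view concrete, here is a minimal NumPy sketch (not part of the lecture materials; the names X, Z, V, and k are illustrative). It computes a rank-k factorization of the centered data matrix via the SVD and forms the rank-k reconstruction whose squared error is exactly the loss the "PCA problem" minimizes.

<pre>
# Minimal sketch of PCA as a rank-k matrix factorization (illustrative names).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))   # 100 instances, 20 features
k = 3                            # target dimension

# Center the data: PCA factors the *centered* matrix.
mu = X.mean(axis=0)
Xc = X - mu

# Thin SVD: Xc = U diag(s) Vt, singular values in decreasing order.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

Z = U[:, :k] * s[:k]   # low-dimensional embedding (scores) of each instance
V = Vt[:k, :]          # k "prototypes" / principal directions (loadings)

# Best rank-k approximation of Xc in squared error.
Xc_hat = Z @ V
print("reconstruction error:", np.linalg.norm(Xc - Xc_hat) ** 2)
</pre>

The rows of V play the role of prototypes: for image data, each row can be reshaped back into an image and viewed directly (as in eigenfaces-style dimension reduction).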
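For the collaborative-filtering case, the second sketch below (again illustrative, not from the slides; R, U, V, the learning rate, and the regularizer are assumed choices) fits a rank-k factorization R ≈ U V^T with stochastic gradient steps on the squared error over the observed entries only; the product U V^T then fills in the missing ratings, and the rows of V can be read as item prototypes.

<pre>
# Minimal sketch of matrix factorization for completing a ratings matrix
# (toy data; loss, learning rate, and regularization are illustrative).
import numpy as np

rng = np.random.default_rng(1)
n_users, n_items, k = 50, 40, 5
R = rng.integers(1, 6, size=(n_users, n_items)).astype(float)  # toy ratings 1-5
observed = rng.random((n_users, n_items)) < 0.2                # ~20% entries observed

U = 0.1 * rng.normal(size=(n_users, k))   # user embeddings
V = 0.1 * rng.normal(size=(n_items, k))   # item "prototypes" / embeddings

lr, reg = 0.02, 0.05
rows, cols = np.nonzero(observed)
for epoch in range(30):
    for i, j in zip(rows, cols):
        err = R[i, j] - U[i] @ V[j]              # error on one observed rating
        U[i] += lr * (err * V[j] - reg * U[i])   # SGD step on the regularized squared loss
        V[j] += lr * (err * U[i] - reg * V[j])

R_hat = U @ V.T   # predictions for all entries, including the missing ones
print("train RMSE:", np.sqrt(np.mean((R - R_hat)[observed] ** 2)))
</pre>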