Difference between revisions of "Syllabus for Machine Learning with Large Datasets 10-605 in Spring 2013"

 
* Mon Apr 15. [[Class meeting for 10-605 2013 04 15|Scaling up decision tree learning]]
** '''Project progress report due'''
* Wed Apr 17. [[Class meeting for 10-605 2013 04 17|Gradient boosting with trees, and SGD for matrix factorization]] (an illustrative SGD sketch follows this list)
** '''Assignment due: K-Means on MapReduce.'''
** ''New Assignment: Multi-class image classification or scalable classification using a linear classifier.'' Both of these count as one assignment toward your six.
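
For readers who want a concrete picture of the Apr 17 topic, here is a minimal, single-machine sketch of matrix factorization trained with stochastic gradient descent. It is illustrative only, not course-provided code, and it does not show the distributed variants discussed in class; the function name and parameters are made up for this example.

<pre>
import numpy as np

def sgd_matrix_factorization(ratings, n_rows, n_cols, rank=10,
                             lr=0.01, reg=0.05, n_epochs=20, seed=0):
    # Factor a sparse matrix R (given as (i, j, value) triples) as R ~ W H^T
    # by stochastic gradient descent over the observed entries only.
    rng = np.random.default_rng(seed)
    W = 0.1 * rng.standard_normal((n_rows, rank))   # row (e.g., user) factors
    H = 0.1 * rng.standard_normal((n_cols, rank))   # column (e.g., item) factors
    ratings = list(ratings)
    for _ in range(n_epochs):
        rng.shuffle(ratings)                        # visit observed cells in random order
        for i, j, r in ratings:
            w_i = W[i].copy()                       # keep old value for H's update
            err = r - w_i @ H[j]                    # residual on this observed cell
            W[i] += lr * (err * H[j] - reg * w_i)   # gradient step with L2 regularization
            H[j] += lr * (err * w_i - reg * H[j])
    return W, H

# Toy usage: a 3x3 matrix with four observed entries.
obs = [(0, 0, 5.0), (0, 1, 3.0), (1, 1, 4.0), (2, 2, 1.0)]
W, H = sgd_matrix_factorization(obs, n_rows=3, n_cols=3, rank=2)
print(W @ H.T)   # observed cells should be approximately reconstructed
</pre>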

Revision as of 15:38, 17 April 2013

This is the syllabus for Machine Learning with Large Datasets 10-605 in Spring 2013.

January

February

March

April and May

May

  • 9am, Tuesday, May 7. Project writeups due. Submit a paper to Blackboard as a PDF in the ICML 2013 format (minimum 5 pages, up to 8 pages, double column), except, of course, do not submit it anonymously.
    • Note: this is extended from the previous deadline of Fri May 3, but I can't give any further extensions! Your project report should discuss:
      • The problem you're trying to solve, and why it's important and/or interesting.
      • Related work, especially any related work that you're building on.
      • The data that you're working with.
      • The methods that you're using (in some detail; even if these are off-the-shelf methods, I want to know that you understand them).
      • The experiments you did, the metrics you used to evaluate them, and the results.
      • What was learned from the experiments (the conclusions).
    • You should think of this as an exercise in writing a conference-style paper, so try to write in that style. (Of course, your work doesn't need to advance the state of the art in machine learning, or be highly novel, but it should be well described.)