Draft schedule for 10-601B in Spring 2016
This is NOT the syllabus for Machine Learning 10-601 in Spring 2016. It's just a draft....
Schedule
Date | Main Topic of Lecture | Lecturer | Assignment | TAs |
---|---|---|---|---|
M 1/11 | Course Overview | Nina | | |
W 1/13 | Intro to Probability | William | HW1: Probabilities | Will, Han |
M 1/18 | Martin Luther King Day | | | |
W 1/20 | The Naive Bayes algorithm | William | HW2: implementing naive Bayes | Travis, Maria |
M 1/25 | Logistic Regression | William | | |
W 1/27 | Linear Regression | William | | |
M 2/1 | Perceptrons and SVMs | Nina | | |
W 2/3 | Kernels | Nina | HW3: implementing logistic regression and linear regression | Tianshu, Will |
M 2/8 | Neural Networks and Backprop | Nina | | |
W 2/10 | Decision Trees and Rules | Nina | | |
M 2/15 | Boosting and Other Ensembles | Nina | | |
W 2/17 | Theory 1 | Nina | HW4: Theory | Han, Tianshu |
M 2/22 | Theory 2 | Nina | | |
W 2/24 | Midterm Review | Nina | | |
M 2/29 | Midterm exam | | | |
W 3/2 | Unsupervised Learning: k-Means and Mixtures | Nina | | |
M 3/7 | Semi-Supervised Learning | Nina | | |
W 3/9 | Active Learning | Nina | HW5: Active learning and clustering | Travis, Han |
M 3/14 | Spring break | | | |
W 3/16 | Spring break | | | |
M 3/21 | Graphical Models 1 | William | | |
W 3/23 | Graphical Models 2 | William | | |
M 3/28 | Graphical Models for Sequential Data | William | HW6: Graphical models | Maria, Renato |
M 4/4 | Topic Models | William | | |
W 4/6 | Deep Learning 1 | William | | |
M 4/11 | Deep Learning 2 | William | | |
W 4/13 | PCA and Dimension Reduction | William | HW7: Deep learning | TBA |
M 4/18 | Matrix Factorization and Collaborative Filtering | William | | |
W 4/20 | Reinforcement Learning | Nina | | |
M 4/25 | Review | Nina and William | | |
W 4/27 | Final exam | | | |
To other instructors: you're welcome to use any of the materials found here, but please acknowledge their original source.