Syllabus for Machine Learning 10-601B in Spring 2016
From Cohen Courses
Revision as of 16:47, 6 January 2016
This is the syllabus for Machine Learning 10-601 in Spring 2016.
Schedule
In progress....
Teaching team: also see the Google Doc Spreadsheet
| Date | Main Topic of Lecture | Lecturer | Assignment | TAs |
|---|---|---|---|---|
| M 1/11 | Course Overview | Nina | | |
| W 1/13 | Intro to Probability | William | HW1 will be similar to this | Will, Han |
| M 1/18 | Martin Luther King Day (no class) | | | |
| W 1/20 | The Naive Bayes algorithm | William | HW2: implementing naive Bayes | Travis, Maria |
| M 1/25 | Logistic Regression | William | | |
| W 1/27 | Perceptrons and SVMs | William | | |
| M 2/1 | Kernels | Nina | | |
| W 2/3 | Linear Regression | Nina | HW3: implementing logistic regression and kernel perceptrons | Tianshu, Will |
| M 2/8 | Neural Networks 1 | Nina | | |
| W 2/10 | Neural Networks 2 | Nina | | |
| M 2/15 | Boosting and Other Ensembles | Nina | | |
| W 2/17 | Theory 1 | Nina | HW4: Theory | |
| M 2/22 | Theory 2 | Nina | | |
| W 2/24 | Midterm Review | Nina | | |
| M 2/29 | Midterm exam | | | |
| W 3/2 | Unsupervised Learning: k-Means and Mixtures | Nina | | |
| M 3/7 | Semi-Supervised Learning | Nina | | |
| W 3/9 | Active Learning | Nina | HW5: Active learning and clustering | Travis, Han |
| M 3/14 | Spring break (no class) | | | |
| W 3/16 | Spring break (no class) | | | |
| M 3/21 | Graphical Models 1 | William | | |
| W 3/23 | Graphical Models 2 | William | | |
| M 3/28 | Graphical Models 3 | William | HW6: Graphical models | Maria, Renato |
| M 4/4 | Topic Models | William | | |
| W 4/6 | Dimension Reduction | William | | |
| M 4/11 | Collaborative Filtering and Matrix Factorization | William | | |
| W 4/13 | Deep Learning 1 | William | | |
| M 4/18 | Deep Learning 2 | William | | |
| W 4/20 | Reinforcement Learning | Nina | | |
| M 4/25 | Review | Nina and William | | |
| W 4/27 | Final exam | | | |
To other instructors: if you'd like to use any of the materials found here, you're absolutely welcome to do so, but please acknowledge their ultimate source somewhere.
Section-by-Section
Linear Classifiers
A probabilistic view of linear classification:
Another view of classification:
- 10-601 Introduction to Linear Algebra
- 10-601 Perceptrons and Voted Perceptrons
- 10-601 Voted Perceptrons and Support Vector Machines
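As a rough illustration of the perceptron material in the lectures above, here is a minimal sketch of the classic mistake-driven perceptron (not the course's reference implementation; function names, the `epochs` parameter, and the plain-list representation of feature vectors are all assumptions made for this example, and labels are assumed to be in {-1, +1}):

```python
def perceptron_train(examples, epochs=10):
    """Train a linear classifier by the perceptron mistake-driven update.

    examples: list of (features, label) pairs, features a list of floats,
    label in {-1, +1}. Returns a weight vector and bias.
    """
    if not examples:
        return [], 0.0
    n = len(examples[0][0])
    w = [0.0] * n   # weight vector, one entry per feature
    b = 0.0         # bias term
    for _ in range(epochs):
        for x, y in examples:
            # Score the example with the current weights.
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            # On a mistake (or a zero score), move the hyperplane toward y.
            if y * score <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
    return w, b

def perceptron_predict(w, b, x):
    """Predict +1 or -1 for feature vector x under weights (w, b)."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score >= 0 else -1
```

The voted and averaged variants discussed in the lectures differ mainly in keeping every intermediate weight vector (or their running average) rather than only the final one, which tends to generalize better on non-separable data.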
Summary: