Syllabus for Machine Learning 10-601B in Spring 2016

This is the syllabus for Machine Learning 10-601 in Spring 2016.

Schedule

Teaching team only: also see the Google Doc Spreadsheet. Students should not try to decipher the scribbles and planning notes on this gdoc; use the schedule below.

{| class="wikitable"
|+ Schedule for 10-601
! Date !! Main Topic of Lecture !! Lecturer !! Assignment !! TAs
|-
| M 1/11 || [[10-601 Course Overview|Course Overview]] || Nina || ||
|-
| W 1/13 || [[10-601 Introduction to Probability|Intro to Probability]] || William || [http://curtis.ml.cmu.edu/w/courses/images/8/88/Homework1.pdf HW1 Background Test] || Will, Han
|-
| M 1/18 ||colspan="4"| ''Martin Luther King Day''
|-
| W 1/20 || [[10-601 Naive Bayes|The Naive Bayes algorithm]] || William || [http://curtis.ml.cmu.edu/w/courses/images/c/c0/10601b-s16-homework2.pdf HW2: implementing naive Bayes] || Travis, Maria
|-
| M 1/25 || [[10-601 Logistic Regression|Logistic Regression]] || William || ||
|-
| W 1/27 || [[10-601 Linear Regression|Linear Regression]] || William || ||
|-
| M 2/1 || [[10-601B Perceptrons and Large Margin|Perceptrons and Large Margin]] || Nina || ||
|-
| W 2/3 || [[10-601B Kernels|Kernels]] || Nina || [http://curtis.ml.cmu.edu/w/courses/images/6/6e/10601-homework-3.pdf HW3: logistic and linear regression] || Tianshu, Will
|-
| M 2/8 || [[10-601B Kernelized SVMs|Kernelized SVMs]] and [[10-601B Intro to neural Networks|Intro to Neural Networks]] || Nina || ||
|-
| W 2/10 || [[10-601B Neural Networks|Neural Networks]] || Nina || ||
|-
| M 2/15 || [[10-601B AdaBoost|AdaBoost]] || Nina || ||
|-
| W 2/17 || [[10-601B Generalization and Overfitting: Sample Complexity Results for Supervised Classification|Generalization and Overfitting: Sample Complexity Results for Supervised Classification]] || Nina || [http://curtis.ml.cmu.edu/w/courses/images/2/25/10601-Homework-4.pdf HW4: SVM, ANN, Boosting] [http://curtis.ml.cmu.edu/w/courses/images/4/44/Hw4_adaboost.zip HW4 code] || Han, Tianshu
|-
| M 2/22 || [[10-601B Generalization and Overfitting: Sample Complexity Results for Supervised Classification 2|Generalization and Overfitting: Sample Complexity Results for Supervised Classification 2]] || Nina || ||
|-
| W 2/24 || [[10-601B Model Selection|Model Selection]] and Midterm Review || Nina || ||
|-
| M 2/29 ||colspan="4"| ''Midterm exam''
|-
| W 3/2 || [[10-601B Clustering|Clustering]] || Nina || ||
|-
| M 3/7 ||colspan="4"| ''Spring break''
|-
| W 3/9 ||colspan="4"| ''Spring break''
|-
| M 3/14 || [[10-601B Active Learning|Active Learning]] || Nina || ||
|-
| W 3/16 || [[10-601B SSL|Semi-Supervised Learning]] || William || [http://curtis.ml.cmu.edu/w/courses/images/c/cf/Homework5.pdf HW5: Active learning and clustering] || Travis, Han
|-
| M 3/21 || [[10-601 GM1|Graphical Models 1]] || William || ||
|-
| W 3/23 || [[10-601 GM2|Graphical Models 2]] || William || ||
|-
| M 3/28 || [[10-601 GM3|Graphical Models 3]] || William || ||
|-
| W 3/30 || [[10-601 Sequences|Graphical Models for Sequential Data]] || William || [http://curtis.ml.cmu.edu/w/courses/images/d/d8/Homework6.pdf HW6: Graphical models] || Maria, Renato
|-
| M 4/4 || [[10-601 Topic Models|Topic Models]] || William || ||
|-
| W 4/6 || [[10-601 Deep Learning 1|Deep Learning 1]] || William || ||
|-
| M 4/11 || [[10-601 Deep Learning 2|Deep Learning 2]] || William || ||
|-
| W 4/13 || [[10-601_PCA|PCA and dimension reduction]] || William || HW7: Deep Learning || Zichao
|-
| M 4/18 || [[10-601_Matrix_Factorization|Matrix Factorization and collaborative filtering]] || William || ||
|-
| W 4/20 || [[10-601 Reinforcement Learning|Reinforcement Learning]] || Nina || ||
|-
| M 4/25 || [[10-601 Review|Review]] || Nina and William || ||
|-
| W 4/27 || ''In-class final exam'' || || ||
|}

To other instructors: if you'd like to use any of the materials found here, you're absolutely welcome to do so, but please acknowledge their ultimate source somewhere.

Note from William to William and Nina: there's a copy of the old draft, with William's slides and notes, here.

Other Lectures