10-601 Bias-Variance

Slides

Readings

  • Bishop: Chap 1, 2
  • Mitchell: Chap 5, 6

What you should know

  • How overfitting/underfitting can be understood as a tradeoff between high-bias and high-variance learners.
  • Mathematically, how to decompose the expected error of linear regression into bias and variance (a worked decomposition follows this list).
  • Intuitively, how classification error can be decomposed into bias and variance.
  • Which sorts of classifier variants lead to more bias and/or more variance: e.g., large vs. small k in k-NN (see the simulation sketch after this list).
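
A worked version of the regression decomposition (a sketch added here, not from the course page; the notation f, ε, σ², D, and f̄ is all assumed): take y = f(x) + ε with E[ε] = 0 and Var(ε) = σ², let f̂ be the estimator fit to a random training set D, and write f̄(x) = E_D[f̂(x)] for the average fit. The expected squared error at a point x then splits into three terms:

  E_{D,\varepsilon}\!\left[\big(y - \hat{f}(x)\big)^2\right]
    = \underbrace{\big(f(x) - \bar{f}(x)\big)^2}_{\text{bias}^2}
    + \underbrace{E_D\!\left[\big(\hat{f}(x) - \bar{f}(x)\big)^2\right]}_{\text{variance}}
    + \underbrace{\sigma^2}_{\text{irreducible noise}}

The cross terms vanish because ε has mean zero and is independent of D; that one observation is most of the derivation.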
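
The k-NN bullet can be made concrete with a small simulation (an illustrative sketch, not course material; the sine target, noise level, sample sizes, and the knn_predict helper are all assumptions, and regression stands in for classification because squared error decomposes cleanly): refit k-NN on many freshly drawn training sets, then measure at fixed test points how far the average prediction sits from the true function (bias) and how much predictions scatter across training sets (variance).

  # Empirical bias^2/variance of k-NN regression, estimated by refitting
  # on many independent training sets (all settings are illustrative).
  import numpy as np

  rng = np.random.default_rng(0)

  def f(x):
      return np.sin(2 * np.pi * x)           # assumed true function

  x_test = np.linspace(0.05, 0.95, 20)       # fixed evaluation points

  def knn_predict(x_train, y_train, x_query, k):
      # Plain 1-D k-NN regression: average y over the k nearest x's.
      d = np.abs(x_train[None, :] - x_query[:, None])
      idx = np.argsort(d, axis=1)[:, :k]
      return y_train[idx].mean(axis=1)

  for k in (1, 15):
      preds = []
      for _ in range(500):                   # 500 independent training sets
          x_tr = rng.uniform(0.0, 1.0, 50)
          y_tr = f(x_tr) + rng.normal(0.0, 0.3, 50)
          preds.append(knn_predict(x_tr, y_tr, x_test, k))
      preds = np.asarray(preds)              # shape (500, len(x_test))
      bias2 = np.mean((preds.mean(axis=0) - f(x_test)) ** 2)
      var = np.mean(preds.var(axis=0))
      print(f"k={k:2d}  bias^2={bias2:.4f}  variance={var:.4f}")

With these settings, k = 1 typically comes out with near-zero bias but high variance, and k = 15 the reverse: averaging over more neighbors smooths the fit (more bias) but stabilizes it across training sets (less variance).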