10-601 Linear Regression
This is a lecture in the syllabus for Machine Learning 10-601B in Spring 2016
Slides
- William's lecture: slides in PowerPoint and in PDF.
- Side note: the bias-variance decomposition (restated below for reference).
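For quick reference, here is a standard statement of the decomposition; the notation is mine, not necessarily the slides'. With y = f(x) + ε, E[ε] = 0, Var(ε) = σ², and f̂_D the model fit to a random training set D, the expected squared error at a fixed input x splits into squared bias, variance, and irreducible noise:

```latex
\mathbb{E}_{D,\epsilon}\!\left[\left(y - \hat{f}_D(x)\right)^2\right]
  = \underbrace{\left(f(x) - \mathbb{E}_D[\hat{f}_D(x)]\right)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}_D\!\left[\left(\hat{f}_D(x) - \mathbb{E}_D[\hat{f}_D(x)]\right)^2\right]}_{\text{variance}}
  + \underbrace{\sigma^2}_{\text{noise}}
```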
Readings
- Mitchell 4.1-4.3
- Murphy: 7.1-7.3, 7.5.1
- Optional:
  - Bishop 3.1
  - There's also a nice but somewhat less technical video lecture on overfitting and bias-variance: https://www.youtube.com/watch?v=DQWI1kvmwRg
What You Should Know Afterward
- Regression vs. classification
- Solving regression problems with one and two variables
- The ordinary least squares (OLS) solution (aka the normal equations) to linear regression problems; a sketch appears after this list
- The gradient descent approach to linear regression; see the second sketch below
- Data transformations and their impact on how linear regression is solved and on the expressiveness of LR models; see the third sketch below
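As a concrete illustration of the OLS solution, here is a minimal NumPy sketch, not taken from the lecture: the two-feature synthetic data and the names (true_w, w_ols) are invented purely for illustration. It solves the normal equations X^T X w = X^T y directly:

```python
import numpy as np

# Hypothetical synthetic data: a two-variable regression problem.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))                # 100 examples, 2 features
X = np.hstack([np.ones((100, 1)), X])        # prepend a bias column
true_w = np.array([1.0, 2.0, -3.0])          # intercept and two slopes
y = X @ true_w + 0.1 * rng.normal(size=100)  # noisy targets

# OLS via the normal equations: solve X^T X w = X^T y.
# np.linalg.solve is numerically preferable to explicitly inverting X^T X.
w_ols = np.linalg.solve(X.T @ X, X.T @ y)
print(w_ols)  # should be close to true_w
```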
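For contrast, a sketch of batch gradient descent on the same squared-error objective. The learning rate and iteration count are arbitrary illustrative choices, not values from the lecture:

```python
import numpy as np

def gd_linear_regression(X, y, lr=0.1, n_iters=500):
    """Minimize the mean squared error (1/2n)||Xw - y||^2 by batch gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y) / n  # gradient of the objective at the current w
        w -= lr * grad
    return w

# Reusing the synthetic (X, y) from the previous sketch, gd_linear_regression(X, y)
# converges to roughly the same weights as the normal-equation solution.
```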
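Finally, a sketch of how a data transformation changes expressiveness: expanding a scalar input into polynomial features keeps the model linear in the weights, so the same OLS machinery applies, yet the fitted function is nonlinear in x. The sine-plus-noise data and the cubic degree are my own illustrative choices:

```python
import numpy as np

# Hypothetical nonlinear data: y = sin(3x) + noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200)
y = np.sin(3 * x) + 0.1 * rng.normal(size=200)

# Transform x into polynomial features [1, x, x^2, x^3]. The model is still
# linear in w, so the normal equations solve it exactly as before.
Phi = np.vander(x, N=4, increasing=True)
w = np.linalg.solve(Phi.T @ Phi, Phi.T @ y)
print(w)  # weights of the cubic fit
```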