10-601B Kernelized SVMs


This is a lecture in the Syllabus for Machine Learning 10-601B in Spring 2016.

Slides

Readings

What You Should Know Afterward

  • The definitions of, and intuitions behind, these concepts:
    • The margin of a classifier relative to a dataset.
    • What a constrained optimization problem is.
    • The primal form of the SVM optimization problem.
    • The dual form of the SVM optimization problem (both forms are sketched after this list).
  • What a support vector is.
  • What slack variables are and why and when they are used in SVMs.
  • How to explain the different parts (constraints, optimization criteria) of the primal and dual forms for the SVM.
  • How to kernelize an SVM (see the code sketch after this list).
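
For reference, one common way to write the soft-margin SVM in primal and dual form (a sketch only; notation and constant conventions vary between textbooks and may differ from the lecture slides):

Primal form:

\[
\min_{w,\, b,\, \xi} \;\; \tfrac{1}{2}\|w\|^2 + C \sum_{i=1}^{n} \xi_i
\quad \text{subject to} \quad
y_i \big(w^\top x_i + b\big) \ge 1 - \xi_i, \;\; \xi_i \ge 0, \;\; i = 1, \dots, n
\]

Dual form:

\[
\max_{\alpha} \;\; \sum_{i=1}^{n} \alpha_i - \tfrac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j y_i y_j \, x_i^\top x_j
\quad \text{subject to} \quad
0 \le \alpha_i \le C, \;\; \sum_{i=1}^{n} \alpha_i y_i = 0
\]

The slack variables \(\xi_i\) allow margin violations, \(C\) trades off margin width against those violations, and the support vectors are the training points with \(\alpha_i > 0\). Kernelizing the SVM amounts to replacing each inner product \(x_i^\top x_j\) in the dual with a kernel evaluation \(K(x_i, x_j)\).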
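Below is a minimal sketch of a kernelized SVM in Python using scikit-learn. This is an illustration, not the course's reference implementation; the dataset and hyperparameter values are arbitrary choices for demonstration.

# Minimal sketch of a kernelized soft-margin SVM with scikit-learn
# (illustrative only; dataset and hyperparameters are arbitrary).
import numpy as np
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# A dataset that is not linearly separable in the original feature space.
X, y = make_circles(n_samples=200, noise=0.1, factor=0.3, random_state=0)

# Linear SVM: corresponds to using the plain inner product x_i^T x_j.
linear_svm = SVC(kernel="linear", C=1.0).fit(X, y)

# Kernelized SVM: the dual depends on the data only through inner products,
# so replacing them with K(x_i, x_j) (here an RBF kernel) yields a nonlinear
# decision boundary without computing an explicit feature map.
rbf_svm = SVC(kernel="rbf", C=1.0, gamma=2.0).fit(X, y)

print("linear training accuracy:", linear_svm.score(X, y))
print("rbf training accuracy:   ", rbf_svm.score(X, y))

# The support vectors are the training points with nonzero dual variables;
# only they determine the learned decision function.
print("number of support vectors:", len(rbf_svm.support_))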