10-601 SVMS
This is a lecture used in the Syllabus for Machine Learning 10-601.
Slides
Readings
Assignment
- None
What You Should Know Afterward
- The definitions of, and intuitions behind, these concepts (the standard forms are sketched after this list):
- The margin of a classifier relative to a dataset.
- What a constrained optimization problem is.
- The primal form of the SVM optimization problem.
- The dual form of the SVM optimization problem.
- What a support vector is.
- What a kernel function is.
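For reference, here is a minimal sketch of the standard forms these terms name, assuming the usual notation of a training set (x_i, y_i) with labels y_i in {-1, +1}; this notation is an assumption about the lecture's conventions, not taken from the slides.

$$\text{margin of a separating hyperplane } (w, b):\quad \gamma = \min_i \frac{y_i\,(w^\top x_i + b)}{\|w\|}$$

$$\text{primal form (hard-margin SVM):}\quad \min_{w,\,b}\ \tfrac{1}{2}\|w\|^2 \quad\text{s.t.}\quad y_i\,(w^\top x_i + b) \ge 1 \ \ \forall i$$

$$\text{dual form:}\quad \max_{\alpha \ge 0}\ \sum_i \alpha_i - \tfrac{1}{2}\sum_{i,j} \alpha_i \alpha_j\, y_i y_j\, x_i^\top x_j \quad\text{s.t.}\quad \sum_i \alpha_i y_i = 0$$

The support vectors are the training points whose dual variables satisfy \alpha_i > 0, and a kernel function K(x_i, x_j) replaces the inner product x_i^\top x_j in the dual, which is what lets the SVM learn nonlinear decision boundaries.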