10-601 PAC

From Cohen Courses
Revision as of 15:23, 9 October 2014

Slides

Readings

  • Mitchell Chapter 7

What you should remember

  • Definition of PAC-learnability
  • Definition of sample complexity vs. time complexity
  • How sample complexity grows with 1/epsilon, 1/delta, and |H|
    • in the noise-free case
    • in the "agnostic" setting, where noise is present and the learner outputs the smallest-error-rate hypothesis
  • The definition of VC-dimension and shattering
  • How VC dimension relates to sample complexity
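To make the growth of sample complexity concrete, here is a small sketch of the standard bounds for a finite hypothesis space H (as developed in Mitchell Chapter 7): in the noise-free (consistent-learner) case, m >= (1/epsilon)(ln|H| + ln(1/delta)); in the agnostic case, m >= (1/(2*epsilon^2))(ln|H| + ln(1/delta)). The function names below are illustrative, not from the course materials.

```python
import math

def sample_complexity_realizable(eps, delta, h_size):
    """Samples sufficient for a consistent learner (c in H, no noise):
    m >= (1/eps) * (ln|H| + ln(1/delta))."""
    return math.ceil((1.0 / eps) * (math.log(h_size) + math.log(1.0 / delta)))

def sample_complexity_agnostic(eps, delta, h_size):
    """Samples sufficient in the agnostic setting (learner outputs the
    smallest-error-rate hypothesis): m >= (1/(2 eps^2)) * (ln|H| + ln(1/delta))."""
    return math.ceil((1.0 / (2.0 * eps ** 2)) * (math.log(h_size) + math.log(1.0 / delta)))
```

Note how the bound grows linearly in ln|H| and ln(1/delta) but much faster in 1/epsilon in the agnostic case (1/epsilon^2 vs. 1/epsilon), since the learner must also absorb noise.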
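Shattering can also be checked directly by brute force: a hypothesis class shatters a point set if every possible labeling of the points is realized by some hypothesis. The sketch below (illustrative code, not from the course) does this for 1-D threshold classifiers h_a(x) = 1 iff x >= a, whose VC dimension is 1: they shatter any single point but no pair of points, since the labeling (1, 0) with x1 < x2 is unachievable.

```python
from itertools import product

def threshold_predict(a, x):
    # Threshold hypothesis: label 1 iff x >= a
    return 1 if x >= a else 0

def shatters(points, candidate_thresholds):
    """Return True if every 0/1 labeling of `points` is realized by some
    threshold in `candidate_thresholds` (a finite stand-in for all of R)."""
    for labeling in product([0, 1], repeat=len(points)):
        realized = any(
            all(threshold_predict(a, x) == y for x, y in zip(points, labeling))
            for a in candidate_thresholds
        )
        if not realized:
            return False
    return True
```

A finite candidate set suffices here because only the thresholds between and around the points matter.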