Class meeting for 10-605 Parallel Perceptrons 2

From Cohen Courses
Revision as of 15:41, 1 August 2017 by Wcohen

This is one of the class meetings on the schedule for the course Machine Learning with Large Datasets 10-605 in Fall 2016.

Slides

Perceptrons, continued:

Parallel perceptrons with iterative parameter mixing:

Readings for the Class

Optional Readings

What you should remember

  • The averaged perceptron and the voted perceptron
  • Approaches to parallelizing perceptrons (and other on-line learning methods, like SGD)
    • Parameter mixing
    • Iterative parameter mixing (IPM)
  • The meaning and implications of the convergence theorems for the basic perceptron and for the IPM version
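The averaged perceptron and iterative parameter mixing listed above can be sketched in a few lines of Python. This is a minimal illustration, not the course's reference code: the toy dataset, the uniform mixing weights, and the sequential simulation of the shard workers are all illustrative assumptions.

```python
# Sketch of two ideas from the lecture: the averaged perceptron and
# iterative parameter mixing (IPM). Dataset and epoch counts are toy choices.

def perceptron_epoch(w, shard):
    """One pass of the basic perceptron update over a data shard."""
    w = list(w)
    for x, y in shard:
        if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:
            w = [wi + y * xi for wi, xi in zip(w, x)]
    return w

def averaged_perceptron(data, dim, epochs=10):
    """Return the average of the weight vectors held after each example."""
    w = [0.0] * dim
    total = [0.0] * dim
    steps = 0
    for _ in range(epochs):
        for x, y in data:
            if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
            total = [t + wi for t, wi in zip(total, w)]
            steps += 1
    return [t / steps for t in total]

def ipm_train(shards, dim, epochs=10):
    """Iterative parameter mixing: each epoch, every shard runs one local
    perceptron pass starting from the current mixed weights, then the shard
    weights are averaged (a uniform mixture). Shards run sequentially here;
    in the real setting each shard would be a parallel worker."""
    w = [0.0] * dim
    for _ in range(epochs):
        local = [perceptron_epoch(w, shard) for shard in shards]
        w = [sum(ws[i] for ws in local) / len(local) for i in range(dim)]
    return w

def errors(w, data):
    """Count training examples the weight vector misclassifies."""
    return sum(1 for x, y in data
               if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0)

# Toy linearly separable 2-D data with a constant bias feature; labels +1/-1.
data = [((1.0, 2.0, 1.0), 1), ((2.0, 1.5, 1.0), 1),
        ((-1.0, -1.0, 1.0), -1), ((-2.0, -0.5, 1.0), -1)]

w_avg = averaged_perceptron(data, dim=3)
w_ipm = ipm_train([data[:2], data[2:]], dim=3)
avg_errors = errors(w_avg, data)
ipm_errors = errors(w_ipm, data)
```

On separable data like this toy set, both the averaged weights and the IPM-mixed weights end up classifying every training example correctly, which is the behavior the convergence theorems above guarantee for the basic and IPM perceptrons.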