Class meeting for 10-605 Parameter Servers

This is one of the class meetings on the schedule for the course Machine Learning with Large Datasets 10-605 in Fall 2016.

Slides

Quiz

Optional Readings

  • Parameter Server for Distributed Machine Learning: http://www.cs.cmu.edu/~feixia/files/ps.pdf
  • Strategies and Principles of Distributed Machine Learning on Big Data: https://arxiv.org/pdf/1512.09295v1.pdf

Things to remember

  • Architecture of a generic parameter server (PS), with get/put access to parameters (see the first sketch after this list)
  • Pros/cons of asynchronous vs bounded asynchronous vs fully synchronous PS
  • Pros/cons of the PS model versus Hadoop plus iterative parameter mixing (IPM)
  • Stale synchronous parallel (SSP) computation model
  • Data-parallel versus model-parallel algorithms
    • Data-parallel example: SGD on sharded data
    • Model-parallel example: Lasso, with updates that account for parameter dependencies and parameter importance (see the second sketch after this list)
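
To make the first few items concrete, here is a minimal single-process sketch (not taken from the slides) of a parameter server exposing get/put, a bounded-staleness (stale synchronous parallel) check on reads, and data-parallel SGD workers that each own one shard of the data. The names (ParameterServer, sgd_step), the least-squares objective, and the sequential simulation of the workers are all illustrative assumptions.

import numpy as np

class ParameterServer:
    """Holds the global parameter vector; workers read it with get() and
    send additive updates with put()."""
    def __init__(self, dim, n_workers, staleness=2):
        self.w = np.zeros(dim)
        self.staleness = staleness
        self.clock = [0] * n_workers   # per-worker iteration counters

    def get(self, worker_id):
        # Stale synchronous parallel (SSP) rule: a worker may read only if
        # it is at most `staleness` clocks ahead of the slowest worker;
        # otherwise it must wait (this toy raises instead of blocking).
        if self.clock[worker_id] - min(self.clock) > self.staleness:
            raise RuntimeError(f"worker {worker_id} must wait for stragglers")
        return self.w.copy()

    def put(self, worker_id, delta):
        # Apply the additive update and advance this worker's clock.
        self.w += delta
        self.clock[worker_id] += 1

def sgd_step(ps, worker_id, X_shard, y_shard, lr=0.1):
    """One data-parallel SGD step: pull the parameters, compute a local
    least-squares gradient on this worker's shard, push the update back."""
    w = ps.get(worker_id)
    grad = X_shard.T @ (X_shard @ w - y_shard) / len(y_shard)
    ps.put(worker_id, -lr * grad)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    w_true = rng.normal(size=5)
    y = X @ w_true

    shards = np.array_split(np.arange(100), 2)   # data-parallel: shard the rows
    ps = ParameterServer(dim=5, n_workers=2)
    for epoch in range(50):
        for wid in range(2):                     # simulated sequentially here
            sgd_step(ps, wid, X[shards[wid]], y[shards[wid]])
    print("distance from true weights:", np.linalg.norm(ps.w - w_true))

In this sketch, staleness=0 forces every worker to wait for the slowest one before reading (fully synchronous), while deleting the check in get() gives fully asynchronous updates; the SSP bound interpolates between the two.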
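
The model-parallel Lasso item can be sketched in the same spirit: partition the coordinates (not the data) across workers and let each worker run coordinate descent only on its own block. This is a toy illustration rather than the algorithm from the slides; soft_threshold, lasso_block_update, the round-robin schedule, and the problem sizes are made up for the example, and the comments indicate where a real scheduler would exploit parameter dependencies and parameter importance.

import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_block_update(X, y, w, block, lam):
    """One coordinate-descent pass over the coordinates in `block`, holding
    all other coordinates (owned by other workers) fixed."""
    n = len(y)
    r = y - X @ w                       # current residual
    for j in block:
        r += X[:, j] * w[j]             # remove coordinate j's contribution
        rho = X[:, j] @ r / n
        w[j] = soft_threshold(rho, lam) / (X[:, j] @ X[:, j] / n)
        r -= X[:, j] * w[j]             # put it back with the new value
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 200, 20
    X = rng.normal(size=(n, d))
    w_true = np.zeros(d)
    w_true[:3] = [2.0, -3.0, 1.5]       # sparse ground truth
    y = X @ w_true + 0.1 * rng.normal(size=n)

    blocks = np.array_split(np.arange(d), 4)   # model-parallel: shard coordinates
    w = np.zeros(d)
    for it in range(50):
        # A real model-parallel scheduler would choose which coordinates each
        # worker updates in a round: it would avoid co-updating strongly
        # correlated (dependent) coordinates and would prioritize coordinates
        # whose values are still changing a lot (parameter importance).
        for block in blocks:            # simulated sequentially here
            w = lasso_block_update(X, y, w, block, lam=0.1)
    print("recovered nonzero coordinates:", np.flatnonzero(np.round(w, 2)))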