10-601 Neural networks and Deep Belief Networks

This is a lecture used in the [[Syllabus for Machine Learning 10-601B in Spring 2016]].
  
 
=== Slides ===

* William's slides: [http://www.cs.cmu.edu/~wcohen/10-601/nnets.pptx in Powerpoint], [http://www.cs.cmu.edu/~wcohen/10-601/nnets.pdf in PDF]
 
=== Readings ===

* Mitchell: Ch. 4
  
 
=== What You Should Know Afterward ===

* What functions can be expressed with multi-layer networks that a single layer cannot express
* The backpropagation algorithm, and what loss is associated with it (see the sketch after this list)
* In outline, how deep neural networks are trained
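
To make the first two bullets concrete, here is a minimal backpropagation sketch. It is not taken from the slides or readings; it assumes a small 2-4-1 network with sigmoid units and squared-error loss, trained by gradient descent on XOR, a function that a single-layer network cannot represent. The architecture, learning rate, and iteration count are illustrative choices only.

<pre>
import numpy as np

rng = np.random.default_rng(0)

# XOR is not linearly separable, so a network with no hidden layer cannot fit it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer with 4 sigmoid units, one sigmoid output unit (illustrative sizes).
W1 = rng.normal(scale=1.0, size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(scale=1.0, size=(4, 1)); b2 = np.zeros((1, 1))

lr = 0.5
for epoch in range(20000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)   # hidden activations
    o = sigmoid(h @ W2 + b2)   # network output

    # Squared-error loss L = 0.5 * sum((o - y)^2); backpropagate its gradient.
    delta_o = (o - y) * o * (1 - o)           # error term at the output layer
    delta_h = (delta_o @ W2.T) * h * (1 - h)  # error term at the hidden layer

    # Gradient-descent updates.
    W2 -= lr * h.T @ delta_o
    b2 -= lr * delta_o.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ delta_h
    b1 -= lr * delta_h.sum(axis=0, keepdims=True)

print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2).ravel())
# Typically converges to approximately [0, 1, 1, 0].
</pre>

The error terms delta_o and delta_h are the quantities that backpropagation passes backward through the layers; with a cross-entropy loss and sigmoid output, the output-layer error term simplifies to (o - y).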