10-601 Neural networks and Deep Belief Networks

From Cohen Courses

This is a lecture used in the [[Syllabus for Machine Learning 10-601B in Spring 2016]]
 
=== Slides ===

* William's slides: [http://www.cs.cmu.edu/~wcohen/10-601/nnets.pptx in Powerpoint], [http://www.cs.cmu.edu/~wcohen/10-601/nnets.pdf in PDF]
 
=== Readings ===

* Mitchell: Ch. 4; Murphy: Ch. 16.5
 
=== What You Should Know Afterward ===

* What functions can be expressed with multi-layer networks that a single layer cannot express
* The backpropagation algorithm, and what loss is associated with it (a small worked sketch follows this list)
* In outline, how deep neural networks are trained
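
The bullets above only name the topics. As a concrete illustration, below is a minimal NumPy sketch, not taken from the course materials, of backpropagation with squared loss on a small multi-layer sigmoid network. The training task is XOR, the standard example of a function that a single-layer network cannot express; the network size, learning rate, and iteration count are illustrative assumptions rather than values from the lecture.

<syntaxhighlight lang="python">
# Illustrative sketch only (not course code): backpropagation with squared loss
# on a 2-4-1 sigmoid network, trained on XOR -- a function that no single-layer
# network with a linear decision boundary can represent.
import numpy as np

rng = np.random.default_rng(0)

# XOR training set: inputs and target outputs
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Parameters: 2 inputs -> 4 hidden sigmoid units -> 1 sigmoid output
W1 = rng.normal(scale=1.0, size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(scale=1.0, size=(4, 1)); b2 = np.zeros((1, 1))

lr = 0.5  # learning rate (an arbitrary illustrative choice)
for step in range(20000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)      # hidden activations, shape (4, 4)
    out = sigmoid(h @ W2 + b2)    # network outputs, shape (4, 1)

    # Squared loss: L = 0.5 * sum_i (out_i - y_i)^2
    loss = 0.5 * np.sum((out - y) ** 2)

    # Backward pass (chain rule, layer by layer).
    # sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z)), i.e. out * (1 - out)
    d_out = (out - y) * out * (1 - out)      # dL/d(output pre-activation)
    d_W2 = h.T @ d_out
    d_b2 = d_out.sum(axis=0, keepdims=True)

    d_h = (d_out @ W2.T) * h * (1 - h)       # dL/d(hidden pre-activation)
    d_W1 = X.T @ d_h
    d_b1 = d_h.sum(axis=0, keepdims=True)

    # Gradient-descent step
    W1 -= lr * d_W1; b1 -= lr * d_b1
    W2 -= lr * d_W2; b2 -= lr * d_b2

print("final squared loss:", loss)
print("predictions:", out.ravel().round(3))  # should move toward [0, 1, 1, 0]
</syntaxhighlight>

Deep networks are trained with the same layer-by-layer backward recursion, applied through more layers and usually with different activation functions, losses, and optimizers than this toy example.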
 
