10-601 Neural networks and Deep Belief Networks
From Cohen Courses
Revision as of 09:21, 24 September 2014
This is a lecture used in the Syllabus for Machine Learning 10-601 in Fall 2014.
Slides
- Ziv's lecture: [http://www.cs.cmu.edu/~zivbj/classF14/newNN.pdf Slides in pdf]
- William's slides: [http://www.cs.cmu.edu/~wcohen/10-601/nnets.pptx in Powerpoint], [http://www.cs.cmu.edu/~wcohen/10-601/nnets.pdf in PDF]
Readings
- Mitchell: Ch. 4, or Bishop: Ch. 5
What You Should Know Afterward
- Which functions a multi-layer network can express that a single-layer network cannot (e.g., XOR)
- The backpropagation algorithm, and the loss function it minimizes
- In outline, how deep neural networks are trained
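The points above can be made concrete with a small sketch. The following NumPy code (not from the lecture; the layer width, learning rate, and iteration count are illustrative choices) trains a one-hidden-layer sigmoid network with backpropagation on XOR, a function no single-layer network can express, using the squared-error loss as in Mitchell Ch. 4:

```python
import numpy as np

# XOR: expressible with one hidden layer, but not by a single-layer network.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # input -> hidden weights
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(10000):
    # Forward pass through both layers.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass for squared-error loss E = 0.5 * sum((out - y)**2):
    # each delta is dE/d(net input), pushed back through the sigmoid derivative.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent weight updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(out.ravel())  # outputs close to the XOR targets 0, 1, 1, 0
```

The same loop, applied layer by layer from the output backward, is what "in outline, how deep neural networks are trained" refers to; deep networks add more hidden layers and typically swap in other losses and activations.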