10-601 Introduction to Probability
From Cohen Courses
Revision as of 13:44, 17 September 2014
This is a lecture used in the Syllabus for Machine Learning 10-601 in Fall 2014.
Slides
- [http://www.cs.cmu.edu/~wcohen/10-601/prob-tour+bayes.pptx Slides in Powerpoint], [http://www.cs.cmu.edu/~wcohen/10-601/prob-tour+bayes.pdf in PDF] - William.
- [http://www.cs.cmu.edu/~zivbj/classF14/introduction.pdf Slides in PDF] - Ziv.
Readings
- Mitchell, Chapters 1 and 2; Sections 6.1-6.3.
What You Should Know Afterward
You should know the definitions of the following, and be able to use them to solve problems:
- Random variables and events
- The Axioms of Probability
- Independence, binomials, multinomials
- Expectation and variance of a distribution
- Conditional probabilities
- Bayes Rule
- MLEs, smoothing, and MAP estimates
- The joint distribution
- How to do inference using the joint distribution
- Density estimation and classification
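Several of the items above can be illustrated with a small worked example. The following is a minimal Python sketch (all data and probabilities in it are hypothetical, chosen only for illustration): it computes the MLE and an add-one-smoothed MAP estimate for a binomial, applies Bayes rule to a made-up diagnostic test, and does inference by marginalizing and conditioning a small joint distribution.

```python
from fractions import Fraction

# Hypothetical coin-flip data (1 = heads, 0 = tails); 7 heads out of 10.
flips = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
n_heads, n = sum(flips), len(flips)

# MLE for P(heads): count / total, the value that maximizes the
# likelihood of the observed binomial data.
p_mle = Fraction(n_heads, n)

# MAP estimate with add-one (Laplace) smoothing, equivalent to a
# Beta(2, 2) prior; it pulls small-sample estimates toward 1/2.
p_map = Fraction(n_heads + 1, n + 2)

# Bayes rule: P(A | B) = P(B | A) P(A) / P(B), on a hypothetical
# diagnostic test (all numbers made up).
p_disease = 0.01          # prior P(D)
p_pos_given_d = 0.90      # P(+ | D)
p_pos_given_not_d = 0.10  # P(+ | not D)
p_pos = p_pos_given_d * p_disease + p_pos_given_not_d * (1 - p_disease)
p_d_given_pos = p_pos_given_d * p_disease / p_pos

# Inference using a joint distribution over two binary variables:
# marginalize to get P(wet), then condition to get P(rain | wet).
joint = {('rain', 'wet'): 0.30, ('rain', 'dry'): 0.05,
         ('no_rain', 'wet'): 0.10, ('no_rain', 'dry'): 0.55}
p_wet = sum(p for (r, w), p in joint.items() if w == 'wet')
p_rain_given_wet = joint[('rain', 'wet')] / p_wet
```

Note how the smoothed MAP estimate (8/12) sits between the MLE (7/10) and the prior mean (1/2), and how a positive test still leaves the posterior P(D | +) small because the prior P(D) is small.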