10-601 Introduction to Probability
From Cohen Courses
This is a lecture used in the Syllabus for Machine Learning 10-601.
Slides
Readings
- None
What You Should Know Afterward
You should know the definitions of the following (a small illustrative sketch tying several of these together appears after the list):
- Random variables and events
- The Axioms of Probability
- Independence, binomials, multinomials
- Conditional probabilities
- Bayes Rule
- MLEs, smoothing, and MAP estimates
- The joint distribution
- Inference
- Density estimation and classification
- Naïve Bayes density estimators and classifiers
- Conditional independence
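
As a rough sketch (not taken from the lecture slides), the toy Python snippet below ties several of these ideas together: class priors and per-feature conditional probabilities estimated by maximum likelihood, Laplace (add-one) smoothing, and Bayes Rule applied under the Naive Bayes conditional-independence assumption. The dataset, variable names, and the choice of binary features are all illustrative assumptions.

    from collections import Counter, defaultdict

    # Toy training data (illustrative only): each example is
    # (binary feature vector, class label).
    data = [
        ([1, 0, 1], "spam"),
        ([1, 1, 1], "spam"),
        ([0, 0, 1], "ham"),
        ([0, 1, 0], "ham"),
    ]
    n_features = len(data[0][0])
    alpha = 1.0  # Laplace (add-one) smoothing; alpha = 0 gives the plain MLE

    # Class priors P(y) estimated by maximum likelihood (relative frequencies).
    label_counts = Counter(y for _, y in data)
    priors = {y: n / len(data) for y, n in label_counts.items()}

    # Smoothed estimates of P(x_j = 1 | y) for each class and feature.
    ones = defaultdict(lambda: [0.0] * n_features)
    for x, y in data:
        for j, v in enumerate(x):
            ones[y][j] += v
    cond = {y: [(ones[y][j] + alpha) / (label_counts[y] + 2 * alpha)
                for j in range(n_features)]
            for y in label_counts}

    def posterior(x):
        """Bayes Rule under the Naive Bayes conditional-independence assumption."""
        joint = {}
        for y in priors:
            p = priors[y]
            for j, v in enumerate(x):
                p *= cond[y][j] if v else 1.0 - cond[y][j]
            joint[y] = p                    # P(x, y) under the model
        evidence = sum(joint.values())      # P(x)
        return {y: p / evidence for y, p in joint.items()}

    print(posterior([1, 0, 1]))  # e.g. {'spam': ~0.82, 'ham': ~0.18}

With alpha = 0 the estimates are pure MLEs and can assign zero probability to feature values never seen with a class; increasing alpha pulls the conditional estimates toward 1/2, which is the smoothing / MAP-style flavor of estimation listed above.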