10-601 Decision Trees
This is a lecture used in the Syllabus for Machine Learning 10-601 in Fall 2014.
Slides
- Ziv's lecture: slides in PDF (http://www.cs.cmu.edu/~zivbj/classF14/DT.pdf).
- William's lecture: slides in PowerPoint (http://www.cs.cmu.edu/~wcohen/10-601/decision-trees.pptx) and in PDF (http://www.cs.cmu.edu/~wcohen/10-601/decision-trees.pdf).
Readings
- Mitchell, Machine Learning, Chapter 3.
What You Should Know Afterward
- What a decision tree is, and how to classify an instance using a decision tree.
- What the canonical top-down algorithm is for learning a decision tree (a code sketch follows this list).
- What heuristics are used for choosing a decision-tree split.
- What entropy is, and what information gain is (definitions and a worked example follow this list).
- What reduced-error pruning is, and why it might improve classification performance (sketched at the end of this page).
- Some of the advantages and disadvantages of decision-tree learning in particular, and of eager learning in general, compared to K-NN learning.
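For reference, entropy and information gain have the standard definitions used in Mitchell, Chapter 3. For a set of examples S with class proportions p_c, and an attribute A whose value v selects the subset S_v:

    H(S) = -\sum_{c} p_c \log_2 p_c
    \qquad
    Gain(S, A) = H(S) - \sum_{v \in Values(A)} \frac{|S_v|}{|S|} \, H(S_v)

For example, a set of 14 examples with 9 positives and 5 negatives has H(S) = -(9/14) log2(9/14) - (5/14) log2(5/14) ≈ 0.940 bits; a pure set has entropy 0 and an even two-class split has entropy 1. The canonical split heuristic picks the attribute with the largest Gain(S, A).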
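The learning algorithm itself can be made concrete with a minimal Python sketch of top-down (ID3-style) induction with an information-gain split heuristic, plus classification by walking the tree from the root. The data layout and the helper names (entropy, information_gain, build_tree, classify) are this sketch's own choices, not code from the lecture.

    import math
    from collections import Counter

    def entropy(labels):
        """Shannon entropy of a list of class labels, in bits."""
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    def information_gain(rows, labels, attr):
        """Reduction in entropy from splitting on attribute index attr."""
        n = len(labels)
        remainder = 0.0
        for value in set(row[attr] for row in rows):
            subset = [lab for row, lab in zip(rows, labels) if row[attr] == value]
            remainder += len(subset) / n * entropy(subset)
        return entropy(labels) - remainder

    def build_tree(rows, labels, attrs):
        """Canonical top-down induction: greedily split on the
        highest-information-gain attribute until the node is pure
        or no attributes remain (then predict the majority class)."""
        if len(set(labels)) == 1:
            return labels[0]                                 # pure leaf
        if not attrs:
            return Counter(labels).most_common(1)[0][0]      # majority leaf
        best = max(attrs, key=lambda a: information_gain(rows, labels, a))
        tree = {"attr": best, "children": {},
                "default": Counter(labels).most_common(1)[0][0]}
        for value in set(row[best] for row in rows):
            idx = [i for i, row in enumerate(rows) if row[best] == value]
            tree["children"][value] = build_tree(
                [rows[i] for i in idx], [labels[i] for i in idx],
                [a for a in attrs if a != best])
        return tree

    def classify(tree, row):
        """Follow the branch matching the instance's value for each
        tested attribute until a leaf label is reached."""
        while isinstance(tree, dict):
            tree = tree["children"].get(row[tree["attr"]], tree["default"])
        return tree

A toy run, where the second attribute fully determines the label and therefore wins the information-gain comparison:

    rows   = [["sunny", "high"], ["sunny", "normal"], ["rain", "high"], ["rain", "normal"]]
    labels = ["no", "yes", "no", "yes"]
    tree = build_tree(rows, labels, attrs=[0, 1])
    print(classify(tree, ["sunny", "normal"]))   # prints "yes"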
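Reduced-error pruning can be sketched on the same representation: working bottom-up, replace a subtree with a majority-class leaf whenever that does not hurt accuracy on held-out validation data. This tends to improve classification performance because it undoes splits that fit noise in the training set. A hedged sketch, reusing classify and Counter from above; evaluating locally on the validation examples routed to each node is equivalent to checking whole-tree validation accuracy, since pruning a node only changes predictions for those examples:

    def reduced_error_prune(tree, rows, labels):
        """Reduced-error pruning against a held-out validation set
        (rows, labels): bottom-up, replace a subtree by its
        majority-class leaf unless that lowers accuracy on the
        validation examples reaching the subtree."""
        if not isinstance(tree, dict) or not labels:
            return tree
        # First prune each child, on the validation examples routed to it.
        for value in list(tree["children"]):
            idx = [i for i, r in enumerate(rows) if r[tree["attr"]] == value]
            tree["children"][value] = reduced_error_prune(
                tree["children"][value],
                [rows[i] for i in idx], [labels[i] for i in idx])
        # Then compare this subtree against a single majority-class leaf.
        subtree_correct = sum(classify(tree, r) == l for r, l in zip(rows, labels))
        majority = Counter(labels).most_common(1)[0][0]
        leaf_correct = sum(l == majority for l in labels)
        return majority if leaf_correct >= subtree_correct else tree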