Class Meeting for 10-707 10/11/2010
From Cohen Courses
This is one of the class meetings on the schedule for the course Information Extraction 10-707 in Fall 2010.
Long-range interactions
Required Readings
- Collective segmentation and labeling of distant entities in information extraction, by C. Sutton, A. McCallum. In ICML workshop on Statistical Relational Learning, 2004.
- An effective two-stage model for exploiting non-local dependencies in named entity recognition, by Vijay Krishnan, Christopher D. Manning. In ACL-44: Proceedings of the 21st International Conference on Computational Linguistics and the 44th Annual Meeting of the Association for Computational Linguistics, 2006. (A sketch of the non-local majority-vote idea this model builds on appears after this list.)
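The two-stage model above exploits a simple kind of non-local evidence: the labels that a first-stage tagger assigned to other occurrences of the same token. Below is a minimal Python sketch of that idea, assuming a hypothetical token_majority_features helper; in the paper such majority votes are supplied as extra features to a second tagging pass, not used to overwrite labels directly.

 from collections import Counter, defaultdict

 def token_majority_features(tokens, first_stage_labels):
     """For each token, return the label most often assigned to occurrences
     of the same token by the first-stage tagger. This is the flavor of
     non-local feature a second-stage model can condition on; entity-level
     and corpus-level variants are omitted here."""
     votes = defaultdict(Counter)
     for tok, lab in zip(tokens, first_stage_labels):
         votes[tok][lab] += 1
     return [votes[tok].most_common(1)[0][0] for tok in tokens]

 # Toy example: the "Clinton" tagged as ORG gets a majority-vote PER feature.
 tokens = ["Clinton", "said", "that", "Clinton", "Corp", "hired", "Clinton"]
 labels = ["B-PER", "O", "O", "B-ORG", "I-ORG", "O", "B-PER"]
 print(token_majority_features(tokens, labels))
 # -> ['B-PER', 'O', 'O', 'B-PER', 'I-ORG', 'O', 'B-PER']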
Optional Readings
- Ranking Algorithms for Named-Entity Extraction: Boosting and the Voted Perceptron, Collins, ACL 2002. Compares the voted perceptron and boosting for reranking.
- Semi-Markov Conditional Random Fields for Information Extraction, Sarawagi & Cohen, NIPS 2004. The semi-Markov CRF I mentioned briefly in class; a sketch of segment-level decoding appears after this list.
- A Hybrid Markov/Semi-Markov Conditional Random Field for Sequence Segmentation, Andrew, EMNLP 2006. Followup work on semi-Markov models.
- Improving the Scalability of Semi-Markov Conditional Random Fields for Named Entity Recognition, Okanohara et al, ACL 2006. Followup work on semi-Markov models.
- Integer Linear Programming Inference for Conditional Random Fields, Roth and Yih, ICML 2005.
- Collective Information Extraction with Relational Markov Networks, Bunescu & Mooney, ACL 2004. A very general approach to long-distance dependencies.
- Feature-Rich Part-of-Speech Tagging with a Cyclic Dependency Network, Toutanova et al, NAACL 2003. An approach based on "dependency nets", another inference scheme that allows long-range dependencies to be modeled easily.
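For the semi-Markov CRFs mentioned above, the key difference from a token-level CRF is that decoding scores whole candidate segments, which is what lets features inspect an entire entity span at once. The following is a minimal Python sketch of the segment-level Viterbi recurrence; the score function, label set, and toy example are illustrative assumptions, not any paper's actual feature model.

 import math

 def semimarkov_viterbi(n_tokens, labels, score, max_len=6):
     """Best segmentation of a sequence under a semi-Markov model.
     score(i, j, y, prev) is the log-potential of labeling tokens[i:j]
     as y when the previous segment's label is prev (None at the start).
     Returns a list of (start, end, label) segments."""
     best = [{} for _ in range(n_tokens + 1)]  # best[j][y]: best score of tokens[:j] ending in label y
     back = [{} for _ in range(n_tokens + 1)]  # back[j][y]: (segment start, previous label)
     best[0][None] = 0.0
     for j in range(1, n_tokens + 1):
         for i in range(max(0, j - max_len), j):
             for prev, prev_score in best[i].items():
                 for y in labels:
                     s = prev_score + score(i, j, y, prev)
                     if s > best[j].get(y, -math.inf):
                         best[j][y] = s
                         back[j][y] = (i, prev)
     # Trace the best path back from the end of the sequence.
     y = max(best[n_tokens], key=best[n_tokens].get)
     j, segments = n_tokens, []
     while j > 0:
         i, prev = back[j][y]
         segments.append((i, j, y))
         j, y = i, prev
     return list(reversed(segments))

 # Toy scorer that rewards tagging the span "New York" as one LOC segment.
 tokens = ["She", "visited", "New", "York", "yesterday"]
 def toy_score(i, j, y, prev):
     if (i, j, y) == (2, 4, "LOC"):
         return 2.0
     if y == "O" and j - i == 1:
         return 0.5
     return -1.0
 print(semimarkov_viterbi(len(tokens), ["O", "LOC"], toy_score))
 # -> [(0, 1, 'O'), (1, 2, 'O'), (2, 4, 'LOC'), (4, 5, 'O')]

Capping segments at max_len is the usual way to keep this tractable: decoding as written is O(n · max_len · |labels|^2).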