BinLu et al. ACL2011
Citation
Joint Bilingual Sentiment Classification with Unlabeled Parallel Corpora, Bin Lu, Chenhao Tan, Claire Cardie and Benjamin K. Tsou, ACL 2011
Online version
Joint Bilingual Sentiment Classification with Unlabeled Parallel Corpora
Summary
This paper addresses the sentiment analysis problem at the sentence level for multiple languages. The authors propose to leverage unlabeled parallel corpora to learn a MaxEnt-based EM model that considers both languages simultaneously, under the assumption that the sentiment labels of parallel sentences should be similar.
They experimented on two datasets: MPQA (Multi-Perspective Question Answering) and NTCIR-6 Opinion.
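A minimal sketch of the joint idea described above, assuming two MaxEnt (logistic regression) classifiers and a hard-EM simplification. The function name, the confidence-weighting scheme, and the scikit-learn interface are assumptions of this sketch, not the authors' implementation:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def joint_bilingual_em(Xl_en, y_en, Xl_zh, y_zh, Xu_en, Xu_zh, n_iter=10):
    """Xl_*, y_*: labeled features/labels per language (dense arrays).
    Xu_en[i] and Xu_zh[i] are the two sides of one unlabeled parallel sentence pair.
    Both languages are assumed to share the same label set (e.g. positive/negative)."""
    # MaxEnt classifiers, i.e. multinomial logistic regression.
    clf_en = LogisticRegression(max_iter=1000).fit(Xl_en, y_en)
    clf_zh = LogisticRegression(max_iter=1000).fit(Xl_zh, y_zh)
    classes = clf_en.classes_

    for _ in range(n_iter):
        # E-step: posterior over the shared label of each parallel pair,
        # combining both monolingual models (parallel sentences share a label).
        p = clf_en.predict_proba(Xu_en) * clf_zh.predict_proba(Xu_zh)
        p /= p.sum(axis=1, keepdims=True)
        pseudo = classes[p.argmax(axis=1)]   # hard assignment (simplification)
        conf = p.max(axis=1)                 # confidence used as sample weight

        # M-step: retrain each classifier on labeled + pseudo-labeled parallel data.
        clf_en.fit(np.vstack([Xl_en, Xu_en]),
                   np.concatenate([y_en, pseudo]),
                   sample_weight=np.concatenate([np.ones(len(y_en)), conf]))
        clf_zh.fit(np.vstack([Xl_zh, Xu_zh]),
                   np.concatenate([y_zh, pseudo]),
                   sample_weight=np.concatenate([np.ones(len(y_zh)), conf]))
    return clf_en, clf_zh
```

The paper's model performs soft EM over the joint labels of the two languages; the sketch above hard-assigns the shared label per pair only for brevity.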
Evaluation
They evaluate their methods by asking the following four questions:
- Does NF find meaningful neighborhoods?
- How close is approximate NF to exact NF?
- Can AD detect injected anomalies?
- How much time do these methods take to run on graphs of varying sizes?
Discussion
This paper poses two important social problems related to bipartite social graphs and explains how those problems can be solved efficiently using random walks.
They also claim that the neighborhoods over nodes can represent personalized clusters depending on different perspectives.
During the presentation, one audience member asked whether the anomaly detection in this paper is similar to the betweenness of edges defined in Kleinberg's text, as discussed in the Class Meeting for 10-802 01/26/2010. I think they are similar. The textbook proposes detecting edges with high betweenness and using them to partition the graph. This paper first creates neighborhood partitions based on random-walk probabilities, which, as a by-product, yields the nodes and edges with high betweenness values.
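As a side note to that comparison, here is a tiny illustration of the textbook idea (repeatedly remove the highest-betweenness edges to partition a graph, i.e. the Girvan-Newman method). The networkx-based code and the toy graph are assumptions for illustration only, not part of the reviewed paper:

```python
import networkx as nx
from networkx.algorithms.community import girvan_newman

# Toy graph purely for illustration (Zachary's karate club ships with networkx).
G = nx.karate_club_graph()

# Edges with high betweenness are the natural "bridges" between communities.
betweenness = nx.edge_betweenness_centrality(G)
bridge = max(betweenness, key=betweenness.get)
print("highest-betweenness edge:", bridge)

# Girvan-Newman removes such edges one by one; the first split gives two partitions.
first_split = next(girvan_newman(G))
print([sorted(c) for c in first_split])
```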
Related papers
In the area of multilingual sentiment analysis, related works include:
- Paper: Learning multilingual subjective language via cross-lingual projections [http://www.cse.unt.edu/~rada/papers/mihalcea.acl07.pdf]
- Paper: Multilingual subjectivity: Are more languages better? [http://www.aclweb.org/anthology/C10-1004]
- Paper: Cross-language text classification using structural correspondence learning [http://www.aclweb.org/anthology/P10-1114]
In the area of semi-supervised learning, related papers include:
- Paper: Combining labeled and unlabeled data with co-training [http://l2r.cs.uiuc.edu/~danr/Teaching/CS598-05/Papers/cotraining.pdf]
- Paper: Text classification from labeled and unlabeled documents using EM [http://www.kamalnigam.com/papers/emcat-mlj99.pdf]
Study plan
- Article: Expectation-maximization algorithm
- Article: Maximum Entropy model
- Paper: Combining labeled and unlabeled data with co-training [http://l2r.cs.uiuc.edu/~danr/Teaching/CS598-05/Papers/cotraining.pdf]
- Paper: Learning multilingual subjective language via cross-lingual projections [http://www.cse.unt.edu/~rada/papers/mihalcea.acl07.pdf]