SUMMARY: Session O24-EW Evaluation of Information Retrieval

Title: Tagging Heterogeneous Evaluation Corpora for Opinionated Tasks
Authors: L. Ku, Y. Liang, H. Chen
Abstract: Opinion retrieval aims to determine whether a document is positive, neutral, or negative toward a given topic. Opinion extraction further identifies the supportive and non-supportive evidence within a document. Evaluating the performance of technologies for these opinionated tasks requires a suitable corpus. This paper defines the annotations for opinionated materials. Heterogeneous experimental materials are annotated, and the agreements among annotators are analyzed. How humans monitor the opinions of a document as a whole is also examined. The resulting corpus can be employed for opinion extraction, opinion summarization, opinion tracking, and opinionated question answering.
Keywords:
Full paper: Tagging Heterogeneous Evaluation Corpora for Opinionated Tasks
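
The abstract mentions analyzing agreement among annotators over polarity annotations, but the summary does not reproduce the paper's measure. As a minimal illustrative sketch only, the snippet below computes Cohen's kappa, one common inter-annotator agreement statistic, over hypothetical positive/neutral/negative labels; the function name and sample labels are assumptions for illustration, not the authors' actual procedure or data.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators labeling the same items,
    e.g. sentence-level "positive" / "neutral" / "negative" judgments."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)

    # Observed agreement: fraction of items both annotators labeled identically.
    p_observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n

    # Chance agreement, estimated from each annotator's label distribution.
    count_a = Counter(labels_a)
    count_b = Counter(labels_b)
    p_expected = sum(
        (count_a[c] / n) * (count_b[c] / n)
        for c in set(count_a) | set(count_b)
    )

    # Kappa rescales observed agreement after removing chance agreement.
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical polarity judgments from two annotators (illustrative only).
annotator_1 = ["positive", "negative", "neutral", "positive", "negative"]
annotator_2 = ["positive", "negative", "positive", "positive", "negative"]
print(f"kappa = {cohens_kappa(annotator_1, annotator_2):.2f}")  # kappa = 0.67
```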