Publication

A Crowdsourcing Approach to Evaluate the Quality of Query-based Extractive Text Summaries

Neslihan Iskender, Aleksandra Gabryszak, Tim Polzehl, Leonhard Hennig, Sebastian Möller

In: 2019 Eleventh International Conference on Quality of Multimedia Experience (QoMEX-2019), June 5-7, Berlin, Germany, pages 1-3, IEEE, June 2019.

Abstract

High cost and time consumption are concurrent barriers to research on and application of automated summarization. In order to explore options to overcome these barriers, we analyze the feasibility and appropriateness of micro-task crowdsourcing for evaluating different summary quality characteristics and report on ongoing work on the crowdsourced evaluation of query-based extractive text summaries. To do so, we assess and evaluate a number of linguistic quality factors such as grammaticality, non-redundancy, referential clarity, focus, and structure & coherence. Our first results imply that referential clarity, focus, and structure & coherence are the main factors affecting the summary quality perceived by crowdworkers. Further, we compare these results with an initial set of expert annotations that is currently being collected, as well as with an initial set of automatic ROUGE quality scores for summary evaluation. Preliminary results show that ROUGE does not correlate with the linguistic quality factors, regardless of whether they are assessed by crowdworkers or experts. Further, crowd and expert ratings show the highest degree of correlation when assessing low-quality summaries; assessments increasingly diverge when attributing high-quality judgments.
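The kind of comparison described above can be reproduced with a straightforward correlation analysis between automatic scores and human ratings. The sketch below is not the authors' code; it assumes the `rouge-score` and `scipy` Python packages and uses hypothetical summaries and crowd ratings purely for illustration.

```python
# Minimal sketch: correlating ROUGE-L F1 with a crowd-rated quality factor.
# Assumes the `rouge-score` and `scipy` packages; all data is hypothetical.
from rouge_score import rouge_scorer
from scipy.stats import spearmanr

# Hypothetical reference/summary pairs.
references = [
    "The committee approved the budget after a lengthy debate.",
    "Heavy rainfall caused flooding in several coastal towns.",
    "The new policy reduces tuition fees for part-time students.",
    "Researchers identified a gene linked to early-onset diabetes.",
]
summaries = [
    "The budget was approved by the committee following debate.",
    "Flooding hit coastal towns after heavy rain.",
    "Part-time students pay lower tuition under the new policy.",
    "A gene tied to early diabetes was found by researchers.",
]
# Hypothetical mean crowd ratings for one factor, e.g. "focus" (1-5 scale).
crowd_focus_ratings = [4.2, 3.1, 4.7, 3.8]

scorer = rouge_scorer.RougeScorer(["rouge1", "rougeL"], use_stemmer=True)
rouge_l_f1 = [
    scorer.score(ref, summ)["rougeL"].fmeasure
    for ref, summ in zip(references, summaries)
]

# Spearman rank correlation between ROUGE-L F1 and the crowd quality factor;
# the abstract reports that such correlations turn out to be low in practice.
rho, p_value = spearmanr(rouge_l_f1, crowd_focus_ratings)
print(f"Spearman rho={rho:.2f}, p={p_value:.3f}")
```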

German Research Center for Artificial Intelligence