Pooling-based continuous evaluation of information retrieval systems

Tonon, Alberto, Demartini, Gianluca and Cudre-Mauroux, Philippe (2015) Pooling-based continuous evaluation of information retrieval systems. Information Retrieval Journal, 18 5: 445-472. doi:10.1007/s10791-015-9266-y


Author Tonon, Alberto
Demartini, Gianluca
Cudre-Mauroux, Philippe
Title Pooling-based continuous evaluation of information retrieval systems
Journal name Information Retrieval Journal
ISSN 1386-4564 (print)
1573-7659 (electronic)
Publication date 2015-10-01
Sub-type Article (original research)
DOI 10.1007/s10791-015-9266-y
Open Access Status Not yet assessed
Volume 18
Issue 5
Start page 445
End page 472
Total pages 28
Place of publication Heidelberg, Germany
Publisher Kluwer Academic Publishers
Language eng
Formatted abstract
The dominant approach to evaluating the effectiveness of information retrieval (IR) systems is by means of reusable test collections built following the Cranfield paradigm. In this paper, we propose a new IR evaluation methodology based on pooled test collections and on the continuous use of either crowdsourcing or professional editors to obtain relevance judgements. Instead of building a static collection for a finite set of systems known a priori, we propose an IR evaluation paradigm in which retrieval approaches are evaluated iteratively on the same collection. Each new retrieval technique obtains its missing relevance judgements and hence contributes to augmenting the overall set of relevance judgements of the collection. We also propose two metrics, Fairness Score and opportunistic number of relevant documents, which we then use to define new pooling strategies. The goal of this work is to study the behavior of standard IR metrics, IR system rankings, and several pooling techniques in a continuous evaluation context by comparing continuous and non-continuous evaluation results on classic test collections. We use both standard and crowdsourced relevance judgements, and we actually run a continuous evaluation campaign over several existing IR systems.
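The continuous evaluation loop the abstract describes can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: `pool_top_k`, `evaluate_continuously`, the `judge` callback, and the precision@k scoring are all hypothetical names standing in for the pooling, crowdsourcing/editor, and metric components of the paper.

```python
# Hypothetical sketch: a new system's run is pooled, only the documents
# not yet judged are sent out for judgement (e.g. to crowd workers or
# editors), and the shared qrels grow as a side effect.

def pool_top_k(run, k):
    """Depth-k pooling: collect the top-k documents per query."""
    return {q: set(ranking[:k]) for q, ranking in run.items()}

def evaluate_continuously(run, qrels, judge, k=10):
    """Judge only unjudged pooled documents, then score the run.

    run   : {query_id: [doc_id, ...]} ranked results of the new system
    qrels : {query_id: {doc_id: relevance}} shared, growing judgements
    judge : callable (query_id, doc_id) -> relevance (crowd/editor stub)
    """
    for q, pooled in pool_top_k(run, k).items():
        for doc in pooled - set(qrels.get(q, {})):
            qrels.setdefault(q, {})[doc] = judge(q, doc)
    # Score the run (precision@k here, for illustration) against the
    # now-augmented judgements; any standard IR metric could be used.
    return {
        q: sum(1 for d in ranking[:k] if qrels[q].get(d, 0) > 0) / k
        for q, ranking in run.items()
    }
```

The key design point mirrored from the paper is that `qrels` is mutated in place: each newly evaluated system leaves the collection with more judgements than it found, so later systems face fewer unjudged documents.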
Keyword Continuous evaluation
Crowdsourcing
Information retrieval evaluation
Pooling techniques
Q-Index Code C1
Q-Index Status Provisional Code
Institutional Status Non-UQ

Document type: Journal Article
Sub-type: Article (original research)
Collection: School of Information Technology and Electrical Engineering Publications
 
Citation counts: Cited 6 times in Thomson Reuters Web of Science
Cited 11 times in Scopus