European research letter: Cross‐language system evaluation: The CLEF campaigns
- 1 January 2001
- Research article
- Published by Wiley in Journal of the American Society for Information Science and Technology
- Vol. 52 (12), 1067-1072
- https://doi.org/10.1002/asi.1164
Abstract
The goals of the CLEF (Cross‐Language Evaluation Forum) series of evaluation campaigns for information retrieval systems operating on European languages are described. The difficulties of organizing an activity which aims at an objective evaluation of systems running on and over a number of different languages are examined. The discussion includes an analysis of the first results and proposals for possible developments in the future.