The importance of reusable, large-scale standard test
collections in Information Access research has been widely
recognized. The success of TREC, CLEF, and NTCIR has clearly
demonstrated the value of an evaluation workshop that
facilitates research by providing data and a common
forum for comparing models and techniques. The Forum for
Information Retrieval Evaluation (FIRE) follows in the
footsteps of TREC, CLEF, and NTCIR with the following aims:
- Encourage research in South Asian language Information
Access technologies by providing reusable large-scale test
collections for ILIR experiments.
- Explore new Information Retrieval / Access tasks that
arise as our information needs evolve and new needs emerge.
- Provide a common evaluation infrastructure for comparing
the performance of different IR systems.
- Investigate evaluation methods for Information Access
techniques, and methods for constructing reusable
large-scale data sets for ILIR experiments.
Paul Clough, The University of Sheffield, UK.
Julio Gonzalo, UNED, Spain.
Gareth Jones, Dublin City University, Ireland.
Jaap Kamps, University of Amsterdam, The Netherlands.
Henning Müller, University Hospitals and University of Geneva, Switzerland.
Kareem Darwish, Qatar Computing Research Institute (QCRI), Qatar.
Charlie Clarke, University of Waterloo, Canada.
Nicola Ferro, University of Padova, Italy.
Copyright © 2014 IRSI All rights reserved.
This site was last updated 2014-11-03.