The importance of reusable, large-scale standard test
collections in Information Access research has been widely
recognized. The success of TREC, CLEF, and NTCIR has clearly
established the importance of an evaluation workshop that
facilitates research by providing the data and a common
forum for comparing models and techniques. The Forum for
Information Retrieval Evaluation (FIRE) follows in the
footsteps of TREC, CLEF, and NTCIR with the following aims:
- Encourage research in South Asian language Information
Access technologies by providing reusable large-scale test
collections for ILIR experiments.
- Explore new Information Retrieval / Access tasks that
arise as our information needs evolve and new needs emerge.
- Provide a common evaluation infrastructure for comparing
the performance of different IR systems.
- Investigate evaluation methods for Information Access
techniques and methods for constructing a reusable
large-scale data set for ILIR experiments.
Paul Clough, The University of Sheffield, UK.
Kareem Darwish, Qatar Computing Research Institute (QCRI), Qatar.
Nicola Ferro, University of Padova, Italy.
Marti Hearst, University of California, Berkeley.
Jaap Kamps, University of Amsterdam, The Netherlands.
Henning Müller, University Hospitals and University of Geneva, Switzerland.
Copyright © 2014 IRSI All rights reserved.