The importance of reusable, large-scale standard test
collections in Information Access research has been widely
recognized. The success of TREC, CLEF, and NTCIR has clearly
demonstrated the value of an evaluation workshop that
facilitates research by providing data and a common
forum for comparing models and techniques. The Forum for
Information Retrieval Evaluation (FIRE) follows in the
footsteps of TREC, CLEF, and NTCIR with the following aims:
- Encourage research in South Asian language Information
Access technologies by providing reusable large-scale test
collections for ILIR experiments.
- Explore new Information Retrieval / Access tasks that
arise as our information needs evolve and new needs emerge.
- Provide a common evaluation infrastructure for comparing
the performance of different IR systems.
- Investigate evaluation methods for Information Access
techniques and methods for constructing a reusable
large-scale data set for ILIR experiments.
Doug Oard, University of Maryland, USA.
Jaap Kamps, University of Amsterdam, The Netherlands.
Jacques Savoy, Université de Neuchâtel, Switzerland.
Kevyn Collins-Thompson, University of Michigan, USA.
Marie-Francine Moens, Katholieke Universiteit Leuven, Belgium.
Nicola Ferro, University of Padova, Italy.
Paulo Quaresma, Universidade de Évora, Portugal.
Paolo Rosso, Universidad Politécnica de Valencia, Spain.
William Webber, William Webber Consulting, North Melbourne, Australia.
Copyright © 2014 IRSI. All rights reserved.
This page was last updated 11/01/2014 10:13:54 IST.