Objective


The importance of reusable, large-scale standard test collections in Information Access research has been widely recognized. The success of TREC, CLEF, and NTCIR has clearly established the value of evaluation workshops that facilitate research by providing test data and a common forum for comparing models and techniques. The Forum for Information Retrieval Evaluation (FIRE) follows in the footsteps of TREC, CLEF, and NTCIR with the following aims:

  • Encourage research in South Asian language Information Access technologies by providing reusable, large-scale test collections for Indian Language Information Retrieval (ILIR) experiments.
  • Explore new Information Retrieval / Access tasks that arise as information needs evolve and new needs emerge.
  • Provide a common evaluation infrastructure for comparing the performance of different IR systems (a minimal sketch of such an evaluation step follows this list).
  • Investigate evaluation methods for Information Access techniques, as well as methods for constructing reusable, large-scale data sets for ILIR experiments.
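
As an illustration of the kind of batch evaluation such an infrastructure supports, the following is a minimal Python sketch that scores a system run against relevance judgements using Mean Average Precision (MAP). The data layout and function names are illustrative assumptions, not FIRE's official tooling.

    # Illustrative sketch only: scoring a run against relevance judgements with MAP.
    # The topic ids, document ids, and helper names below are made-up examples.

    def average_precision(ranked_docs, relevant):
        """Average Precision for one topic: mean of precision@k at each relevant hit."""
        hits, precision_sum = 0, 0.0
        for k, doc_id in enumerate(ranked_docs, start=1):
            if doc_id in relevant:
                hits += 1
                precision_sum += hits / k
        return precision_sum / len(relevant) if relevant else 0.0

    def mean_average_precision(run, qrels):
        """MAP over all topics: `run` maps topic -> ranked doc ids, `qrels` maps topic -> set of relevant doc ids."""
        scores = [average_precision(run[t], qrels.get(t, set())) for t in run]
        return sum(scores) / len(scores) if scores else 0.0

    # Toy example with two topics.
    run = {"26": ["d3", "d7", "d1"], "27": ["d2", "d9"]}
    qrels = {"26": {"d3", "d1"}, "27": {"d5"}}
    print(f"MAP = {mean_average_precision(run, qrels):.4f}")

In practice, a shared infrastructure fixes the run and judgement formats so that every participating system can be scored with the same script, making results directly comparable across groups.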
