CIIR Talk Series: Coopetitions for IR Evaluation

Friday, 10/14/2022 1:30pm to 2:30pm
CS 150/151; Zoom

Title: Coopetitions for IR Evaluation

Abstract: Coopetitions are activities in which competitors cooperate for a common good. Community evaluations such as the Text REtrieval Conference (TREC, trec.nist.gov) are prototypical examples of coopetitions for search and have now been part of the field for more than thirty years. This longevity and the proliferation of shared evaluation tasks suggest that, indeed, the net impact of community evaluations is positive. But what are these benefits, and what are the attendant costs? This talk will use TREC tracks as case studies to explore the benefits and disadvantages of different evaluation task designs. Coopetitions can improve state-of-the-art effectiveness for a retrieval task by establishing a research cohort and constructing the infrastructure---including problem definition, test collections, scoring metrics, and research methodology---necessary to make progress on the task. They can also facilitate technology transfer and amortize the infrastructure costs. The primary danger of coopetitions is for an entire research community to overfit to some peculiarity of the evaluation task. This risk can be minimized by building multiple test sets and regularly updating the evaluation task.

Bio: Ellen Voorhees is a Fellow at the US National Institute of Standards and Technology (NIST). For most of her tenure at NIST she managed the Text REtrieval Conference (TREC) project, a project that develops the infrastructure required for large-scale evaluation of search engines and other information access technology. Currently she is examining how best to bring the benefits of large-scale community evaluations to bear on the problems of trustworthy AI. Voorhees' general research focuses on developing and validating appropriate evaluation schemes to measure system effectiveness for diverse user tasks.

Voorhees is a fellow of the ACM, a member of the ACM SIGIR Academy, and has been elected as a fellow of the Washington Academy of Sciences. She has published numerous articles on information retrieval techniques and evaluation methodologies and serves on the review boards of several journals and conferences.

To attend this talk via Zoom, use the link in the event advertisement on the seminars email list, which also includes the passcode for this series, or reach out to Alex Taubman. For any questions about this event with the Center for Intelligent Information Retrieval, please contact Jean Joyce.