A1596
Title: Isotonic mechanism for exponential family estimation in machine learning peer review
Authors: Yuling Yan - University of Wisconsin-Madison (United States) [presenting]
Abstract: In 2023, the International Conference on Machine Learning (ICML) required authors with multiple submissions to rank their submissions by perceived quality. The aim is to employ these author-specified rankings to enhance peer review in machine learning conferences by extending the isotonic mechanism to exponential family distributions. The mechanism generates adjusted scores that are as close as possible to the original review scores while conforming to the author-specified ranking. Although the mechanism applies to a broad class of exponential family distributions, implementing it does not require knowledge of the specific form of the distribution. It is demonstrated that an author is incentivized to provide an accurate ranking when her utility is a convex additive function of the adjusted scores of her submissions. For a certain subclass of exponential family distributions, it is proven that the author reports truthfully only if the elicited information consists solely of pairwise comparisons between her submissions, indicating that ranking is optimal for truthful information elicitation. Moreover, it is shown that the adjusted scores substantially improve estimation accuracy over the original scores and achieve nearly minimax optimality. The talk concludes with a numerical analysis of the ICML 2023 ranking data, which shows substantial gains in estimating a proxy for the ground-truth quality of the papers via the isotonic mechanism.
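The score adjustment described above can be illustrated with a short sketch. Under a squared-error criterion, projecting the raw scores onto the author's claimed ordering is an isotonic regression, solvable by the Pool Adjacent Violators Algorithm (PAVA). The function below is a hypothetical illustration of that projection, not the authors' reference implementation, and assumes scores are listed best-submission-first so the adjusted sequence must be nonincreasing.

```python
def isotonic_adjust(scores):
    """Project raw review scores onto an author-specified ranking.

    `scores` holds each submission's raw review score, listed in the
    author's claimed quality order (best first). PAVA returns the
    nonincreasing sequence closest to `scores` in squared error, which
    plays the role of the adjusted scores in the isotonic mechanism.
    Illustrative sketch only; names and conventions are assumptions.
    """
    # Each block pools adjacent entries that share one fitted value,
    # stored as [running sum, count].
    blocks = []
    for s in scores:
        blocks.append([s, 1])
        # Merge while adjacent block means violate nonincreasing order.
        while (len(blocks) > 1
               and blocks[-2][0] / blocks[-2][1] < blocks[-1][0] / blocks[-1][1]):
            total, count = blocks.pop()
            blocks[-1][0] += total
            blocks[-1][1] += count
    # Expand each pooled block back to per-submission adjusted scores.
    out = []
    for total, count in blocks:
        out.extend([total / count] * count)
    return out
```

For example, raw scores (4.0, 6.0, 3.0) contradict the claimed ranking at the first pair, so the first two submissions are pooled at their mean, giving adjusted scores (5.0, 5.0, 3.0); scores already consistent with the ranking are returned unchanged.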