Towards Relaxing the Unbiasedness Condition of Doubly Robust Estimators for Debiased Recommendation

24 Sept 2023 (modified: 25 Mar 2024) ICLR 2024 Conference Withdrawn Submission
Keywords: Recommender system, Selection bias, Doubly robust
Abstract: Recommender systems aim to recommend items or information that may be of interest to users based on their behaviors and preferences. However, sample selection bias may arise during data collection, i.e., the collected data are not representative of the target population. Many debiasing methods have been developed based on pseudo-labelings. Nevertheless, the effectiveness of these methods relies heavily on accurate pseudo-labelings (i.e., the imputed labels), which are difficult to obtain in practice. In this paper, in contrast to existing doubly robust estimators that take strictly accurate pseudo-labelings as an unbiasedness condition, we theoretically propose several novel doubly robust estimators that are unbiased when either (a) the pseudo-labelings deviate from the true labels with an arbitrary user-specific inductive bias, item-specific inductive bias, or a combination of both, or (b) the learned propensities are accurate. We further propose a principled propensity reconstruction learning approach that adaptively updates the constraint weights using an attention mechanism and effectively controls the variance. In summary, the proposed methods greatly relax the unbiasedness condition of the widely adopted doubly robust estimators, which empirically results in much lower bias. Extensive experiments show that our approach outperforms the state-of-the-art on one semi-synthetic dataset and three real-world datasets.
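For context, the sketch below illustrates the standard doubly robust (DR) estimator that the abstract refers to, and how it corrects the bias of a naive estimator under selection bias when propensities are accurate even if the imputed losses are not. This is background only, not the paper's relaxed estimators; all quantities (`true_loss`, `imputed_loss`, `propensity`, `observed`) are simulated for illustration.

```python
# Minimal sketch of the standard DR estimator for debiased recommendation.
# Assumption: higher-loss user-item pairs are observed less often, creating
# selection bias; the imputed losses carry a systematic offset.
import numpy as np

rng = np.random.default_rng(0)
n_pairs = 100_000                                        # user-item pairs in D

true_loss = rng.uniform(0.0, 1.0, n_pairs)               # e_{u,i}: loss on true labels
imputed_loss = true_loss + rng.normal(0.1, 0.05, n_pairs)  # \hat e_{u,i}: inaccurate pseudo-labelings
propensity = 0.9 - 0.7 * true_loss                       # p_{u,i}: true observation probability
observed = rng.binomial(1, propensity)                   # o_{u,i}: exposure indicator (MNAR)

# Ideal (oracle) loss over all pairs -- the target of an unbiased estimator.
ideal = true_loss.mean()

# Naive estimator: averages the loss over observed pairs only (biased here,
# because observation probability depends on the loss).
naive = true_loss[observed == 1].mean()

# DR estimator: error-imputation baseline plus an inverse-propensity correction.
# It is unbiased if either the imputations or the propensities are accurate.
dr = (imputed_loss + observed * (true_loss - imputed_loss) / propensity).mean()

print(f"ideal={ideal:.4f}  naive={naive:.4f}  dr={dr:.4f}")
```

Running this shows the naive estimate drifting away from the ideal loss while the DR estimate stays close to it, since accurate propensities alone suffice for unbiasedness; the paper's contribution is to further relax the accuracy required of the pseudo-labelings.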
Supplementary Material: zip
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9439