Keywords: Global pooling, regularized optimal transport, Bregman ADMM, multi-instance learning, graph embedding
TL;DR: We develop a novel, principled global pooling framework through the lens of optimal transport, which covers many existing pooling methods and performs well on various learning problems.
Abstract: Global pooling is one of the most significant operations in many machine learning models and tasks, yet its implementation is often chosen empirically in practice. In this study, we develop a novel, principled global pooling framework through the lens of optimal transport. We demonstrate that most existing global pooling methods are equivalent to solving specializations of an unbalanced optimal transport (UOT) problem. By making the parameters of the UOT problem learnable, we unify various global pooling methods in the same framework and, accordingly, propose a generalized global pooling layer called UOT-Pooling (UOTP) for neural networks. Besides implementing the UOTP layer based on the classic Sinkhorn-scaling algorithm, we design new model architectures based on the Bregman ADMM algorithm, which has comparable complexity but better numerical stability. We test our UOTP layers in several application scenarios, including multi-instance learning, graph classification, and image classification. In these applications, our UOTP layers can either imitate conventional global pooling layers or learn new pooling mechanisms that perform better.
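The abstract casts pooling as solving an entropic unbalanced optimal transport problem via Sinkhorn scaling. The following is a minimal illustrative sketch, not the authors' UOTP layer: it uses the standard Sinkhorn-style scaling iterations for entropic UOT (with a marginal-relaxation exponent `tau / (tau + eps)`), takes the negated instance features as the cost so that larger feature values attract more transport mass, and pools each feature dimension as a transport-weighted average. All variable names and hyperparameter values here are illustrative assumptions.

```python
import numpy as np

def uot_sinkhorn(C, a, b, eps=0.1, tau=1.0, n_iters=200):
    """Entropic unbalanced OT via Sinkhorn-style scaling (illustrative sketch).

    C: cost matrix (N x D); a, b: source/target marginal weights.
    eps: entropic regularization; tau: marginal-relaxation strength.
    """
    K = np.exp(-C / eps)            # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    rho = tau / (tau + eps)         # softened-marginal exponent (rho -> 1 recovers balanced OT)
    for _ in range(n_iters):
        u = (a / (K @ v)) ** rho
        v = (b / (K.T @ u)) ** rho
    return u[:, None] * K * v[None, :]   # transport plan P (N x D)

# Toy "pooling": 5 instances with 3-dimensional features.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
a = np.full(5, 1 / 5)               # uniform instance weights
b = np.full(3, 1 / 3)               # uniform feature-dimension weights
P = uot_sinkhorn(-X, a, b, eps=0.5)  # cost = -X: high values get more mass
pooled = (P * X).sum(axis=0) / P.sum(axis=0)  # per-dimension weighted average
```

With a large `eps` the plan spreads mass nearly uniformly and `pooled` approaches mean pooling; as `eps` shrinks, mass concentrates on the largest entries per dimension and the result moves toward max pooling, which is the sense in which a single parameterized UOT problem can interpolate between conventional pooling operators.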
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Deep Learning and representational learning
Supplementary Material: zip