Q-Ensemble for Offline RL: Don't Scale the Ensemble, Scale the Batch Size

Published: 01 Feb 2023, Last Modified: 13 Feb 2023, Submitted to ICLR 2023, Readers: Everyone
Keywords: Offline Reinforcement Learning, Q-Ensemble, Large Batch Optimization, Ensemble Based Reinforcement Learning
TL;DR: Large-batch optimization for SAC-N allows the size of the Q-ensemble to be reduced and improves convergence time by 2.5x on average.
Abstract: Training large neural networks is known to be time-consuming, with learning durations that may stretch to days or weeks. To address this problem, large-batch optimization was introduced, demonstrating that scaling mini-batch sizes with appropriate learning rate adjustments can speed up the training process by orders of magnitude. While long training time was not typically a major issue for model-free deep offline RL algorithms, recently introduced Q-ensemble methods that achieve state-of-the-art performance have made this issue more relevant by notably extending the training duration. In this work, we demonstrate how large-batch optimization, typically overlooked in the deep offline RL community, can benefit this class of methods. We show that simply scaling the mini-batch size and naively adjusting the learning rate allows for (1) a reduced size of the Q-ensemble, (2) stronger penalization of out-of-distribution actions, and (3) improved convergence time, effectively shortening training duration by 2.5x on average.
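To make the recipe stated in the abstract concrete, the sketch below illustrates the general idea rather than the authors' exact configuration: the mini-batch size is multiplied by a factor k, the learning rate is adjusted with the linear scaling rule (assumed here as one common "naive" adjustment), and the number of critics in the SAC-N Q-ensemble is reduced. All dimensions, hyperparameter values, and the network architecture are hypothetical placeholders.

```python
import torch
import torch.nn as nn

# Hypothetical environment dimensions, for illustration only.
STATE_DIM, ACTION_DIM = 17, 6

def make_critic() -> nn.Module:
    """A small Q-network mapping (state, action) to a scalar Q-value."""
    return nn.Sequential(
        nn.Linear(STATE_DIM + ACTION_DIM, 256), nn.ReLU(),
        nn.Linear(256, 256), nn.ReLU(),
        nn.Linear(256, 1),
    )

# Baseline SAC-N-style settings (assumed typical values, not the paper's exact ones).
base_batch_size, base_lr, base_num_critics = 256, 3e-4, 10

# Large-batch variant: scale the mini-batch size by k, adjust the learning rate
# with the linear scaling rule (an assumption), and use a smaller Q-ensemble.
k = 4
batch_size = base_batch_size * k
lr = base_lr * k          # naive learning rate adjustment (linear scaling, assumed)
num_critics = 3           # reduced ensemble size, illustrative value

critics = [make_critic() for _ in range(num_critics)]
optimizer = torch.optim.Adam(
    (p for c in critics for p in c.parameters()), lr=lr
)
```

A square-root scaling rule (lr ∝ sqrt(k)) is another common choice for adapting the learning rate to larger batches; which adjustment the paper actually uses is not specified in the abstract.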
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Reinforcement Learning (eg, decision and control, planning, hierarchical RL, robotics)
Supplementary Material: zip