Convex quadratic programming-based predictors: An algorithmic framework and a study of possibilities and computational challenges
Confirmation: I have read and agree with the workshop's policy on behalf of myself and my co-authors.
Tracks: Main Track
Keywords: Time-series prediction, Inverse optimization, Bilevel optimization, Optimization-based predictive models
TL;DR: We present a class of predictive models based on convex quadratic programming, analyze their properties and prediction performance, and present a two-stage heuristic training algorithm.
Abstract: We present a class of predictive models for forecasting time-series data, referred to as convex quadratic programming-based (CQPB) predictors. The predictions are computed from the minimizer of a convex quadratic program in which previous observations enter as fixed parameters. The remaining parameters, including constraint and objective coefficients, are trainable. This work investigates the predictive capabilities of CQPB predictors and the computational challenges in their training. We analyze their properties and prove that this class of predictors includes classical autoregressive (AR) models, thus forming a generalization of AR models. The training problem is formulated as a bilevel optimization problem. To solve these training problems efficiently, we propose a two-stage heuristic algorithm based on the block coordinate descent approach. The results highlight the potential of CQPB predictors. Although training is challenging, our approach efficiently computes good solutions for moderate-size datasets.
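A minimal sketch of the idea described in the abstract, assuming one concrete parameterization that is not necessarily the authors' exact formulation: the forecast is the first coordinate of the minimizer of a convex QP whose linear term depends on the last p observations, and the matrices Q, W, A, b play the role of trainable parameters. The illustrative values at the end show how an AR(p) prediction arises as a special case.

```python
import numpy as np
import cvxpy as cp

def cqpb_predict(history, Q, W, A, b):
    """Forecast via argmin_x 0.5 x'Qx + (W @ history)'x  subject to  A x <= b.

    history : (p,) array of the p most recent observations (fixed QP parameters)
    Q, W, A, b : illustrative trainable parameters (hypothetical, not the paper's exact model)
    """
    x = cp.Variable(Q.shape[0])
    objective = cp.Minimize(0.5 * cp.quad_form(x, Q) + (W @ history) @ x)
    cp.Problem(objective, [A @ x <= b]).solve()
    return x.value[0]

# With Q = I, an inactive constraint, and W = -[a_1 ... a_p], the minimizer equals
# a . history, i.e. an AR(p) prediction -- illustrating the AR special case.
a = np.array([0.5, 0.3, 0.1])                 # hypothetical AR coefficients
history = np.array([1.2, -0.4, 0.8])          # y_{t-1}, y_{t-2}, y_{t-3}
Q = np.eye(1)
W = -a.reshape(1, -1)
A, b = np.zeros((1, 1)), np.array([1.0])      # trivially satisfied constraint
print(cqpb_predict(history, Q, W, A, b))      # ~ 0.5*1.2 + 0.3*(-0.4) + 0.1*0.8
```

In this reading, training amounts to choosing Q, W, A, b so that the QP minimizers match observed future values, which is the bilevel problem the abstract refers to (the QP is the lower level, the fitting objective the upper level).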
Submission Number: 68