Keywords: Temporal Sets, Temporal Data, Sequence Learning, Sequential Sets, Deep Learning, Temporal Data Forecasting
TL;DR: A scalable framework for temporal set prediction using structured set transformations that reduces compute cost while achieving state-of-the-art performance.
Abstract: Temporal set prediction involves forecasting the elements that will appear in the next set, given a sequence of prior sets, each containing a variable number of elements. Existing methods often rely on complex architectures with substantial computational overhead, limiting their scalability. In this work, we introduce a novel and scalable framework that leverages permutation-equivariant and permutation-invariant transformations to efficiently model set dynamics. Our approach significantly reduces training and inference time while maintaining competitive performance. Extensive experiments on multiple public datasets demonstrate that our method matches or surpasses state-of-the-art models across several evaluation metrics. These results highlight the effectiveness of our model in enabling efficient and scalable temporal set prediction.
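The permutation-equivariant and permutation-invariant transformations mentioned in the abstract can be illustrated with a minimal Deep Sets-style sketch; this is an illustrative assumption about how such maps behave, not the submission's actual architecture, and all function names and dimensions below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative only: a shared per-element map followed by sum pooling,
# demonstrating the permutation properties the abstract refers to.
D_IN, D_HID = 4, 8
W = rng.normal(size=(D_IN, D_HID))  # weights shared across set elements

def phi(set_elements):
    """Permutation-equivariant map: applied independently to each element,
    so reordering the input rows reorders the output rows identically."""
    return np.maximum(set_elements @ W, 0.0)  # per-element linear + ReLU

def encode_set(set_elements):
    """Permutation-invariant embedding: sum pooling discards element order,
    so any permutation of the input set yields the same vector."""
    return phi(set_elements).sum(axis=0)

# A "set" with a variable number of elements, as an (n_elements, D_IN) array.
s = rng.normal(size=(5, D_IN))
perm = rng.permutation(5)
assert np.allclose(encode_set(s), encode_set(s[perm]))  # order does not matter
```

Because the pooled embedding has a fixed size regardless of how many elements each set contains, a sequence model can consume one embedding per time step, which is one reason such transformations scale well on variable-sized sets.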
Primary Area: Deep learning (e.g., architectures, generative models, optimization for deep networks, foundation models, LLMs)
Submission Number: 23524