Attention-Based Neural Network: A Novel Approach for Predicting the Popularity of Online Content

Published: 2019 (HPCC/SmartCity/DSS 2019), Last Modified: 16 May 2025. License: CC BY-SA 4.0
Abstract: Since the rate at which new content is uploaded to the Internet has reached unprecedented levels, predicting the popularity of online content, especially video, is important for network management, recommendation schemes, service design, advertising planning, and more. Although various models have been developed, few of them address short-term popularity prediction. Toward this goal, we exploit the self-attention mechanism of the Transformer, a state-of-the-art model in neural machine translation, to forecast the values of multiple time series in the near future. Specifically, we propose an attention-based non-recursive neural network, a novel model that entirely dispenses with recurrence and convolutions, for time series prediction. Because our model combines the input attention mechanism of the dual-stage attention-based recurrent neural network (DA-RNN) with the self-attention of the Transformer, it can adaptively select the most relevant input sequences and capture long-term dependencies across previous time steps to make predictions. Experiments show that the root mean square errors (RMSEs) achieved by our model are only 6.06 and 3.60 on the NASDAQ 100 dataset and on the view counts of the most popular YouTube videos, respectively, while the RMSEs of DA-RNN are 8.52 and 12.31. Hence, our model outperforms the baseline not only in time series prediction but also in content popularity prediction.
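To make the described architecture concrete, below is a minimal, hypothetical sketch (not the authors' code) of the idea stated in the abstract: a DA-RNN-style input-attention stage that re-weights the driving series at each time step, followed by Transformer-style self-attention over the weighted history, with a linear head that forecasts the next value of the target series. All class names, layer sizes, and the 81-series input are illustrative assumptions.

```python
import torch
import torch.nn as nn


class InputAttention(nn.Module):
    """Scores each driving series at every time step and re-weights it (input attention)."""

    def __init__(self, n_series: int, hidden: int = 64):
        super().__init__()
        self.score = nn.Sequential(
            nn.Linear(n_series, hidden), nn.Tanh(), nn.Linear(hidden, n_series)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, T, n_series); softmax over the series dimension yields attention weights
        alpha = torch.softmax(self.score(x), dim=-1)
        return alpha * x


class AttnForecaster(nn.Module):
    """Input attention + self-attention encoder, with no recurrence or convolutions."""

    def __init__(self, n_series: int, d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.input_attn = InputAttention(n_series)
        self.embed = nn.Linear(n_series, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)  # predict the next value of the target series

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, T, n_series) -> one forecast per window
        h = self.encoder(self.embed(self.input_attn(x)))
        return self.head(h[:, -1])  # read out from the last time step's representation


if __name__ == "__main__":
    model = AttnForecaster(n_series=81)      # e.g., 81 driving series, as in common NASDAQ 100 setups
    window = torch.randn(32, 10, 81)         # batch of 32 windows, 10 past time steps each
    pred = model(window)
    target = torch.zeros_like(pred)          # dummy targets, for demonstration only
    rmse = torch.sqrt(torch.mean((pred - target) ** 2))  # RMSE, the metric reported in the abstract
    print(pred.shape, rmse.item())
```

The sketch omits positional encoding and training details; its only purpose is to show how an input-attention stage can feed a self-attention encoder so that the model both selects relevant driving series and attends across previous time steps, as the abstract describes.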