Abstract: Neural Architecture Search (NAS) has produced highly competitive neural architectures for many deep learning applications, some achieving state-of-the-art performance. Although many Recurrent Neural Network (RNN) variants, such as Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), are available for electric load demand forecasting, finding an optimal internal RNN structure remains of much interest. This work uses a NAS algorithm, Differentiable Architecture Search (DARTS), to generate a new RNN cell optimized for electric load demand forecasting. The generated RNN cell is used to construct models of varying complexity, from a single-cell model to multi-layer models obtained by stacking these cells appropriately. These models are compared with other popular RNN models, and the results establish the advantage of customizing the internal RNN structure over the general RNN variants.
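To make the DARTS idea mentioned in the abstract concrete, the sketch below illustrates its core mechanism: a continuous relaxation in which each candidate operation inside the cell is blended by a softmax over learnable architecture parameters, and the strongest candidate is kept after search. The candidate set, dimensions, and names here are illustrative assumptions, not the paper's actual search space.

```python
import numpy as np

# Illustrative candidate activations a DARTS-style search chooses among
# inside an RNN cell (assumed set; the paper's search space may differ).
CANDIDATES = {
    "tanh": np.tanh,
    "relu": lambda x: np.maximum(x, 0.0),
    "sigmoid": lambda x: 1.0 / (1.0 + np.exp(-x)),
    "identity": lambda x: x,
}

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

def mixed_op(x, alphas):
    """Continuous relaxation: softmax-weighted sum of all candidate ops."""
    w = softmax(alphas)
    return sum(wi * op(x) for wi, op in zip(w, CANDIDATES.values()))

def darts_rnn_step(x, h, Wx, Wh, alphas):
    """One RNN step whose activation is the relaxed mixture of candidates."""
    return mixed_op(x @ Wx + h @ Wh, alphas)

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4))            # one input step with 4 features
h = np.zeros((1, 8))                   # hidden state of size 8
Wx = rng.normal(size=(4, 8)) * 0.1
Wh = rng.normal(size=(8, 8)) * 0.1
alphas = np.zeros(len(CANDIDATES))     # architecture params (learned in DARTS)

h_next = darts_rnn_step(x, h, Wx, Wh, alphas)
print(h_next.shape)                    # (1, 8)

# After search, discretize: keep the op with the largest alpha.
chosen = list(CANDIDATES)[int(np.argmax(alphas))]
```

In full DARTS, the `alphas` are optimized by gradient descent on validation loss jointly with the cell weights; the discovered cell can then be stacked into the single- and multi-layer forecasting models the abstract describes.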
External IDs: dblp:conf/tencon/BijuPS19