
Publication

DartsReNet: Exploring NAS for RNNs applied to Image Classification

Brian Moser; Federico Raue; Jörn Hees; Andreas Dengel
In: Masulli; Farkaš (Eds.). Artificial Neural Networks and Machine Learning - ICANN 2020. International Conference on Artificial Neural Networks (ICANN-2020), Springer, 2020.

Abstract

We present new Recurrent Neural Network (RNN) cells for image classification using a Neural Architecture Search (NAS) approach called DARTS. We are interested in the ReNet architecture, an RNN-based approach presented as an alternative to convolutional and pooling steps. ReNet can be defined using any standard RNN cell, such as LSTM and GRU. One limitation is that standard RNN cells were designed for one-dimensional sequential data rather than the two-dimensional inputs encountered in image classification. We overcome this limitation by using DARTS to find new cell designs. We compare our results with ReNet using GRU and LSTM cells. Our found cells outperform the standard RNN cells on CIFAR-10 and SVHN. The improvements on SVHN indicate generalizability, as we derived the RNN cell designs from CIFAR-10 without performing a new cell search for SVHN.
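To illustrate the ReNet idea the abstract builds on, the sketch below shows a minimal ReNet-style layer in PyTorch: the image is split into non-overlapping patches, and bidirectional GRUs sweep over the patch grid vertically and then horizontally, replacing a convolution + pooling step with recurrent aggregation. This is a hypothetical illustration under assumed names and parameters (ReNetLayer, patch_size, hidden_size), not the authors' DartsReNet code, and it uses a standard GRU cell rather than the searched cells described in the paper.

    # Minimal ReNet-style layer (illustrative sketch, not the authors' implementation)
    import torch
    import torch.nn as nn

    class ReNetLayer(nn.Module):
        def __init__(self, in_channels, patch_size, hidden_size):
            super().__init__()
            self.patch_size = patch_size
            patch_dim = in_channels * patch_size * patch_size
            # vertical sweep: a bidirectional GRU over each column of patches
            self.vertical = nn.GRU(patch_dim, hidden_size,
                                   bidirectional=True, batch_first=True)
            # horizontal sweep: a bidirectional GRU over each row of the vertical output
            self.horizontal = nn.GRU(2 * hidden_size, hidden_size,
                                     bidirectional=True, batch_first=True)

        def forward(self, x):
            b, c, h, w = x.shape
            p = self.patch_size
            gh, gw = h // p, w // p
            # split the image into a (gh x gw) grid of flattened patches
            patches = x.unfold(2, p, p).unfold(3, p, p)            # (b, c, gh, gw, p, p)
            patches = patches.permute(0, 2, 3, 1, 4, 5).reshape(b, gh, gw, -1)

            # vertical sweep: each of the gw columns is a sequence of gh patches
            cols = patches.permute(0, 2, 1, 3).reshape(b * gw, gh, -1)
            v, _ = self.vertical(cols)                             # (b*gw, gh, 2*hidden)
            v = v.reshape(b, gw, gh, -1).permute(0, 2, 1, 3)       # (b, gh, gw, 2*hidden)

            # horizontal sweep: each of the gh rows is a sequence of gw features
            rows = v.reshape(b * gh, gw, -1)
            hfeat, _ = self.horizontal(rows)                       # (b*gh, gw, 2*hidden)
            return hfeat.reshape(b, gh, gw, -1).permute(0, 3, 1, 2)  # (b, 2*hidden, gh, gw)

    # usage: a 32x32 RGB image (e.g. CIFAR-10) reduced to an 8x8 feature map
    layer = ReNetLayer(in_channels=3, patch_size=4, hidden_size=64)
    features = layer(torch.randn(2, 3, 32, 32))   # -> torch.Size([2, 128, 8, 8])

In this framing, swapping nn.GRU for a different cell design is what the DARTS search in the paper optimizes over.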

Projects