DFKI-LT - Neural Associative Memory for Dual-Sequence Modeling
In Proceedings of the 1st Workshop on Representation Learning for NLP, Berlin, Germany, ACL, 2016.
Many important NLP problems can be posed as dual-sequence or sequence-to-sequence modeling tasks. Recent advances in building end-to-end neural architectures have been highly successful in solving such tasks. In this work we propose a new architecture for dual-sequence modeling that is based on associative memory. We derive AM-RNNs, a recurrent associative memory (AM) which augments generic recurrent neural networks (RNNs). This architecture is extended to the Dual AM-RNN, which operates on two AMs at once. Our models achieve very competitive results on textual entailment. A qualitative analysis demonstrates that long-range dependencies between source and target sequence can be bridged effectively using Dual AM-RNNs. However, an initial experiment on auto-encoding reveals that these benefits are not exploited by the system when learning to solve sequence-to-sequence tasks, which indicates that additional supervision or regularization is needed.
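The abstract does not spell out the memory mechanism itself, but associative memories of this flavor are commonly built on holographic reduced representations, where a key-value pair is written into a single memory trace by circular convolution and read back by circular correlation. The following is a minimal NumPy sketch of that write/read cycle under this assumption; all function names are illustrative and this is not the paper's implementation.

```python
import numpy as np

def make_unitary(x):
    # Normalize the Fourier spectrum to unit magnitude so that
    # circular correlation exactly inverts circular convolution.
    f = np.fft.fft(x)
    return np.real(np.fft.ifft(f / np.abs(f)))

def bind(key, value):
    # Write operation: circular convolution of key and value.
    return np.real(np.fft.ifft(np.fft.fft(key) * np.fft.fft(value)))

def unbind(key, memory):
    # Read operation: circular correlation, the approximate inverse of bind.
    return np.real(np.fft.ifft(np.conj(np.fft.fft(key)) * np.fft.fft(memory)))

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
n = 512
k1 = make_unitary(rng.normal(size=n))
k2 = make_unitary(rng.normal(size=n))
v1 = rng.normal(size=n) / np.sqrt(n)
v2 = rng.normal(size=n) / np.sqrt(n)

# Superpose two key-value bindings in a single memory trace.
memory = bind(k1, v1) + bind(k2, v2)

# Reading with k1 recovers v1 plus crosstalk noise from the other pair;
# reading the wrong value with the same key yields near-zero similarity.
cos_correct = cosine(unbind(k1, memory), v1)
cos_other = cosine(unbind(k1, memory), v2)
```

Because the keys are made unitary, unbinding with the correct key returns the stored value exactly, corrupted only by crosstalk from the other stored pair, so `cos_correct` stays well above `cos_other`.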
Files: BibTeX, W16-1630.pdf