Recurrent Neural Network Grammars
Authors | Chris Dyer, Adhiguna Kuncoro, Miguel Ballesteros, Noah A. Smith |
Journal/Conference Name | NAACL 2016 |
Paper Category | Artificial Intelligence |
Paper Abstract | We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure. We explain efficient inference procedures that allow application to both parsing and language modeling. Experiments show that they provide better parsing in English than any single previously published supervised generative model and better language modeling than state-of-the-art sequential RNNs in English and Chinese. |
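The abstract describes models of sentences with explicit phrase structure. The paper's generative model builds a tree through a sequence of transition actions: NT(X) opens a nonterminal X, GEN(w) generates a terminal word w, and REDUCE closes the most recently opened constituent. The sketch below (not the authors' code, which models action probabilities with RNNs over the stack) only replays a fixed action sequence to show the tree that a sequence of such actions defines; the function and action names are illustrative.

```python
# Toy sketch of the RNNG transition system: replay a fixed action
# sequence and recover the bracketed phrase-structure tree it defines.
# The real model additionally assigns a probability to each action with
# RNNs over the stack, action history, and generated words; that part is
# omitted here.

def replay(actions):
    """Replay NT/GEN/REDUCE actions; return the bracketed tree string."""
    stack = []
    for act in actions:
        if act[0] == "NT":
            # push an open-nonterminal marker, e.g. (NP awaiting children
            stack.append(("open", act[1]))
        elif act[0] == "GEN":
            # generate a terminal word onto the stack
            stack.append(act[1])
        else:  # REDUCE
            # pop completed children back to the nearest open nonterminal,
            # then replace them all with one closed constituent
            children = []
            while not (isinstance(stack[-1], tuple) and stack[-1][0] == "open"):
                children.append(stack.pop())
            _, label = stack.pop()
            stack.append("(" + label + " " + " ".join(reversed(children)) + ")")
    assert len(stack) == 1, "a valid action sequence builds exactly one tree"
    return stack[0]

actions = [
    ("NT", "S"),
    ("NT", "NP"), ("GEN", "the"), ("GEN", "hungry"), ("GEN", "cat"), ("REDUCE",),
    ("NT", "VP"), ("GEN", "meows"), ("REDUCE",),
    ("REDUCE",),
]
print(replay(actions))
# (S (NP the hungry cat) (VP meows))
```

Because every action sequence corresponds to exactly one (tree, sentence) pair, a model over action sequences is jointly a parser and a language model, which is how the paper applies one model to both tasks.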
Date of publication | 2016 |
Code Programming Language | Multiple |