File Download
Links for fulltext (may require subscription):
- Publisher Website: https://doi.org/10.18653/v1/E17-1117
- Scopus: eid_2-s2.0-85021687903
Citations:
- Scopus: 0
Conference Paper: What do recurrent neural network grammars learn about syntax?
Title | What do recurrent neural network grammars learn about syntax? |
---|---|
Authors | Kuncoro, Adhiguna; Ballesteros, Miguel; Kong, Lingpeng; Dyer, Chris; Neubig, Graham; Smith, Noah A. |
Issue Date | 2017 |
Citation | The 15th Conference of the European Chapter of the Association for Computational Linguistics, Valencia, Spain, 3-7 April 2017. In Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics, 2017, v. 1, p. 1249-1258 |
Abstract | © 2017 Association for Computational Linguistics. Recurrent neural network grammars (RNNG) are a recently proposed probabilistic generative modeling family for natural language. They show state-of-the-art language modeling and parsing performance. We investigate what information they learn, from a linguistic perspective, through various ablations to the model and the data, and by augmenting the model with an attention mechanism (GA-RNNG) to enable closer inspection. We find that explicit modeling of composition is crucial for achieving the best performance. Through the attention mechanism, we find that headedness plays a central role in phrasal representation (with the model's latent attention largely agreeing with predictions made by hand-crafted head rules, albeit with some important differences). By training grammars without nonterminal labels, we find that phrasal representations depend minimally on nonterminals, providing support for the endocentricity hypothesis. |
Persistent Identifier | http://hdl.handle.net/10722/296152 |
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kuncoro, Adhiguna | - |
dc.contributor.author | Ballesteros, Miguel | - |
dc.contributor.author | Kong, Lingpeng | - |
dc.contributor.author | Dyer, Chris | - |
dc.contributor.author | Neubig, Graham | - |
dc.contributor.author | Smith, Noah A. | - |
dc.date.accessioned | 2021-02-11T04:52:57Z | - |
dc.date.available | 2021-02-11T04:52:57Z | - |
dc.date.issued | 2017 | - |
dc.identifier.citation | The 15th Conference of the European Chapter of the Association for Computational Linguistics, Valencia, Spain, 3-7 April 2017. In Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics, 2017, v. 1, p. 1249-1258 | - |
dc.identifier.uri | http://hdl.handle.net/10722/296152 | - |
dc.description.abstract | © 2017 Association for Computational Linguistics. Recurrent neural network grammars (RNNG) are a recently proposed probabilistic generative modeling family for natural language. They show state-of-the-art language modeling and parsing performance. We investigate what information they learn, from a linguistic perspective, through various ablations to the model and the data, and by augmenting the model with an attention mechanism (GA-RNNG) to enable closer inspection. We find that explicit modeling of composition is crucial for achieving the best performance. Through the attention mechanism, we find that headedness plays a central role in phrasal representation (with the model's latent attention largely agreeing with predictions made by hand-crafted head rules, albeit with some important differences). By training grammars without nonterminal labels, we find that phrasal representations depend minimally on nonterminals, providing support for the endocentricity hypothesis. | - |
dc.language | eng | - |
dc.relation.ispartof | Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics | - |
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
dc.title | What do recurrent neural network grammars learn about syntax? | - |
dc.type | Conference_Paper | - |
dc.description.nature | published_or_final_version | - |
dc.identifier.doi | 10.18653/v1/e17-1117 | - |
dc.identifier.scopus | eid_2-s2.0-85021687903 | - |
dc.identifier.volume | 1 | - |
dc.identifier.spage | 1249 | - |
dc.identifier.epage | 1258 | - |