Links for fulltext (may require subscription):
- Publisher Website (DOI): 10.1162/tacl_a_00476
- Scopus: eid_2-s2.0-85132612784
- WOS: WOS:000923421200005
Article: Relational Memory-Augmented Language Models
| Title | Relational Memory-Augmented Language Models |
|---|---|
| Authors | Liu, Qi; Yogatama, Dani; Blunsom, Phil |
| Issue Date | 2022 |
| Citation | Transactions of the Association for Computational Linguistics, 2022, v. 10, p. 555-572 |
| Abstract | We present a memory-augmented approach to condition an autoregressive language model on a knowledge graph. We represent the graph as a collection of relation triples and retrieve relevant relations for a given context to improve text generation. Experiments on WikiText-103, WMT19, and enwik8 English datasets demonstrate that our approach produces a better language model in terms of perplexity and bits per character. We also show that relational memory improves coherence, is complementary to token-based memory, and enables causal interventions. Our model provides a simple yet effective way to combine an autoregressive language model and a knowledge graph for more coherent and logical generation. |
| Persistent Identifier | http://hdl.handle.net/10722/321997 |
| ISI Accession Number ID | WOS:000923421200005 |
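
The abstract describes representing a knowledge graph as a collection of relation triples and retrieving relations relevant to the current context to condition an autoregressive language model. The sketch below illustrates only that general idea; it is not the authors' implementation. The `Triple` class, the lexical-overlap retrieval heuristic, and the text-concatenation conditioning are all assumptions made for this example.

```python
# Illustrative sketch only: condition generation on knowledge-graph triples
# retrieved for the current context. Names and heuristics here are
# assumptions for illustration, not the paper's method.

from dataclasses import dataclass

@dataclass(frozen=True)
class Triple:
    head: str
    relation: str
    tail: str

# A knowledge graph represented as a collection of relation triples.
GRAPH = [
    Triple("Barack Obama", "born_in", "Honolulu"),
    Triple("Barack Obama", "occupation", "politician"),
    Triple("Honolulu", "located_in", "Hawaii"),
]

def retrieve_relations(context: str, graph: list[Triple], k: int = 2) -> list[Triple]:
    """Score each triple by word overlap with the context and keep the top k.

    A simple lexical-overlap scorer stands in for the paper's retrieval of
    context-relevant relations.
    """
    ctx_words = set(context.lower().split())

    def score(t: Triple) -> int:
        return len(set(f"{t.head} {t.tail}".lower().split()) & ctx_words)

    ranked = sorted(graph, key=score, reverse=True)
    return [t for t in ranked if score(t) > 0][:k]

def build_conditioned_input(context: str, graph: list[Triple]) -> str:
    """Linearize the retrieved triples ahead of the context so an
    autoregressive LM can attend to them while generating."""
    triples = retrieve_relations(context, graph)
    memory = " ; ".join(f"{t.head} | {t.relation} | {t.tail}" for t in triples)
    return f"[RELATIONS] {memory} [CONTEXT] {context}"

if __name__ == "__main__":
    ctx = "Barack Obama grew up in Honolulu and later"
    # Prints the two highest-overlap triples prepended to the context.
    print(build_conditioned_input(ctx, GRAPH))
```

In the paper the retrieved relations augment the model as a relational memory rather than as literal prompt text; the plain-text concatenation above is simply the shortest way to show the conditioning step.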
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Liu, Qi | - |
| dc.contributor.author | Yogatama, Dani | - |
| dc.contributor.author | Blunsom, Phil | - |
| dc.date.accessioned | 2022-11-03T02:22:54Z | - |
| dc.date.available | 2022-11-03T02:22:54Z | - |
| dc.date.issued | 2022 | - |
| dc.identifier.citation | Transactions of the Association for Computational Linguistics, 2022, v. 10, p. 555-572 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/321997 | - |
| dc.description.abstract | We present a memory-augmented approach to condition an autoregressive language model on a knowledge graph. We represent the graph as a collection of relation triples and retrieve relevant relations for a given context to improve text generation. Experiments on WikiText-103, WMT19, and enwik8 English datasets demonstrate that our approach produces a better language model in terms of perplexity and bits per character. We also show that relational memory improves coherence, is complementary to token-based memory, and enables causal interventions. Our model provides a simple yet effective way to combine an autoregressive language model and a knowledge graph for more coherent and logical generation. | - |
| dc.language | eng | - |
| dc.relation.ispartof | Transactions of the Association for Computational Linguistics | - |
| dc.title | Relational Memory-Augmented Language Models | - |
| dc.type | Article | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1162/tacl_a_00476 | - |
| dc.identifier.scopus | eid_2-s2.0-85132612784 | - |
| dc.identifier.volume | 10 | - |
| dc.identifier.spage | 555 | - |
| dc.identifier.epage | 572 | - |
| dc.identifier.eissn | 2307-387X | - |
| dc.identifier.isi | WOS:000923421200005 | - |
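
For reference, the two evaluation metrics named in the abstract (perplexity for the word-level WikiText-103 and WMT19 datasets, bits per character for the character-level enwik8) are both functions of the model's negative log-likelihood. The sketch below shows the standard conversions; the loss and count values are made-up numbers for illustration.

```python
import math

# Made-up numbers for illustration: total negative log-likelihood in nats
# over an evaluation set, plus its token and character counts.
total_nll_nats = 34_500.0
num_tokens = 10_000
num_chars = 48_000

# Perplexity: exponentiated average NLL per token (word/subword-level LMs).
perplexity = math.exp(total_nll_nats / num_tokens)

# Bits per character: total NLL converted from nats to bits, averaged per
# character (character-level LMs such as enwik8).
bits_per_char = total_nll_nats / (math.log(2) * num_chars)

print(f"perplexity = {perplexity:.2f}")    # lower is better
print(f"bits/char  = {bits_per_char:.3f}") # lower is better
```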
