
Conference Paper: DYLE: Dynamic Latent Extraction for Abstractive Long-Input Summarization

Title: DYLE: Dynamic Latent Extraction for Abstractive Long-Input Summarization
Authors: Mao, Z; Wu, C; Ni, A; Zhang, Y; Zhang, R; Yu, T; Deb, B; Zhu, C; Awadallah, A; Radev, D
Issue Date: 2022
Publisher: Association for Computational Linguistics
Citation: The 60th Annual Meeting of the Association for Computational Linguistics, Proceedings of the 60th Annual Meeting of ACL, p. 1687–1698
Abstract: Transformer-based models have achieved state-of-the-art performance on short-input summarization. However, they still struggle with summarizing longer text. In this paper, we present DYLE, a novel dynamic latent extraction approach for abstractive long-input summarization. DYLE jointly trains an extractor and a generator and treats the extracted text snippets as the latent variable, allowing dynamic snippet-level attention weights during decoding. To provide adequate supervision, we propose simple yet effective heuristics for oracle extraction as well as a consistency loss term, which encourages the extractor to approximate the averaged dynamic weights predicted by the generator. We evaluate our method on different long-document and long-dialogue summarization tasks: GovReport, QMSum, and arXiv. Experimental results show that DYLE outperforms all existing methods on GovReport and QMSum, with gains of up to 6.1 ROUGE, while yielding strong results on arXiv. Further analysis shows that the proposed dynamic weights make the generation process interpretable.
Persistent Identifier: http://hdl.handle.net/10722/320030
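
The abstract describes the key mechanism only at a high level: the generator produces per-snippet token distributions, dynamic snippet-level weights combine them at each decoding step, and a consistency loss pulls the extractor's snippet scores toward the generator's time-averaged weights. Below is a minimal, hypothetical PyTorch sketch of that idea, based solely on the abstract and not on the authors' released code; all names (dyle_style_step, snippet_token_logits, and so on) are illustrative assumptions.

    import torch
    import torch.nn.functional as F

    def dyle_style_step(snippet_token_logits, gen_snippet_scores, ext_snippet_scores):
        # snippet_token_logits: (K, V) per-snippet next-token logits from the generator
        # gen_snippet_scores:   (T, K) generator's dynamic snippet scores per decoding step
        # ext_snippet_scores:   (K,)   extractor's scores for the K extracted snippets

        # Dynamic snippet-level weights at the current step (step 0 here).
        weights = F.softmax(gen_snippet_scores[0], dim=-1)                 # (K,)
        # Treat snippets as a latent variable: the output token distribution
        # is a weighted mixture of the per-snippet token distributions.
        token_probs = weights @ F.softmax(snippet_token_logits, dim=-1)   # (V,)

        # Consistency loss: the extractor's distribution over snippets should
        # approximate the generator's dynamic weights averaged over all steps.
        avg_weights = F.softmax(gen_snippet_scores, dim=-1).mean(dim=0)   # (K,)
        ext_log_probs = F.log_softmax(ext_snippet_scores, dim=-1)
        consistency_loss = F.kl_div(ext_log_probs, avg_weights, reduction="sum")
        return token_probs, consistency_loss

    # Toy usage with random tensors: K=4 snippets, V=10 vocabulary, T=6 steps.
    probs, loss = dyle_style_step(torch.randn(4, 10), torch.randn(6, 4), torch.randn(4))

At training time the consistency loss would be added to the generation loss; a KL divergence is one natural way to match the two distributions, though the paper's exact formulation may differ.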

 

DC Field: Value

dc.contributor.author: Mao, Z
dc.contributor.author: Wu, C
dc.contributor.author: Ni, A
dc.contributor.author: Zhang, Y
dc.contributor.author: Zhang, R
dc.contributor.author: Yu, T
dc.contributor.author: Deb, B
dc.contributor.author: Zhu, C
dc.contributor.author: Awadallah, A
dc.contributor.author: Radev, D
dc.date.accessioned: 2022-10-14T05:24:10Z
dc.date.available: 2022-10-14T05:24:10Z
dc.date.issued: 2022
dc.identifier.citation: The 60th Annual Meeting of the Association for Computational Linguistics, Proceedings of the 60th Annual Meeting of ACL, p. 1687–1698
dc.identifier.uri: http://hdl.handle.net/10722/320030
dc.description.abstract: Transformer-based models have achieved state-of-the-art performance on short-input summarization. However, they still struggle with summarizing longer text. In this paper, we present DYLE, a novel dynamic latent extraction approach for abstractive long-input summarization. DYLE jointly trains an extractor and a generator and treats the extracted text snippets as the latent variable, allowing dynamic snippet-level attention weights during decoding. To provide adequate supervision, we propose simple yet effective heuristics for oracle extraction as well as a consistency loss term, which encourages the extractor to approximate the averaged dynamic weights predicted by the generator. We evaluate our method on different long-document and long-dialogue summarization tasks: GovReport, QMSum, and arXiv. Experimental results show that DYLE outperforms all existing methods on GovReport and QMSum, with gains of up to 6.1 ROUGE, while yielding strong results on arXiv. Further analysis shows that the proposed dynamic weights make the generation process interpretable.
dc.language: eng
dc.publisher: Association for Computational Linguistics
dc.relation.ispartof: The 60th Annual Meeting of the Association for Computational Linguistics
dc.title: DYLE: Dynamic Latent Extraction for Abstractive Long-Input Summarization
dc.type: Conference_Paper
dc.identifier.email: Yu, T: taoyds@hku.hk
dc.identifier.authority: Yu, T=rp02864
dc.identifier.hkuros: 339276
dc.identifier.volume: Proceedings of the 60th Annual Meeting of ACL
dc.identifier.spage: 1687
dc.identifier.epage: 1698
dc.publisher.place: Dublin, Ireland
