Conference Paper: Sentence-state LSTM for text representation

Title: Sentence-state LSTM for text representation
Authors: Zhang, Yue; Liu, Qi; Song, Linfeng
Issue Date: 2018
Citation: ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers), 2018, v. 1, p. 317-327
Abstract: Bi-directional LSTMs are a powerful tool for text representation. On the other hand, they have been shown to suffer from various limitations due to their sequential nature. We investigate an alternative LSTM structure for encoding text, which consists of a parallel state for each word. Recurrent steps are used to perform local and global information exchange between words simultaneously, rather than incremental reading of a sequence of words. Results on various classification and sequence labelling benchmarks show that the proposed model has strong representation power, giving highly competitive performance compared to stacked BiLSTM models with similar parameter numbers.
Persistent Identifier: http://hdl.handle.net/10722/321838
ISI Accession Number ID: WOS:000493904300030
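The abstract sketches the core idea: instead of reading words one by one, every word keeps its own state, updated in parallel alongside a single global sentence state, and repeated recurrent steps exchange local (neighbour) and global information simultaneously. The following is a minimal, hypothetical PyTorch sketch of that exchange pattern only; it uses a single update gate rather than the paper's full LSTM-style gating, and all class and layer names are illustrative, not taken from the authors' code.

import torch
import torch.nn as nn


class SimpleSentenceStateSketch(nn.Module):
    """Simplified sketch of parallel word states plus one global state.

    NOT the paper's exact formulation: the real model uses LSTM-style
    cells with several gates; this sketch uses one update gate per state.
    """

    def __init__(self, dim: int, steps: int = 3):
        super().__init__()
        self.steps = steps
        # Each word update sees: left neighbour, self, right neighbour,
        # its original embedding, and the global sentence state.
        self.word_update = nn.Linear(5 * dim, dim)
        self.word_gate = nn.Linear(5 * dim, dim)
        # The global update sees the current global state and a mean
        # summary of all word states.
        self.global_update = nn.Linear(2 * dim, dim)
        self.global_gate = nn.Linear(2 * dim, dim)

    def forward(self, x: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
        # x: (batch, seq_len, dim) word embeddings
        h = x                   # per-word states, updated in parallel
        g = x.mean(dim=1)       # global sentence state: (batch, dim)
        for _ in range(self.steps):
            # Local context: zero-padded left/right neighbour states.
            left = torch.cat([torch.zeros_like(h[:, :1]), h[:, :-1]], dim=1)
            right = torch.cat([h[:, 1:], torch.zeros_like(h[:, :1])], dim=1)
            g_exp = g.unsqueeze(1).expand_as(h)
            ctx = torch.cat([left, h, right, x, g_exp], dim=-1)
            cand = torch.tanh(self.word_update(ctx))
            gate = torch.sigmoid(self.word_gate(ctx))
            h = gate * cand + (1 - gate) * h   # all words updated at once
            # Global state reads a summary of every word state.
            summary = torch.cat([g, h.mean(dim=1)], dim=-1)
            g_cand = torch.tanh(self.global_update(summary))
            g_gate = torch.sigmoid(self.global_gate(summary))
            g = g_gate * g_cand + (1 - g_gate) * g
        return h, g             # word representations, sentence representation


if __name__ == "__main__":
    model = SimpleSentenceStateSketch(dim=128)
    words = torch.randn(2, 10, 128)        # 2 sentences, 10 words each
    word_reps, sent_rep = model(words)     # (2, 10, 128) and (2, 128)

After a few recurrent steps, information can travel between any pair of words via the global state, whereas a BiLSTM must propagate it through every intermediate position; this is the parallel, non-incremental behaviour the abstract contrasts with sequential reading.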

 

DC Field                   Value
dc.contributor.author      Zhang, Yue
dc.contributor.author      Liu, Qi
dc.contributor.author      Song, Linfeng
dc.date.accessioned        2022-11-03T02:21:47Z
dc.date.available          2022-11-03T02:21:47Z
dc.date.issued             2018
dc.identifier.citation     ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers), 2018, v. 1, p. 317-327
dc.identifier.uri          http://hdl.handle.net/10722/321838
dc.description.abstract    Bi-directional LSTMs are a powerful tool for text representation. On the other hand, they have been shown to suffer from various limitations due to their sequential nature. We investigate an alternative LSTM structure for encoding text, which consists of a parallel state for each word. Recurrent steps are used to perform local and global information exchange between words simultaneously, rather than incremental reading of a sequence of words. Results on various classification and sequence labelling benchmarks show that the proposed model has strong representation power, giving highly competitive performance compared to stacked BiLSTM models with similar parameter numbers.
dc.language                eng
dc.relation.ispartof       ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers)
dc.title                   Sentence-state LSTM for text representation
dc.type                    Conference_Paper
dc.description.nature      link_to_subscribed_fulltext
dc.identifier.doi          10.18653/v1/p18-1030
dc.identifier.scopus       eid_2-s2.0-85063075441
dc.identifier.volume       1
dc.identifier.spage        317
dc.identifier.epage        327
dc.identifier.isi          WOS:000493904300030
