Conference Paper: Learning domain representation for multi-domain sentiment classification

Title: Learning domain representation for multi-domain sentiment classification
Authors: Liu, Qi; Zhang, Yue; Liu, Jiangming
Issue Date: 2018
Publisher: Association for Computational Linguistics
Citation: 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL HLT 2018), New Orleans, 1-6 June 2018. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, v. 1, p. 541-550
Abstract: Training data for sentiment analysis are abundant in some domains, yet scarce in others. It is useful to leverage data available across all existing domains to enhance performance on each of them. We investigate this problem by learning domain-specific representations of input sentences using neural networks. In particular, a descriptor vector is learned for representing each domain, which is used to map adversarially trained domain-general Bi-LSTM input representations into domain-specific representations. Based on this model, we further expand the input representation with exemplary domain knowledge, collected by attending over a memory network of domain training data. Results show that our model significantly outperforms existing methods on multi-domain sentiment analysis, giving the best accuracies on two different benchmarks.
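The abstract describes using a learned per-domain descriptor vector to turn domain-general Bi-LSTM outputs into domain-specific representations. As an illustration only (not the paper's exact formulation), one plausible reading is descriptor-as-query attention over token vectors; the shapes, names, and the random stand-in values below are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_tokens, hidden, n_domains = 6, 8, 4

# Stand-ins for domain-general token representations from a shared Bi-LSTM.
H = rng.standard_normal((n_tokens, hidden))
# One learned descriptor vector per domain (random stand-ins here).
descriptors = rng.standard_normal((n_domains, hidden))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def domain_specific(H, d):
    """Attend over token vectors with the domain descriptor as the query,
    yielding a single domain-specific sentence vector."""
    weights = softmax(H @ d)   # (n_tokens,), sums to 1
    return weights @ H         # (hidden,)

s = domain_specific(H, descriptors[1])
print(s.shape)  # (8,)
```

In the full model the same token representations would feed every domain's descriptor, so the shared encoder stays domain-general (reinforced by adversarial training) while the mapping specializes per domain.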
Persistent Identifier: http://hdl.handle.net/10722/321851
ISBN: 9781948087278

DC Field: Value
dc.contributor.author: Liu, Qi
dc.contributor.author: Zhang, Yue
dc.contributor.author: Liu, Jiangming
dc.date.accessioned: 2022-11-03T02:21:52Z
dc.date.available: 2022-11-03T02:21:52Z
dc.date.issued: 2018
dc.identifier.citation: 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL HLT 2018), New Orleans, 1-6 June 2018. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, v. 1, p. 541-550
dc.identifier.isbn: 9781948087278
dc.identifier.uri: http://hdl.handle.net/10722/321851
dc.description.abstract: Training data for sentiment analysis are abundant in some domains, yet scarce in others. It is useful to leverage data available across all existing domains to enhance performance on each of them. We investigate this problem by learning domain-specific representations of input sentences using neural networks. In particular, a descriptor vector is learned for representing each domain, which is used to map adversarially trained domain-general Bi-LSTM input representations into domain-specific representations. Based on this model, we further expand the input representation with exemplary domain knowledge, collected by attending over a memory network of domain training data. Results show that our model significantly outperforms existing methods on multi-domain sentiment analysis, giving the best accuracies on two different benchmarks.
dc.language: eng
dc.publisher: Association for Computational Linguistics
dc.relation.ispartof: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
dc.title: Learning domain representation for multi-domain sentiment classification
dc.type: Conference_Paper
dc.description.nature: link_to_OA_fulltext
dc.identifier.doi: 10.18653/v1/N18-1050
dc.identifier.scopus: eid_2-s2.0-85068320344
dc.identifier.volume: 1
dc.identifier.spage: 541
dc.identifier.epage: 550
dc.publisher.place: Stroudsburg, PA
