Conference Paper: Loss-Balanced task weighting to reduce negative transfer in multi-task learning

Title: Loss-Balanced task weighting to reduce negative transfer in multi-task learning
Authors: Liu, Shengchao; Liang, Yingyu; Gitter, Anthony
Issue Date: 2019
Citation: 33rd AAAI Conference on Artificial Intelligence, AAAI 2019, 31st Innovative Applications of Artificial Intelligence Conference, IAAI 2019 and the 9th AAAI Symposium on Educational Advances in Artificial Intelligence, EAAI 2019, 2019, p. 9977-9978
Abstract: In settings with related prediction tasks, integrated multi-task learning models can often improve performance relative to independent single-task models. However, even when the average task performance improves, individual tasks may experience negative transfer in which the multi-task model's predictions are worse than the single-task model's. We show the prevalence of negative transfer in a computational chemistry case study with 128 tasks and introduce a framework that provides a foundation for reducing negative transfer in multitask models. Our Loss-Balanced Task Weighting approach dynamically updates task weights during model training to control the influence of individual tasks.
Persistent Identifier: http://hdl.handle.net/10722/341278
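The abstract describes dynamically re-weighting per-task losses during training. The sketch below illustrates one common formulation of this idea: weight each task by the ratio of its current loss to its initial loss, raised to a power `alpha`, so tasks whose loss has already dropped sharply are down-weighted. This is a minimal illustration of the general technique; the exact update rule, schedule, and hyperparameters used in the paper are not given in this record, so the function names and the `alpha` parameter here are assumptions for illustration only.

```python
def loss_balanced_weights(current_losses, initial_losses, alpha=0.5):
    """One weight per task: (L_i(t) / L_i(0)) ** alpha (assumed formulation).

    Tasks that have improved a lot (small ratio) get small weights,
    limiting their influence on shared parameters.
    """
    return [
        (cur / init) ** alpha if init > 0 else 1.0
        for cur, init in zip(current_losses, initial_losses)
    ]


def weighted_total_loss(current_losses, initial_losses, alpha=0.5):
    """Scalar training objective: sum of weight_i * loss_i over tasks."""
    weights = loss_balanced_weights(current_losses, initial_losses, alpha)
    return sum(w * l for w, l in zip(weights, current_losses))


# Example: task 0 has improved 4x, task 1 not at all, so task 0 is
# down-weighted (0.25 ** 0.5 == 0.5) while task 1 keeps full weight.
print(loss_balanced_weights([0.25, 2.0], [1.0, 2.0]))  # [0.5, 1.0]
print(weighted_total_loss([0.25, 2.0], [1.0, 2.0]))    # 2.125
```

In a training loop, `initial_losses` would typically be recorded at the start of each epoch and `current_losses` recomputed per batch, so the weights adapt as training progresses.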

 

DC Field: Value
dc.contributor.author: Liu, Shengchao
dc.contributor.author: Liang, Yingyu
dc.contributor.author: Gitter, Anthony
dc.date.accessioned: 2024-03-13T08:41:33Z
dc.date.available: 2024-03-13T08:41:33Z
dc.date.issued: 2019
dc.identifier.citation: 33rd AAAI Conference on Artificial Intelligence, AAAI 2019, 31st Innovative Applications of Artificial Intelligence Conference, IAAI 2019 and the 9th AAAI Symposium on Educational Advances in Artificial Intelligence, EAAI 2019, 2019, p. 9977-9978
dc.identifier.uri: http://hdl.handle.net/10722/341278
dc.description.abstract: In settings with related prediction tasks, integrated multi-task learning models can often improve performance relative to independent single-task models. However, even when the average task performance improves, individual tasks may experience negative transfer in which the multi-task model's predictions are worse than the single-task model's. We show the prevalence of negative transfer in a computational chemistry case study with 128 tasks and introduce a framework that provides a foundation for reducing negative transfer in multitask models. Our Loss-Balanced Task Weighting approach dynamically updates task weights during model training to control the influence of individual tasks.
dc.language: eng
dc.relation.ispartof: 33rd AAAI Conference on Artificial Intelligence, AAAI 2019, 31st Innovative Applications of Artificial Intelligence Conference, IAAI 2019 and the 9th AAAI Symposium on Educational Advances in Artificial Intelligence, EAAI 2019
dc.title: Loss-Balanced task weighting to reduce negative transfer in multi-task learning
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.scopus: eid_2-s2.0-85087287690
dc.identifier.spage: 9977
dc.identifier.epage: 9978
