Conference Paper: TOWARDS FAST ADAPTATION OF NEURAL ARCHITECTURES WITH META LEARNING

Title: TOWARDS FAST ADAPTATION OF NEURAL ARCHITECTURES WITH META LEARNING
Authors: Lian, Dongze; Zheng, Yin; Xu, Yintao; Lu, Yanxiong; Lin, Leyu; Zhao, Peilin; Huang, Junzhou; Gao, Shenghua
Issue Date: 2020
Citation: 8th International Conference on Learning Representations, ICLR 2020, 2020
Abstract: Recently, Neural Architecture Search (NAS) has been successfully applied to multiple areas of artificial intelligence and delivers better performance than hand-designed networks. However, existing NAS methods target a specific task: most of them search an architecture well for a single task but struggle with multiple datasets or multiple tasks. Generally, the architecture for a new task is either searched from scratch, which is neither efficient nor flexible enough for practical applications, or borrowed from one searched on another task, which might not be optimal. To tackle the transferability of NAS and enable fast adaptation of neural architectures, we propose a novel Transferable Neural Architecture Search method based on meta-learning, termed T-NAS. T-NAS learns a meta-architecture that adapts to a new task quickly through a few gradient steps, making the transferred architecture suitable for the specific task. Extensive experiments show that T-NAS achieves state-of-the-art performance in few-shot learning and comparable performance in supervised learning with 50x lower search cost, which demonstrates the effectiveness of our method.
Persistent Identifier: http://hdl.handle.net/10722/345313
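
For intuition, the sketch below illustrates the core idea described in the abstract: DARTS-style architecture mixing weights that are meta-learned MAML-style, so a few inner-loop gradient steps specialize them to a new task. This is not the authors' implementation; the supernet loss, tensor shapes, and all names and hyperparameters (`task_loss`, `inner_lr`, `inner_steps`, and the stand-in data) are illustrative assumptions.

```python
# Illustrative sketch (not the authors' code): meta-learned DARTS-style
# architecture weights adapted to a new task with a few gradient steps,
# in the spirit of MAML. All names and hyperparameters are assumptions.
import torch
import torch.nn.functional as F

num_edges, num_ops = 14, 8        # e.g. a DARTS cell: 14 edges, 8 candidate ops
inner_lr, meta_lr, inner_steps = 0.01, 0.001, 5

# Meta-architecture: one mixing-weight vector per edge, shared across tasks.
meta_alpha = torch.zeros(num_edges, num_ops, requires_grad=True)
meta_opt = torch.optim.Adam([meta_alpha], lr=meta_lr)

def task_loss(alpha, batch):
    # Stand-in for the supernet loss: a real implementation would run a
    # weight-sharing network whose candidate ops are mixed by softmax(alpha).
    x, y = batch
    logits = x * F.softmax(alpha, dim=-1).mean(0)   # fake computation, right shape
    return F.cross_entropy(logits, y)

def adapt(alpha, support, steps=inner_steps):
    # Inner loop: a few gradient steps specialize alpha to one task.
    for _ in range(steps):
        grad, = torch.autograd.grad(task_loss(alpha, support), alpha,
                                    create_graph=True)  # keep graph for meta-grad
        alpha = alpha - inner_lr * grad
    return alpha

# Outer loop over a batch of few-shot tasks (random tensors as stand-in data).
for _ in range(100):
    meta_opt.zero_grad()
    for _task in range(4):
        support = (torch.randn(32, num_ops), torch.randint(0, num_ops, (32,)))
        query = (torch.randn(32, num_ops), torch.randint(0, num_ops, (32,)))
        task_alpha = adapt(meta_alpha, support)    # fast adaptation to the task
        task_loss(task_alpha, query).backward()    # meta-gradient into meta_alpha
    meta_opt.step()
```

The `create_graph=True` flag makes the inner updates differentiable, so the outer optimizer can credit `meta_alpha` for post-adaptation performance (second-order MAML); a first-order variant would drop it.
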

 

DC Field: Value
dc.contributor.author: Lian, Dongze
dc.contributor.author: Zheng, Yin
dc.contributor.author: Xu, Yintao
dc.contributor.author: Lu, Yanxiong
dc.contributor.author: Lin, Leyu
dc.contributor.author: Zhao, Peilin
dc.contributor.author: Huang, Junzhou
dc.contributor.author: Gao, Shenghua
dc.date.accessioned: 2024-08-15T09:26:33Z
dc.date.available: 2024-08-15T09:26:33Z
dc.date.issued: 2020
dc.identifier.citation: 8th International Conference on Learning Representations, ICLR 2020, 2020
dc.identifier.uri: http://hdl.handle.net/10722/345313
dc.description.abstract: Recently, Neural Architecture Search (NAS) has been successfully applied to multiple areas of artificial intelligence and delivers better performance than hand-designed networks. However, existing NAS methods target a specific task: most of them search an architecture well for a single task but struggle with multiple datasets or multiple tasks. Generally, the architecture for a new task is either searched from scratch, which is neither efficient nor flexible enough for practical applications, or borrowed from one searched on another task, which might not be optimal. To tackle the transferability of NAS and enable fast adaptation of neural architectures, we propose a novel Transferable Neural Architecture Search method based on meta-learning, termed T-NAS. T-NAS learns a meta-architecture that adapts to a new task quickly through a few gradient steps, making the transferred architecture suitable for the specific task. Extensive experiments show that T-NAS achieves state-of-the-art performance in few-shot learning and comparable performance in supervised learning with 50x lower search cost, which demonstrates the effectiveness of our method.
dc.language: eng
dc.relation.ispartof: 8th International Conference on Learning Representations, ICLR 2020
dc.title: TOWARDS FAST ADAPTATION OF NEURAL ARCHITECTURES WITH META LEARNING
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.scopus: eid_2-s2.0-85150658364
