Conference Paper: Self-Adaptive In-Context Learning: An Information Compression Perspective for In-Context Example Selection and Ordering
Field | Value
---|---
Title | Self-Adaptive In-Context Learning: An Information Compression Perspective for In-Context Example Selection and Ordering
Authors | Wu, Zhiyong; Wang, Yaoxiang; Ye, Jiacheng; Kong, Lingpeng
Issue Date | 1-Jul-2023
Abstract | Despite the surprising few-shot performance of in-context learning (ICL), it is still common practice to randomly sample examples to serve as context. This paper advocates a new principle for ICL: self-adaptive in-context learning. The self-adaptation mechanism is introduced to help each sample find an in-context example organization (i.e., selection and permutation) that can yield the correct prediction, thus maximizing performance. To validate the effectiveness of self-adaptive ICL, we propose a general select-then-rank framework and instantiate it with new selection and ranking algorithms. Through extensive evaluation on eight different NLP datasets, our self-adaptive ICL method achieves a 40% relative improvement over the common-practice setting. Further analysis reveals the enormous potential of self-adaptive ICL: it might be able to close the gap between ICL and fine-tuning given more advanced algorithms. Our code will be released to facilitate future research.
Persistent Identifier | http://hdl.handle.net/10722/333769
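
The abstract above describes a general select-then-rank framework for organizing in-context examples. Below is a minimal sketch of how such a framework could look. All names (`select_then_rank`, `similarity`, `code_length`) are illustrative assumptions rather than the authors' actual API, and the ranking criterion shown, a compression-style score such as the language model's negative log-likelihood of the candidate prompt, is one plausible instantiation of the paper's information compression perspective.

```python
# Hedged sketch of a select-then-rank framework for in-context example
# organization. Function and parameter names are hypothetical; the
# similarity and code-length scorers are assumed to be supplied by the
# caller (e.g., embedding cosine similarity and LM negative log-likelihood).

import itertools
import random
from typing import Callable, Sequence, Tuple


def select_then_rank(
    query: str,
    pool: Sequence[str],                      # labeled examples, already formatted as text
    similarity: Callable[[str, str], float],  # higher = more similar to the query
    code_length: Callable[[str], float],      # lower = prompt compresses better
    k: int = 4,                               # in-context examples per prompt
    n_candidates: int = 8,                    # permutations to score per query
) -> Tuple[str, ...]:
    """Return the in-context example ordering with the lowest code length."""
    # Select: keep the k pool examples most similar to the query.
    selected = sorted(pool, key=lambda ex: similarity(query, ex), reverse=True)[:k]

    # Rank: score a random subset of permutations of the selected examples
    # and keep the ordering whose full prompt has the lowest code length.
    perms = list(itertools.permutations(selected))
    candidates = random.sample(perms, min(n_candidates, len(perms)))

    def prompt_of(order: Tuple[str, ...]) -> str:
        return "\n\n".join(order) + "\n\n" + query

    return min(candidates, key=lambda order: code_length(prompt_of(order)))
```

In practice, `similarity` might compare sentence embeddings and `code_length` might sum per-token negative log-likelihoods of the candidate prompt under the LM; both are deliberately left abstract here, since the paper's actual selection and ranking algorithms are not reproduced in this record.
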
DC Field | Value | Language |
---|---|---
dc.contributor.author | Wu, Zhiyong | - |
dc.contributor.author | Wang, Yaoxiang | - |
dc.contributor.author | Ye, Jiacheng | - |
dc.contributor.author | Kong, Lingpeng | - |
dc.date.accessioned | 2023-10-06T08:38:55Z | - |
dc.date.available | 2023-10-06T08:38:55Z | - |
dc.date.issued | 2023-07-01 | - |
dc.identifier.uri | http://hdl.handle.net/10722/333769 | - |
dc.description.abstract | Despite the surprising few-shot performance of in-context learning (ICL), it is still common practice to randomly sample examples to serve as context. This paper advocates a new principle for ICL: self-adaptive in-context learning. The self-adaptation mechanism is introduced to help each sample find an in-context example organization (i.e., selection and permutation) that can yield the correct prediction, thus maximizing performance. To validate the effectiveness of self-adaptive ICL, we propose a general select-then-rank framework and instantiate it with new selection and ranking algorithms. Through extensive evaluation on eight different NLP datasets, our self-adaptive ICL method achieves a 40% relative improvement over the common-practice setting. Further analysis reveals the enormous potential of self-adaptive ICL: it might be able to close the gap between ICL and fine-tuning given more advanced algorithms. Our code will be released to facilitate future research. | -
dc.language | eng | - |
dc.relation.ispartof | The 61st Annual Meeting of the Association for Computational Linguistics (9-14 July 2023, Toronto, Canada) | -
dc.title | Self-Adaptive In-Context Learning: An Information Compression Perspective for In-Context Example Selection and Ordering | - |
dc.type | Conference_Paper | - |
dc.identifier.doi | 10.18653/v1/2023.acl-long.79 | - |
dc.identifier.volume | 1 | - |
dc.identifier.spage | 1423 | - |
dc.identifier.epage | 1436 | - |