Conference Paper: Efficient Attention via Control Variates

Title: Efficient Attention via Control Variates
Authors: Zheng, Lin; Yuan, Jianbo; Wang, Chong; Kong, Lingpeng
Issue Date: 1-May-2023
Abstract

Random-feature-based attention (RFA) is an efficient approximation of softmax attention with linear runtime and space complexity. However, the approximation gap between RFA and conventional softmax attention is not well studied. Building on previous progress on RFA, we characterize this gap through the lens of control variates and show that RFA can be decomposed into a sum of multiple control variate estimators, one for each element in the sequence. This new framework reveals that exact softmax attention can be recovered from RFA by manipulating each control variate. Moreover, it allows us to develop a more flexible form of control variates, resulting in a novel attention mechanism that significantly reduces the approximation gap while maintaining linear complexity. Extensive experiments demonstrate that our model outperforms state-of-the-art efficient attention mechanisms on both vision and language tasks.
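
For readers wanting to see what "linear runtime and space complexity" means operationally, below is a minimal NumPy sketch of random-feature attention. It assumes Performer-style positive random features; the feature map phi, all function names, and the omission of 1/sqrt(d) scaling are illustrative assumptions, and the control-variate correction this paper proposes is not implemented here.

```python
import numpy as np

def softmax_attention(Q, K, V):
    """Exact softmax attention: O(n^2) time and memory in sequence length n.
    (The usual 1/sqrt(d) scaling is omitted for clarity.)"""
    scores = Q @ K.T                                   # (n, n) pairwise dot products
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                                 # (n, d)

def random_feature_attention(Q, K, V, num_features=256, seed=0):
    """Random-feature approximation of softmax attention.

    Uses positive random features phi with E[phi(q) . phi(k)] = exp(q . k),
    so attention runs in O(n) time and memory: the key/value summary is
    shared across all queries and no n x n matrix is ever formed.
    (Illustrative sketch only, not the estimator proposed in this paper.)
    """
    d = Q.shape[-1]
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((num_features, d))         # random projections w ~ N(0, I)

    def phi(X):
        # Positive random features for the exponential kernel:
        # phi(x) = exp(W x - ||x||^2 / 2) / sqrt(m)
        sq_norm = 0.5 * (X ** 2).sum(axis=-1, keepdims=True)
        return np.exp(X @ W.T - sq_norm) / np.sqrt(num_features)

    Qf, Kf = phi(Q), phi(K)                            # (n, m) feature maps
    S = Kf.T @ V                                       # (m, d) shared summary
    z = Kf.sum(axis=0)                                 # (m,)  shared normalizer
    return (Qf @ S) / (Qf @ z)[:, None]                # (n, d)
```

The structural point the abstract relies on is that S and z are shared across queries. The paper's analysis decomposes the resulting per-query ratio estimator into one control variate estimator per sequence element (the generic control variate form is f(X) + beta * (E[h(X)] - h(X)), a standard identity rather than the paper's notation), which is what lets it both recover exact softmax attention by manipulating each control variate and design a more flexible correction that keeps this shared, linear-time structure.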


Persistent Identifier: http://hdl.handle.net/10722/333817

DC Field: Value

dc.contributor.author: Zheng, Lin
dc.contributor.author: Yuan, Jianbo
dc.contributor.author: Wang, Chong
dc.contributor.author: Kong, Lingpeng
dc.date.accessioned: 2023-10-06T08:39:19Z
dc.date.available: 2023-10-06T08:39:19Z
dc.date.issued: 2023-05-01
dc.identifier.uri: http://hdl.handle.net/10722/333817
dc.description.abstract: Random-feature-based attention (RFA) is an efficient approximation of softmax attention with linear runtime and space complexity. However, the approximation gap between RFA and conventional softmax attention is not well studied. Building on previous progress on RFA, we characterize this gap through the lens of control variates and show that RFA can be decomposed into a sum of multiple control variate estimators, one for each element in the sequence. This new framework reveals that exact softmax attention can be recovered from RFA by manipulating each control variate. Moreover, it allows us to develop a more flexible form of control variates, resulting in a novel attention mechanism that significantly reduces the approximation gap while maintaining linear complexity. Extensive experiments demonstrate that our model outperforms state-of-the-art efficient attention mechanisms on both vision and language tasks.
dc.language: eng
dc.relation.ispartof: International Conference on Learning Representations (ICLR 2023) (01/05/2023-05/05/2023, Kigali, Rwanda)
dc.title: Efficient Attention via Control Variates
dc.type: Conference_Paper
dc.identifier.doi: 10.48550/arXiv.2302.04542
