Conference Paper: BATMANN: A Binarized-All-Through Memory-Augmented Neural Network for Efficient In-Memory Computing

Title: BATMANN: A Binarized-All-Through Memory-Augmented Neural Network for Efficient In-Memory Computing
Authors: REN, Y; LIN, R; RAN, J; LIU, C; TAO, C; Wang, Z; Li, C; Wong, N
Keywords: RRAM; memory augmented; binary; neural networks; in-memory computing
Issue Date: 2021
Publisher: IEEE. The proceedings web site is located at https://ieeexplore.ieee.org/xpl/conhome/1000054/all-proceedings
Citation: 2021 IEEE 14th International Conference on ASIC (ASICON 2021), Kunming, China, 26-29 October 2021, p. 1-4
Abstract: The traditional von Neumann architecture suffers from heavy data traffic between processing and memory units, which incurs high power consumption and latency. To cope with the booming use of neural networks on edge devices, a promising approach is to perform in-memory computing by exploiting next-generation memristive devices. This work proposes a 2-level resistive random-access memory (RRAM)-based memory-augmented neural network (MANN), named the binarized-all-through MANN (BATMANN), that is end-to-end trainable and allows both the controller and the memory to be seamlessly integrated onto RRAM crossbars. Experiments show the superiority of BATMANN in few-shot learning, with high accuracy and robustness.
Description: Session B3: Computing-in/near-Memory II - no. 0359
Persistent Identifier: http://hdl.handle.net/10722/308265
ISSN: 2162-7541
2020 SCImago Journal Rankings: 0.125
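
The abstract describes a MANN whose controller and key-value memory are both binarized, so that a memory read reduces to a binary dot product, which is the multiply-accumulate operation an RRAM crossbar evaluates in place. The snippet below is a minimal, hypothetical Python/NumPy sketch of that style of binarized key-value lookup in a few-shot episode; it is not the authors' implementation, and the names (binarize, BinaryKeyValueMemory) and the toy data are illustrative assumptions only.

    import numpy as np

    rng = np.random.default_rng(0)

    def binarize(x):
        # Sign binarization to {+1, -1}; illustrative stand-in for a
        # binarized controller's output embedding.
        return np.where(x >= 0, 1, -1).astype(np.int32)

    class BinaryKeyValueMemory:
        # Hypothetical binarized key-value memory: keys are +/-1 vectors,
        # values are class labels, and a read is a nearest-neighbour search
        # by binary dot product (equivalently, minimal Hamming distance).
        def __init__(self):
            self.keys = []
            self.values = []

        def write(self, key, label):
            self.keys.append(binarize(key))
            self.values.append(label)

        def read(self, query):
            q = binarize(query)
            sims = np.stack(self.keys) @ q  # on RRAM: one crossbar read per query
            return self.values[int(np.argmax(sims))]

    # Toy 5-way, 1-shot episode with random stand-in embeddings.
    dim, n_way = 64, 5
    memory = BinaryKeyValueMemory()
    support = rng.standard_normal((n_way, dim))
    for label, embedding in enumerate(support):
        memory.write(embedding, label)

    query = support[3] + 0.3 * rng.standard_normal(dim)  # noisy view of class 3
    print("predicted class:", memory.read(query))  # almost certainly 3 given the small noise

On a 2-level RRAM device, each +/-1 key bit would map to a high- or low-conductance state, so the similarity computation in read() would correspond to a single in-memory crossbar operation rather than a loop over stored keys.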

 

DC Field: Value
dc.contributor.author: REN, Y
dc.contributor.author: LIN, R
dc.contributor.author: RAN, J
dc.contributor.author: LIU, C
dc.contributor.author: TAO, C
dc.contributor.author: Wang, Z
dc.contributor.author: Li, C
dc.contributor.author: Wong, N
dc.date.accessioned: 2021-11-12T13:44:49Z
dc.date.available: 2021-11-12T13:44:49Z
dc.date.issued: 2021
dc.identifier.citation: 2021 IEEE 14th International Conference on ASIC (ASICON 2021), Kunming, China, 26-29 October 2021, p. 1-4
dc.identifier.issn: 2162-7541
dc.identifier.uri: http://hdl.handle.net/10722/308265
dc.description: Session B3 : Computing-in/near-Memory II - no. 0359
dc.description.abstract: The traditional von Neumann architecture suffers from heavy data traffic between processing and memory units, which incurs high power and latency. To cope with the booming use of neural networks on edge devices, a promising way is to perform in-memory computing through exploiting the next-generation memristive devices. This work proposes a 2-level resistive random-access memory (RRAM)-based memory-augmented neural network (MANN), named binarized-all-through MANN (BATMANN), that is end-to-end trainable and allows both the controller and memory to be seamlessly integrated onto RRAM crossbars. Experiments then show the superiority of BATMANN in doing few-shot learning with high accuracy and robustness.
dc.language: eng
dc.publisher: IEEE. The Journal's web site is located at https://ieeexplore.ieee.org/xpl/conhome/1000054/all-proceedings
dc.relation.ispartof: IEEE International Conference on ASIC Proceedings
dc.rights: IEEE International Conference on ASIC Proceedings. Copyright © IEEE.
dc.rights: ©2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
dc.subject: RRAM
dc.subject: memory augmented
dc.subject: binary
dc.subject: neural networks
dc.subject: in-memory computing
dc.title: BATMANN: A Binarized-All-Through Memory-Augmented Neural Network for Efficient In-Memory Computing
dc.type: Conference_Paper
dc.identifier.email: Wang, Z: zrwang@eee.hku.hk
dc.identifier.email: Li, C: canl@hku.hk
dc.identifier.email: Wong, N: nwong@eee.hku.hk
dc.identifier.authority: Wang, Z=rp02714
dc.identifier.authority: Li, C=rp02706
dc.identifier.authority: Wong, N=rp00190
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/ASICON52560.2021.9620292
dc.identifier.scopus: eid_2-s2.0-85122852581
dc.identifier.hkuros: 329309
dc.identifier.spage: 1
dc.identifier.epage: 4
dc.publisher.place: United States
