Conference Paper: Device Variation-Aware Adaptive Quantization for MRAM-based Accurate In-Memory Computing Without On-chip Training

Title: Device Variation-Aware Adaptive Quantization for MRAM-based Accurate In-Memory Computing Without On-chip Training
Authors: Xiao, Zhihua; Naik, Vinayak Bharat; Cheung, Shun Kong; Lim, Jia Hao; Kwon, Jae-Hyun; Ren, Zheyu; Wang, Zhongrui; Shao, Qiming
Issue Date: 3-Dec-2022
Publisher: IEEE
Abstract

Hardware-accelerated artificial intelligence with emerging nonvolatile memory such as spin-transfer torque magnetoresistive random-access memory (STT-MRAM) is pushing both the algorithm and the hardware to their design limits. The restrictions on analog-based in-memory computing (IMC) include device variation, the IR-drop effect due to the low resistance of STT-MRAM, and read disturbance in the memory array at advanced technology nodes. On-chip hybrid training can recover the inference accuracy, but at the cost of many training epochs, reducing the endurance lifetime available for the update cycles needed for on-chip inference. In this work, we show the unique features of device variations in a foundry STT-MRAM array and propose a software-hardware cross-layer co-design scheme for STT-MRAM IMC. By sensing device-level variations, we can leverage them as additional conductance levels to adaptively quantize deep neural networks (DNNs). This device variation-aware adaptive quantization (DVAQ) scheme enables DNN inference accuracy comparable to on-chip hybrid training, without any on-chip training. In addition, the DVAQ scheme greatly reduces IR-drop effects. Overall, DVAQ achieves less than a 1% accuracy drop compared with in-situ training under 40% device variation/noise, without on-chip training, in several DNN applications.
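The core idea the abstract describes — sense device-level variation, keep only conductance levels that remain distinguishable under that variation, then quantize weights onto the surviving levels — can be sketched in a toy form. This is a minimal illustration only: the function names, the 3-sigma separation rule, and the level counts are assumptions for the sketch, not the paper's actual DVAQ algorithm.

```python
import numpy as np

def adaptive_levels(nominal, sigma, k=3.0):
    """Keep only nominal conductance levels separated by more than k*sigma,
    so every retained level stays distinguishable under device variation.
    (Toy variation-aware level selection, not the paper's DVAQ scheme.)"""
    kept = [nominal[0]]
    for g in nominal[1:]:
        if g - kept[-1] > k * sigma:
            kept.append(g)
    return np.array(kept)

def quantize(weights, levels):
    """Map each weight to its nearest usable conductance level."""
    idx = np.abs(weights[:, None] - levels[None, :]).argmin(axis=1)
    return levels[idx]

nominal = np.linspace(0.0, 1.0, 16)            # 16 nominal levels per cell (assumed)
levels = adaptive_levels(nominal, sigma=0.03)  # usable levels at ~3% variation
rng = np.random.default_rng(0)
w = rng.uniform(0.0, 1.0, size=8)              # stand-in for DNN weights
wq = quantize(w, levels)                       # variation-aware quantized weights
```

With the assumed numbers, levels whose nominal spacing falls inside the 3-sigma noise band are merged, trading bit-width for levels that the sensing circuit can still resolve reliably.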


Persistent Identifier: http://hdl.handle.net/10722/340343
ISI Accession Number ID: WOS:000968800700138

 

DC Field | Value | Language
dc.contributor.author | Xiao, Zhihua | -
dc.contributor.author | Naik, Vinayak Bharat | -
dc.contributor.author | Cheung, Shun Kong | -
dc.contributor.author | Lim, Jia Hao | -
dc.contributor.author | Kwon, Jae-Hyun | -
dc.contributor.author | Ren, Zheyu | -
dc.contributor.author | Wang, Zhongrui | -
dc.contributor.author | Shao, Qiming | -
dc.date.accessioned | 2024-03-11T10:43:28Z | -
dc.date.available | 2024-03-11T10:43:28Z | -
dc.date.issued | 2022-12-03 | -
dc.identifier.uri | http://hdl.handle.net/10722/340343 | -
dc.description.abstract | Hardware-accelerated artificial intelligence with emerging nonvolatile memory such as spin-transfer torque magnetoresistive random-access memory (STT-MRAM) is pushing both the algorithm and the hardware to their design limits. The restrictions on analog-based in-memory computing (IMC) include device variation, the IR-drop effect due to the low resistance of STT-MRAM, and read disturbance in the memory array at advanced technology nodes. On-chip hybrid training can recover the inference accuracy, but at the cost of many training epochs, reducing the endurance lifetime available for the update cycles needed for on-chip inference. In this work, we show the unique features of device variations in a foundry STT-MRAM array and propose a software-hardware cross-layer co-design scheme for STT-MRAM IMC. By sensing device-level variations, we can leverage them as additional conductance levels to adaptively quantize deep neural networks (DNNs). This device variation-aware adaptive quantization (DVAQ) scheme enables DNN inference accuracy comparable to on-chip hybrid training, without any on-chip training. In addition, the DVAQ scheme greatly reduces IR-drop effects. Overall, DVAQ achieves less than a 1% accuracy drop compared with in-situ training under 40% device variation/noise, without on-chip training, in several DNN applications. | -
dc.language | eng | -
dc.publisher | IEEE | -
dc.relation.ispartof | 2022 IEEE International Electron Devices Meeting (IEDM) (03/12/2022-07/12/2022, San Francisco, CA, USA) | -
dc.title | Device Variation-Aware Adaptive Quantization for MRAM-based Accurate In-Memory Computing Without On-chip Training | -
dc.type | Conference_Paper | -
dc.identifier.doi | 10.1109/IEDM45625.2022.10019482 | -
dc.identifier.scopus | eid_2-s2.0-85147504160 | -
dc.identifier.volume | 2022-December | -
dc.identifier.spage | 1051 | -
dc.identifier.epage | 1054 | -
dc.identifier.isi | WOS:000968800700138 | -
