
Conference Paper: NoiseZO: RRAM Noise-Driven Zeroth-Order Optimization for Efficient Forward-Only Training

Title: NoiseZO: RRAM Noise-Driven Zeroth-Order Optimization for Efficient Forward-Only Training
Authors: Wang, Shuqi; Liu, Zhengwu; Ding, Chenchen; Zhang, Chen; Wu, Taiqiang; Zhou, Jiajun; Wong, Ngai
Issue Date: 24-Jun-2025
Abstract

Compute-in-memory using emerging resistive random-access memory (RRAM) demonstrates significant potential for building energy-efficient deep neural networks. However, RRAM-based network training faces challenges from computational noise and gradient calculation overhead. In this study, we introduce NoiseZO, a forward-only training framework that leverages intrinsic RRAM noise to estimate gradients via zeroth-order (ZO) optimization. The framework maps neural networks onto dual RRAM arrays, utilizing their inherent write noise as ZO perturbations for training. This enables network updates through only two forward computations. A fine-grained perturbation control strategy is further developed to enhance training accuracy. Extensive experiments on vowel and image datasets, implemented with typical networks, showcase the effectiveness of our framework. Compared to conventional complementary metal-oxide-semiconductor (CMOS) implementations, our approach achieves a 21-fold reduction in energy consumption.
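The core idea in the abstract (estimating gradients from paired noise-perturbed forward passes, rather than backpropagation) follows the general pattern of two-point zeroth-order optimization. The sketch below is not the paper's RRAM implementation: it substitutes software Gaussian noise for the device write noise, and all names (`zo_gradient_estimate`, `loss_fn`, `eps`) are illustrative assumptions.

```python
import numpy as np

def zo_gradient_estimate(loss_fn, theta, eps=1e-2, rng=None):
    """Two-point zeroth-order gradient estimate.

    The random perturbation `delta` stands in for the intrinsic RRAM
    write noise the paper exploits; here it is ordinary Gaussian noise.
    Only two forward computations of loss_fn are needed, mirroring the
    "two forward computations" update described in the abstract.
    """
    rng = np.random.default_rng() if rng is None else rng
    delta = rng.standard_normal(theta.shape)
    # Forward pass 1: positively perturbed weights (theta + eps*delta).
    loss_plus = loss_fn(theta + eps * delta)
    # Forward pass 2: negatively perturbed weights (theta - eps*delta).
    loss_minus = loss_fn(theta - eps * delta)
    # Finite-difference estimate of the directional derivative,
    # projected back along the perturbation direction.
    return (loss_plus - loss_minus) / (2 * eps) * delta

# Toy usage: minimize a quadratic loss with ZO-SGD.
loss = lambda w: float(np.sum(w ** 2))
rng = np.random.default_rng(0)
w = np.ones(4)
for _ in range(500):
    w -= 0.05 * zo_gradient_estimate(loss, w, rng=rng)
```

In expectation the two-point estimate equals the true gradient (up to an O(eps^2) bias), so plain SGD on the estimate drives the toy loss toward zero without ever computing a backward pass.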


Persistent Identifier: http://hdl.handle.net/10722/359699

 

DC Field: Value
dc.contributor.author: Wang, Shuqi
dc.contributor.author: Liu, Zhengwu
dc.contributor.author: Ding, Chenchen
dc.contributor.author: ZHANG, Chen
dc.contributor.author: Wu, Taiqiang
dc.contributor.author: Zhou, Jiajun
dc.contributor.author: Wong, Ngai
dc.date.accessioned: 2025-09-10T00:30:54Z
dc.date.available: 2025-09-10T00:30:54Z
dc.date.issued: 2025-06-24
dc.identifier.uri: http://hdl.handle.net/10722/359699
dc.description.abstract: Compute-in-memory using emerging resistive random-access memory (RRAM) demonstrates significant potential for building energy-efficient deep neural networks. However, RRAM-based network training faces challenges from computational noise and gradient calculation overhead. In this study, we introduce NoiseZO, a forward-only training framework that leverages intrinsic RRAM noise to estimate gradients via zeroth-order (ZO) optimization. The framework maps neural networks onto dual RRAM arrays, utilizing their inherent write noise as ZO perturbations for training. This enables network updates through only two forward computations. A fine-grained perturbation control strategy is further developed to enhance training accuracy. Extensive experiments on vowel and image datasets, implemented with typical networks, showcase the effectiveness of our framework. Compared to conventional complementary metal-oxide-semiconductor (CMOS) implementations, our approach achieves a 21-fold reduction in energy consumption.
dc.language: eng
dc.relation.ispartof: ACM/IEEE Design Automation Conference (DAC) (09/07/2023-13/07/2023, San Francisco)
dc.title: NoiseZO: RRAM Noise-Driven Zeroth-Order Optimization for Efficient Forward-Only Training
dc.type: Conference_Paper
