Conference Paper: NoiseZO: RRAM Noise-Driven Zeroth-Order Optimization for Efficient Forward-Only Training
| Title | NoiseZO: RRAM Noise-Driven Zeroth-Order Optimization for Efficient Forward-Only Training |
|---|---|
| Authors | Wang, Shuqi; Liu, Zhengwu; Ding, Chenchen; Zhang, Chen; Wu, Taiqiang; Zhou, Jiajun; Wong, Ngai |
| Issue Date | 24-Jun-2025 |
| Abstract | Compute-in-memory using emerging resistive random-access memory (RRAM) demonstrates significant potential for building energy-efficient deep neural networks. However, RRAM-based network training faces challenges from computational noise and gradient calculation overhead. In this study, we introduce NoiseZO, a forward-only training framework that leverages intrinsic RRAM noise to estimate gradients via zeroth-order (ZO) optimization. The framework maps neural networks onto dual RRAM arrays, utilizing their inherent write noise as ZO perturbations for training. This enables network updates through only two forward computations. A fine-grained perturbation control strategy is further developed to enhance training accuracy. Extensive experiments on vowel and image datasets, implemented with typical networks, showcase the effectiveness of our framework. Compared to conventional complementary metal-oxide-semiconductor (CMOS) implementations, our approach achieves a 21-fold reduction in energy consumption. |
| Persistent Identifier | http://hdl.handle.net/10722/359699 |
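The abstract describes estimating gradients from just two forward computations, with intrinsic RRAM write noise supplying the zeroth-order (ZO) perturbation. Below is a minimal sketch of such a two-forward-pass ZO update, assuming an SPSA-style central-difference estimator on a toy regression loss: the Gaussian `delta` stands in for the write noise, and the `w + delta` / `w - delta` weight copies play the role of the dual RRAM arrays. All names here (`loss_fn`, `zo_step`, `sigma`, `lr`) are illustrative and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_fn(w, x, y):
    # Toy regression loss standing in for a network's forward pass.
    return np.mean((x @ w - y) ** 2)

def zo_step(w, x, y, sigma=0.01, lr=0.01):
    # One Gaussian perturbation per step, modeling intrinsic RRAM write noise.
    delta = rng.normal(0.0, sigma, size=w.shape)
    # Two forward computations on the perturbed weight copies ("dual arrays").
    loss_plus = loss_fn(w + delta, x, y)
    loss_minus = loss_fn(w - delta, x, y)
    # Central-difference ZO gradient estimate along the noise direction:
    # unbiased for this quadratic loss; approximates the Gaussian-smoothed
    # gradient for general losses.
    g = (loss_plus - loss_minus) / (2.0 * sigma ** 2) * delta
    return w - lr * g

# Tiny synthetic problem to exercise the update.
x = rng.normal(size=(64, 8))
w_true = rng.normal(size=8)
y = x @ w_true
w = np.zeros(8)
for _ in range(500):
    w = zo_step(w, x, y)
print("final loss:", loss_fn(w, x, y))
```

Note that no backward pass appears anywhere in the loop: each update needs only the two scalar losses and the perturbation itself, which is what makes noise-driven forward-only training attractive on in-memory hardware.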
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Wang, Shuqi | - |
| dc.contributor.author | Liu, Zhengwu | - |
| dc.contributor.author | Ding, Chenchen | - |
| dc.contributor.author | Zhang, Chen | - |
| dc.contributor.author | Wu, Taiqiang | - |
| dc.contributor.author | Zhou, Jiajun | - |
| dc.contributor.author | Wong, Ngai | - |
| dc.date.accessioned | 2025-09-10T00:30:54Z | - |
| dc.date.available | 2025-09-10T00:30:54Z | - |
| dc.date.issued | 2025-06-24 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/359699 | - |
| dc.description.abstract | Compute-in-memory using emerging resistive random-access memory (RRAM) demonstrates significant potential for building energy-efficient deep neural networks. However, RRAM-based network training faces challenges from computational noise and gradient calculation overhead. In this study, we introduce NoiseZO, a forward-only training framework that leverages intrinsic RRAM noise to estimate gradients via zeroth-order (ZO) optimization. The framework maps neural networks onto dual RRAM arrays, utilizing their inherent write noise as ZO perturbations for training. This enables network updates through only two forward computations. A fine-grained perturbation control strategy is further developed to enhance training accuracy. Extensive experiments on vowel and image datasets, implemented with typical networks, showcase the effectiveness of our framework. Compared to conventional complementary metal-oxide-semiconductor (CMOS) implementations, our approach achieves a 21-fold reduction in energy consumption. | - |
| dc.language | eng | - |
| dc.relation.ispartof | ACM/IEEE Design Automation Conference (DAC) (June 2025, San Francisco) | - |
| dc.title | NoiseZO: RRAM Noise-Driven Zeroth-Order Optimization for Efficient Forward-Only Training | - |
| dc.type | Conference_Paper | - |
