Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1002/adfm.202100042
- Scopus: eid_2-s2.0-85104332960
- WOS: WOS:000640921600001
Article: One Transistor One Electrolyte‐Gated Transistor Based Spiking Neural Network for Power‐Efficient Neuromorphic Computing System
Title | One Transistor One Electrolyte‐Gated Transistor Based Spiking Neural Network for Power‐Efficient Neuromorphic Computing System |
---|---|
Authors | Li, Y; Xuan, Z; Lu, J; Wang, Z; Zhang, X; Wu, Z; Wang, Y; Xu, H; Dou, C; Kang, Y; Liu, Q; Lv, H; Shang, D |
Keywords | associative memory electrolyte-gated transistors ion intercalation neuromorphic computing spiking neural networks |
Issue Date | 2021 |
Publisher | Wiley-VCH Verlag GmbH & Co KGaA. The Journal's web site is located at http://www.wiley-vch.de/home/afm |
Citation | Advanced Functional Materials, 2021, v. 31 n. 26, p. article no. 2100042 |
Abstract | Neuromorphic computing powered by spiking neural networks (SNNs) provides a powerful and efficient information-processing paradigm. To harvest the advantages of SNNs, compact and low-power synapses that can reliably implement local learning rules are required, posing significant challenges to the conventional silicon-based platform in terms of area and energy efficiency as well as computing throughput. Here, electrolyte-gated transistors (EGTs) paired with conventional transistors are employed to implement power-efficient neuromorphic computing systems. The one-transistor-one-EGT (1T1E) synapse not only alleviates the self-discharging of the EGT but also provides a flexible and efficient way to implement the important spike-timing-dependent plasticity (STDP) learning rule. On this basis, an SNN with a temporal coding scheme is implemented for associative memory that can learn and recover images of handwritten digits with high robustness. Thanks to the temporal coding scheme and the low operating current of EGTs, the energy consumption of the 1T1E-based SNN is ≈30× lower than that of the prevalent rate coding scheme, and the peak performance is estimated to be 2 pJ/SOP (picojoules per synaptic operation) in the training phase and 80 TOPS W−1 (tera operations per second per watt) in the inference phase, respectively. These results pave the way for power-efficient neuromorphic computing systems with wide applications in edge computing. |
Persistent Identifier | http://hdl.handle.net/10722/306223 |
ISSN | 1616-301X (2023 Impact Factor: 18.5; 2023 SCImago Journal Rankings: 5.496) |
ISI Accession Number ID | WOS:000640921600001 |
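The abstract's key mechanism, spike-timing-dependent plasticity, can be illustrated with a minimal sketch. This is a generic pair-based STDP rule, not the paper's 1T1E pulse scheme (which is not reproduced in this record); the amplitudes `A_PLUS`/`A_MINUS` and time constant `TAU` are assumed values for illustration only.

```python
import math

# Illustrative pair-based STDP rule. The constants below are assumptions,
# not parameters from the paper.
A_PLUS, A_MINUS = 0.01, 0.012   # assumed potentiation/depression amplitudes
TAU = 20e-3                     # assumed plasticity time constant (seconds)

def stdp_dw(t_pre, t_post):
    """Weight change for a single pre/post spike pair (times in seconds)."""
    dt = t_post - t_pre
    if dt >= 0:
        # Pre-synaptic spike before post-synaptic spike: potentiation,
        # decaying exponentially with the spike-time difference.
        return A_PLUS * math.exp(-dt / TAU)
    # Post before pre: depression.
    return -A_MINUS * math.exp(dt / TAU)

# Causal pairing strengthens the synapse; anti-causal pairing weakens it.
print(stdp_dw(0.0, 0.005) > 0)   # causal pair  -> positive weight change
print(stdp_dw(0.005, 0.0) < 0)   # anti-causal  -> negative weight change
```

In the paper's hardware, the analogous behavior is realized by overlapping voltage pulses on the 1T1E synapse rather than by an explicit software update.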
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Li, Y | - |
dc.contributor.author | Xuan, Z | - |
dc.contributor.author | Lu, J | - |
dc.contributor.author | Wang, Z | - |
dc.contributor.author | Zhang, X | - |
dc.contributor.author | Wu, Z | - |
dc.contributor.author | Wang, Y | - |
dc.contributor.author | Xu, H | - |
dc.contributor.author | Dou, C | - |
dc.contributor.author | Kang, Y | - |
dc.contributor.author | Liu, Q | - |
dc.contributor.author | Lv, H | - |
dc.contributor.author | Shang, D | - |
dc.date.accessioned | 2021-10-20T10:20:33Z | - |
dc.date.available | 2021-10-20T10:20:33Z | - |
dc.date.issued | 2021 | - |
dc.identifier.citation | Advanced Functional Materials, 2021, v. 31 n. 26, p. article no. 2100042 | - |
dc.identifier.issn | 1616-301X | - |
dc.identifier.uri | http://hdl.handle.net/10722/306223 | - |
dc.description.abstract | Neuromorphic computing powered by spiking neural networks (SNNs) provides a powerful and efficient information-processing paradigm. To harvest the advantages of SNNs, compact and low-power synapses that can reliably implement local learning rules are required, posing significant challenges to the conventional silicon-based platform in terms of area and energy efficiency as well as computing throughput. Here, electrolyte-gated transistors (EGTs) paired with conventional transistors are employed to implement power-efficient neuromorphic computing systems. The one-transistor-one-EGT (1T1E) synapse not only alleviates the self-discharging of the EGT but also provides a flexible and efficient way to implement the important spike-timing-dependent plasticity (STDP) learning rule. On this basis, an SNN with a temporal coding scheme is implemented for associative memory that can learn and recover images of handwritten digits with high robustness. Thanks to the temporal coding scheme and the low operating current of EGTs, the energy consumption of the 1T1E-based SNN is ≈30× lower than that of the prevalent rate coding scheme, and the peak performance is estimated to be 2 pJ/SOP (picojoules per synaptic operation) in the training phase and 80 TOPS W−1 (tera operations per second per watt) in the inference phase, respectively. These results pave the way for power-efficient neuromorphic computing systems with wide applications in edge computing. | - |
dc.language | eng | - |
dc.publisher | Wiley-VCH Verlag GmbH & Co KGaA. The Journal's web site is located at http://www.wiley-vch.de/home/afm | - |
dc.relation.ispartof | Advanced Functional Materials | - |
dc.rights | Submitted (preprint) Version: This is the pre-peer-reviewed version of the following article: [FULL CITE], which has been published in final form at [Link to final article using the DOI]. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Use of Self-Archived Versions. Accepted (peer-reviewed) Version: This is the peer-reviewed version of the following article: [FULL CITE], which has been published in final form at [Link to final article using the DOI]. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Use of Self-Archived Versions. | - |
dc.subject | associative memory | - |
dc.subject | electrolyte-gated transistors | - |
dc.subject | ion intercalation | - |
dc.subject | neuromorphic computing | - |
dc.subject | spiking neural networks | - |
dc.title | One Transistor One Electrolyte‐Gated Transistor Based Spiking Neural Network for Power‐Efficient Neuromorphic Computing System | - |
dc.type | Article | - |
dc.identifier.email | Wang, Z: zrwang@eee.hku.hk | - |
dc.identifier.authority | Wang, Z=rp02714 | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1002/adfm.202100042 | - |
dc.identifier.scopus | eid_2-s2.0-85104332960 | - |
dc.identifier.hkuros | 327772 | - |
dc.identifier.volume | 31 | - |
dc.identifier.issue | 26 | - |
dc.identifier.spage | article no. 2100042 | - |
dc.identifier.epage | article no. 2100042 | - |
dc.identifier.isi | WOS:000640921600001 | - |
dc.publisher.place | Germany | - |