Book Chapter: Stochastic Emerging Resistive Memories for Unconventional Computing

Title: Stochastic Emerging Resistive Memories for Unconventional Computing
Authors: Wang, Dingchen; Shi, Shuhui; Zhang, Yi; Shang, Dashan; Wang, Qing; Yu, Hongyu; Wang, Zhongrui
Issue Date: 9-Oct-2023
Publisher: Royal Society of Chemistry
Abstract

Stochasticity plays a critical role in biological neural systems and has inspired a variety of statistical learning approaches. However, conventional digital electronics built on silicon-based transistors practices deterministic Boolean logic, making it less suitable for solving problems that involve stochasticity. This limitation is further intensified by the von Neumann bottleneck of digital systems and the slowdown of Moore’s law. Emerging resistive memories, such as those based on redox reactions and phase transitions, feature intrinsic stochasticity arising from their underlying physical mechanisms. In addition, such devices integrate storage and computing functions, much like the brain, and their simple, low-cost structures endow them with superior scalability and stackability. In this chapter, we survey the broad spectrum of unconventional computing applications of stochastic emerging resistive memories (RMs), from their physical origins to system-level applications. Firstly, we review the mainstream resistive memories and the origins of stochasticity in both programming and charge transport. Secondly, we explore how the stochasticity of RMs benefits bio-inspired computing, including artificial neural networks, spiking neural networks, and reservoir computing. Thirdly, we discuss how stochasticity benefits energy-based networks, such as Hopfield networks, in solving optimization problems. Fourthly, we survey applications to cybersecurity, including how cycle-to-cycle (C2C) variation is leveraged for random number generation and how device-to-device (D2D) variation contributes to hardware identities. Last but not least, we introduce RM-based probability bit generation and bit-stream decorrelation for probabilistic computing, with applications to Bayesian neural networks and Markov chain Monte Carlo algorithms.
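
The abstract mentions RM-based probability bit generation and bit-stream decorrelation. As a rough illustration only, the following Python sketch models a voltage-tunable "probability bit" with a sigmoidal switching probability (an assumption commonly used as a simplification, not a model taken from the chapter) and shows XOR-based whitening of two nominally independent bitstreams. All names and parameter values (switching_probability, v_half, slope) are illustrative, not from the source.

```python
# Toy sketch of a probability bit (p-bit) inspired by stochastic resistive switching.
# Assumption: the per-pulse switching probability follows a sigmoid of the
# programming voltage; the parameter values below are arbitrary.
import numpy as np

rng = np.random.default_rng(seed=0)

def switching_probability(v, v_half=0.5, slope=10.0):
    """Probability that a single programming pulse of amplitude v switches the cell."""
    return 1.0 / (1.0 + np.exp(-slope * (v - v_half)))

def p_bit_stream(v, n_bits):
    """Sample a Bernoulli bitstream whose bias is tuned by the applied voltage."""
    p = switching_probability(v)
    return (rng.random(n_bits) < p).astype(np.uint8)

# Two nominally independent cells biased slightly above the half-switching voltage,
# so each raw stream carries a small bias toward 1.
stream_a = p_bit_stream(0.52, 100_000)
stream_b = p_bit_stream(0.52, 100_000)

# XOR-ing two independent streams suppresses residual bias, a standard
# whitening/decorrelation step for hardware random-bit sources.
decorrelated = stream_a ^ stream_b

print("P(1), raw stream A:", stream_a.mean())
print("P(1), after XOR:   ", decorrelated.mean())
```

In hardware, the two streams would come from physically distinct cells, so D2D and C2C variations would determine how far each raw bias sits from the target probability before whitening.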


Persistent Identifier: http://hdl.handle.net/10722/341654
ISBN: 9781839165696

DC Field: Value
dc.contributor.author: Wang, Dingchen
dc.contributor.author: Shi, Shuhui
dc.contributor.author: Zhang, Yi
dc.contributor.author: Shang, Dashan
dc.contributor.author: Wang, Qing
dc.contributor.author: Yu, Hongyu
dc.contributor.author: Wang, Zhongrui
dc.date.accessioned: 2024-03-20T06:58:03Z
dc.date.available: 2024-03-20T06:58:03Z
dc.date.issued: 2023-10-09
dc.identifier.isbn: 9781839165696
dc.identifier.uri: http://hdl.handle.net/10722/341654
dc.description.abstract: Stochasticity plays a critical role in biological neural systems and has inspired a variety of statistical learning approaches. However, conventional digital electronics built on silicon-based transistors practices deterministic Boolean logic, making it less suitable for solving problems that involve stochasticity. This limitation is further intensified by the von Neumann bottleneck of digital systems and the slowdown of Moore’s law. Emerging resistive memories, such as those based on redox reactions and phase transitions, feature intrinsic stochasticity arising from their underlying physical mechanisms. In addition, such devices integrate storage and computing functions, much like the brain, and their simple, low-cost structures endow them with superior scalability and stackability. In this chapter, we survey the broad spectrum of unconventional computing applications of stochastic emerging resistive memories (RMs), from their physical origins to system-level applications. Firstly, we review the mainstream resistive memories and the origins of stochasticity in both programming and charge transport. Secondly, we explore how the stochasticity of RMs benefits bio-inspired computing, including artificial neural networks, spiking neural networks, and reservoir computing. Thirdly, we discuss how stochasticity benefits energy-based networks, such as Hopfield networks, in solving optimization problems. Fourthly, we survey applications to cybersecurity, including how cycle-to-cycle (C2C) variation is leveraged for random number generation and how device-to-device (D2D) variation contributes to hardware identities. Last but not least, we introduce RM-based probability bit generation and bit-stream decorrelation for probabilistic computing, with applications to Bayesian neural networks and Markov chain Monte Carlo algorithms.
dc.language: eng
dc.publisher: Royal Society of Chemistry
dc.relation.ispartof: Advanced Memory Technology: Functional Materials and Devices
dc.title: Stochastic Emerging Resistive Memories for Unconventional Computing
dc.type: Book_Chapter
dc.identifier.doi: 10.1039/BK9781839169946-00240
dc.identifier.eisbn: 9781839169946
