With the slowing of Moore’s law and fundamental limitations imposed by the von Neumann bottleneck, continued improvements in computing hardware performance are becoming increasingly challenging. Resistive switching (RS) devices are being extensively studied as promising candidates for next-generation memory and computing applications due to their fast switching speed, excellent endurance and retention, and scaling and three-dimensional (3D) stacking capability. In particular, RS devices offer the potential to natively emulate the functions and structures of synapses and neurons, allowing them to efficiently implement neural networks (NNs) and other in-memory computing systems for data-intensive applications such as machine learning tasks. In this review, we examine the mechanisms of RS effects and discuss recent progress in the application of RS devices for memory, deep learning accelerators, and more faithful brain-inspired computing tasks. Challenges and possible solutions at the device, algorithm, and system levels are also discussed.
The authors thank Dr. M. A. Zidan and J. Moon for insightful discussions. This work was supported in part by the National Science Foundation through awards CCF-1900675 and DMR-1810119. W. D. L. would like to thank Charlie for his tremendous support during his stay as a postdoc in the Lieber group from 2003 to 2005, and for his advice that led to the conception of the initial concept of metal-ion-based RS devices.
Reprint and permission requests may be directed to the editorial office.