Toward Software-Equivalent Accuracy on Transformer-Based Deep Neural Networks With Analog Memory Devices
Recent advances in deep learning have been driven by ever-increasing model sizes, with networks growing to millions or even billions of parameters. Such enormous models call for fast and energy-efficient hardware accelerators. We study the potential of Analog AI accelerators based on Non-Volatile Memory...
Main Authors: Katie Spoon, Hsinyu Tsai, An Chen, Malte J. Rasch, Stefano Ambrogio, Charles Mackin, Andrea Fasoli, Alexander M. Friz, Pritish Narayanan, Milos Stanisavljevic, Geoffrey W. Burr
Format: Article
Language: English
Published: Frontiers Media S.A., 2021-07-01
Series: Frontiers in Computational Neuroscience
Online Access: https://www.frontiersin.org/articles/10.3389/fncom.2021.675741/full
Similar Items
- ACIMS: Analog CIM Simulator for DNN Resilience
  by: Dong Ding, et al.
  Published: (2021-03-01)
- Endurance and Retention Degradation of Intermediate Levels in Filamentary Analog RRAM
  by: Meiran Zhao, et al.
  Published: (2019-01-01)
- Study of the Variability of PCM and OxRAM Technologies for Their Use as Synapses in Neuromorphic Systems
  by: Garbin, Daniele
  Published: (2015)
- Analog Vector-Matrix Multiplier Based on Programmable Current Mirrors for Neural Network Integrated Circuits
  by: Maksym Paliy, et al.
  Published: (2020-01-01)
- Training LSTM Networks With Resistive Cross-Point Devices
  by: Tayfun Gokmen, et al.
  Published: (2018-10-01)