Abu Sebastian
IEDM 2023
Analog Non-Volatile Memory-based accelerators offer high-throughput and energy-efficient Multiply-Accumulate operations for the large Fully-Connected layers that dominate Transformer-based Large Language Models. We describe architectural, wafer-scale testing, chip-demo, and hardware-aware training efforts towards such accelerators, and quantify the unique raw-throughput and latency benefits of Fully- (rather than Partially-) Weight-Stationary systems.
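A fully weight-stationary system keeps every fully-connected weight matrix resident in the analog memory, so a matrix-vector multiply completes in one pass with no weight movement, whereas a partially weight-stationary design must repeatedly reload weight tiles. A minimal sketch of that contrast, in pure Python with illustrative names and a toy reload-count cost model (an assumption for exposition, not the authors' implementation):

```python
# Sketch: fully vs. partially weight-stationary matrix-vector multiply.
# Function names and the reload-count cost model are illustrative assumptions.

def matvec(weights, x):
    """One in-memory MAC pass: y = W @ x over a resident weight tile."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in weights]

def fully_weight_stationary(W, x):
    """All weights stay on-chip: a single pass, zero weight reloads."""
    return matvec(W, x), 0  # (result, number of weight reloads)

def partially_weight_stationary(W, x, tile_rows=2):
    """Only tile_rows rows fit at once: each tile must be (re)loaded."""
    y, reloads = [], 0
    for i in range(0, len(W), tile_rows):
        reloads += 1                        # load the next weight tile
        y += matvec(W[i:i + tile_rows], x)
    return y, reloads

W = [[1, 2], [3, 4], [5, 6], [7, 8]]
x = [1, 1]
y_full, r_full = fully_weight_stationary(W, x)
y_part, r_part = partially_weight_stationary(W, x)
assert y_full == y_part            # same math either way
print(r_full, r_part)              # 0 reloads vs. one per tile
```

Both paths produce identical results; the latency and throughput gap quantified in the paper comes from the weight-reload traffic, which the fully weight-stationary mapping eliminates entirely.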
Gentiana Rashiti, Kumudu Geethan Karunaratne, et al.
ECAI 2024
Karthik Swaminathan, Martin Cochet, et al.
ISCA 2025
Corey Lammie, Yuxuan Wang, et al.
IEEE TETC