Mixed-Signal Artificial Neural Network Hardware Accelerators
Artificial Neural Networks (ANNs) have achieved remarkable success across various machine learning tasks, including natural language processing, speech recognition, and image classification. However, ANN hardware accelerators typically depend on highly parallel multiply-and-accumulate (MAC) operations, which generate substantial intermediate data. In conventional von Neumann architectures, frequent data transfers between the processing elements and memory result in significant energy inefficiency and latency. These challenges are particularly pronounced in complex models with high bit precision, restricting their deployment in edge devices.
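To make the scale of the MAC workload concrete, the minimal NumPy sketch below (with hypothetical layer dimensions, not taken from the references) spells out a single fully connected layer as explicit multiply-and-accumulate steps and counts them; each of these intermediate products is data that a von Neumann accelerator must shuttle between memory and the processing elements.

```python
import numpy as np

# Hypothetical fully connected layer: 256 inputs -> 128 outputs.
n_in, n_out = 256, 128
x = np.random.randn(n_in).astype(np.float32)          # input activations
W = np.random.randn(n_out, n_in).astype(np.float32)   # weight matrix

# Each output neuron is a dot product, i.e. n_in multiply-and-accumulate steps.
y = np.zeros(n_out, dtype=np.float32)
for i in range(n_out):
    acc = 0.0
    for j in range(n_in):
        acc += W[i, j] * x[j]   # one MAC operation
    y[i] = acc

print("MAC operations for this single layer:", n_out * n_in)  # 32768
```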
Analog/mixed-signal (AMS) computing has emerged as a promising alternative for implementing ANNs. Analog computation offers higher energy efficiency than digital computation in low signal-to-noise ratio (SNR) scenarios. Furthermore, many MAC-based workloads require only modest precision, making AMS an attractive option for edge AI solutions.
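As a rough illustration of why modest precision often suffices, the sketch below (quantization scheme, bit width, and noise level are assumptions for illustration, not taken from the cited works) quantizes weights and activations to 4 bits and adds Gaussian noise to emulate an imperfect analog MAC; the result typically stays close to the full-precision dot product.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(256)   # activations
w = rng.standard_normal(256)   # weights

def quantize(v, bits=4):
    # Uniform symmetric quantization to the given bit width (assumed scheme).
    levels = 2 ** (bits - 1) - 1
    scale = np.max(np.abs(v)) / levels
    return np.round(v / scale) * scale

# Ideal full-precision MAC result.
ref = np.dot(w, x)

# 4-bit operands plus additive Gaussian noise emulating analog compute error.
noisy = np.dot(quantize(w), quantize(x)) + rng.normal(0.0, 0.05 * abs(ref))

print(f"full precision: {ref:.3f}, 4-bit noisy analog model: {noisy:.3f}")
```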

- J. Mattar, M. M. Dahan, S. Dünkel, H. Mulaosmanovic, S. Beyer, E. Yalon, and N. Wainstein, “Reconfigurable Time-Domain In-Memory Computing Macro using CAM FeFET with Multilevel Delay Calibration in 28 nm CMOS,” arXiv preprint, 2025.
- K. Stern, N. Wainstein, Y. Keller, C. Neumann, E. Pop, S. Kvatinsky, and E. Yalon, “Sub-Nanosecond Pulses Enable Partial Reset for Analog Phase Change Memory,” IEEE Electron Device Letters, vol. 42, no. 9, pp. 1291-1294, September 2021. (Paper)
- L. Danial, E. Pikhay, E. Herbelin, N. Wainstein, V. Gupta, N. Wald, Y. Roizin, R. Daniel, and S. Kvatinsky, “Two-terminal floating-gate transistors with a low-power memristive operation mode for analogue neuromorphic computing,” Nature Electronics, vol. 2, pp. 596-605, December 2019. (Paper)