Publication in conference proceedings
Dendrite-inspired computing to improve resilience of neural networks to faults in emerging memory technologies
Lizy K. John (John, L. K.); Felipe Maia Galvão França (França, F. M. G.); Subhashish Mitra (Mitra, S.); Zachary Susskind (Susskind, Z.); Priscila M. V. Lima (Lima, P. M. V.); Igor D. S. Miranda (Miranda, I. D. S.); Eugene B. John (John, E. B.); Diego L. C. Dutra (Dutra, D. L. C.); Maurício Breternitz (Breternitz Jr., M.); et al.
2023 IEEE International Conference on Rebooting Computing (ICRC)
Year (definitive publication)
2023
Language
English
Country
United States of America
More Information
Web of Science®: this publication is not indexed in Web of Science®
Scopus: Times Cited: 0 (last checked: 2024-05-08 18:01)
Google Scholar: Times Cited: 0 (last checked: 2024-05-09 17:31)

Abstract
Mimicking biological neurons by focusing on the excitatory/inhibitory decoding performed by dendritic trees offers an intriguing alternative to the traditional integrate-and-fire McCulloch-Pitts neuron stylization. Weightless Neural Networks (WNNs), which rely on value lookups from tables, emulate the integration process in dendrites and have demonstrated notable advantages in energy efficiency. In this paper, we examine the WNN paradigm from the perspective of reliability and fault tolerance. Through a series of fault injection experiments, we show that WNNs exhibit remarkable resilience to both transient (soft) errors and permanent faults. Notably, WNN models experience minimal deterioration in accuracy even when subjected to fault rates of up to 5%. This resilience makes them well suited for implementation in emerging memory technologies with binary or multiple-bits-per-cell storage and reduced reliance on memory block-level error-resilience features. By offering a novel perspective on neural network modeling and highlighting the robustness of WNNs, this research contributes to the broader understanding of fault tolerance in neural networks, particularly in the context of emerging memory technologies.
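The lookup-table view of neural computation described in the abstract can be illustrated with a toy WiSARD-style weightless discriminator plus a crude fault-injection step. This is a minimal sketch under assumptions of our own (two-class patterns, tuple size 4, a 5% bit-flip fault model applied only to written RAM entries); the class and function names are illustrative and do not come from the paper.

```python
import random

class RamDiscriminator:
    """Minimal WiSARD-style weightless discriminator: a set of RAM nodes,
    each addressed by a fixed tuple of input bits (a toy stand-in for a
    dendritic table lookup)."""
    def __init__(self, input_bits, tuple_size, seed=0):
        rng = random.Random(seed)
        order = list(range(input_bits))
        rng.shuffle(order)
        # Partition the shuffled input bit indices into one address tuple per RAM node.
        self.tuples = [order[i:i + tuple_size]
                       for i in range(0, input_bits, tuple_size)]
        self.rams = [dict() for _ in self.tuples]  # sparse RAM contents

    def _addresses(self, x):
        for idxs, ram in zip(self.tuples, self.rams):
            yield ram, tuple(x[i] for i in idxs)

    def train(self, x):
        # Training just writes a 1 at each addressed RAM location.
        for ram, addr in self._addresses(x):
            ram[addr] = 1

    def score(self, x):
        # Score = number of RAM nodes whose stored entry matches this input.
        return sum(ram.get(addr, 0) for ram, addr in self._addresses(x))

def inject_faults(disc, rate, rng):
    """Flip each *written* RAM bit with probability `rate` (a simplified
    model of permanent faults; unwritten cells are left untouched here)."""
    for ram in disc.rams:
        for addr in list(ram):
            if rng.random() < rate:
                ram[addr] ^= 1

# Toy experiment: discriminate noisy all-zeros patterns from noisy all-ones.
rng = random.Random(42)
n = 64

def noisy(bit, flips=4):
    x = [bit] * n
    for i in rng.sample(range(n), flips):
        x[i] ^= 1
    return x

d0 = RamDiscriminator(n, 4, seed=1)
d1 = RamDiscriminator(n, 4, seed=1)
for _ in range(50):
    d0.train(noisy(0))
    d1.train(noisy(1))

def accuracy():
    correct = 0
    for label in (0, 1) * 25:
        x = noisy(label)
        pred = 0 if d0.score(x) >= d1.score(x) else 1
        correct += (pred == label)
    return correct / 50

acc_clean = accuracy()
inject_faults(d0, 0.05, rng)   # 5% fault rate, as in the abstract
inject_faults(d1, 0.05, rng)
acc_faulty = accuracy()
print(f"clean={acc_clean:.2f} faulty={acc_faulty:.2f}")
```

Because each prediction aggregates many independent table lookups, flipping a small fraction of stored bits shaves only a little off each class score, which is the intuition behind the resilience result; the real experiments in the paper are of course far more involved.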
Funding Records
Funding Reference | Funding Entity
#2326894 | National Science Foundation (NSF)
UIDB/50008/2020 | Fundação para a Ciência e a Tecnologia
DSAIPA/AI/0122/2020 | Fundação para a Ciência e a Tecnologia
UIDP/04466/2020 | Fundação para a Ciência e a Tecnologia
UIDB/04466/2020 | Fundação para a Ciência e a Tecnologia
#C645463824-00000063 | European Commission (Comissão Europeia)
#2326895 | National Science Foundation (NSF)