Publication in conference proceedings
Dendrite-inspired computing to improve resilience of neural networks to faults in emerging memory technologies
Lizy K. John (John, L. K.); Felipe Maia Galvão França (França, F. M. G.); Subhashish Mitra (Mitra, S.); Zachary Susskind (Susskind, Z.); Priscila M. V. Lima (Lima, P. M. V.); Igor D. S. Miranda (Miranda, I. D. S.); Eugene B. John (John, E. B.); Diego L. C. Dutra (Dutra, D. L. C.); Maurício Breternitz (Breternitz Jr., M.); et al.
2023 IEEE International Conference on Rebooting Computing (ICRC)
Year (definitive publication)
2023
Language
English
Country
United States of America
More Information
Web of Science®

This publication is not indexed in the Web of Science®

Scopus

Number of citations: 0

(Last checked: 2024-04-23 13:21)

View the record in Scopus

Google Scholar

Number of citations: 0

(Last checked: 2024-04-27 08:50)

View the record in Google Scholar

Abstract
Mimicking biological neurons by focusing on the excitatory/inhibitory decoding performed by dendritic trees offers an intriguing alternative to the traditional integrate-and-fire McCulloch-Pitts neuron stylization. Weightless Neural Networks (WNNs), which rely on value lookups from tables, emulate the integration process in dendrites and have demonstrated notable advantages in terms of energy efficiency. In this paper, we delve into the WNN paradigm from the perspective of reliability and fault tolerance. Through a series of fault injection experiments, we illustrate that WNNs exhibit remarkable resilience to both transient (soft) errors and permanent faults. Notably, WNN models experience minimal deterioration in accuracy even when subjected to fault rates of up to 5%. This resilience makes them well-suited for implementation in emerging memory technologies for binary or multiple bits-per-cell storage with reduced reliance on memory block-level error resilience features. By offering a novel perspective on neural network modeling and highlighting the robustness of WNNs, this research contributes to the broader understanding of fault tolerance in neural networks, particularly in the context of emerging memory technologies.
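To make the lookup-based mechanism and the fault-injection setting concrete, the following is a minimal illustrative sketch in Python of a WiSARD-style weightless discriminator, not the authors' experimental framework. The class and method names (Discriminator, inject_faults), the toy parameters, and the simplification of flipping only written table entries are assumptions made for illustration.

```python
import random


class Discriminator:
    """Minimal WiSARD-style weightless discriminator: each RAM node is a
    lookup table indexed by a tuple of input bits; training writes 1s and
    inference counts how many RAM nodes respond with a stored 1."""

    def __init__(self, input_bits, tuple_size, seed=0):
        rng = random.Random(seed)
        order = list(range(input_bits))
        rng.shuffle(order)
        # Pseudo-random mapping of input bit positions onto RAM nodes.
        self.mappings = [order[i:i + tuple_size]
                         for i in range(0, input_bits, tuple_size)]
        self.rams = [dict() for _ in self.mappings]  # address -> stored bit

    def _addresses(self, x):
        for mapping, ram in zip(self.mappings, self.rams):
            addr = 0
            for pos in mapping:
                addr = (addr << 1) | x[pos]
            yield addr, ram

    def train(self, x):
        for addr, ram in self._addresses(x):
            ram[addr] = 1

    def score(self, x):
        return sum(ram.get(addr, 0) for addr, ram in self._addresses(x))

    def inject_faults(self, rate, seed=1):
        # Emulate memory faults by flipping each *written* table entry
        # independently with probability `rate` (a simplification: unwritten
        # cells, which default to 0, are left untouched here).
        rng = random.Random(seed)
        for ram in self.rams:
            for addr in list(ram):
                if rng.random() < rate:
                    ram[addr] ^= 1


if __name__ == "__main__":
    rng = random.Random(42)
    pattern = [rng.randint(0, 1) for _ in range(64)]

    d = Discriminator(input_bits=64, tuple_size=8)
    d.train(pattern)
    print("clean score :", d.score(pattern))   # all 8 RAM nodes respond
    d.inject_faults(rate=0.05)
    print("faulty score:", d.score(pattern))   # usually still close to 8
```

One way to read the graceful degradation reported in the abstract through this toy model: a 5% fault rate flips only a small fraction of the lookup entries that any given input actually addresses, so the aggregate response of a discriminator, and hence the class decision, is usually preserved.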
Acknowledgements
--
Keywords
Funding records
Funding reference    Funding entity
#2326894 National Science Foundation (NSF)
UIDB/50008/2020 Fundação para a Ciência e a Tecnologia
DSAIPA/AI/0122/2020 Fundação para a Ciência e a Tecnologia
UIDP/04466/2020 Fundação para a Ciência e a Tecnologia
UIDB/04466/2020 Fundação para a Ciência e a Tecnologia
#C645463824-00000063 European Commission
#2326895 National Science Foundation (NSF)