Communication in a scientific event
Weightless Neural Networks - a lightweight approach for efficient Machine Learning
Maurício Breternitz (M.Breternitz); Felipe Maia Galvão França (Felipe Franca); Priscila Lima (Priscila Lima)
Event Title
Seminar Series - CMM Center for Mathematical Morphology- Paris Tech
Year (final publication)
2022
Language
English
Country
France
More Information
Web of Science®

This publication is not indexed in the Web of Science®

Scopus

This publication is not indexed in Scopus

Google Scholar

Number of citations: 0

(Last checked: 2025-12-05 17:55)


Overton

This publication is not indexed in Overton

Abstract
Weightless neural networks (WNNs) are a class of neural models that use random-access memory (RAM) lookups to determine neuron activation, in place of the weights and dot products common in modern deep learning approaches. This makes WNNs attractive for applications in which the computational load (and energy budget) is constrained by device size, cost, or battery lifetime. Furthermore, WNNs have a large Vapnik–Chervonenkis (VC) dimension, indicating that the reduced cost does not imply a reduction in classification ability. The VC dimension measures the complexity of the knowledge represented by a set of functions that can be encoded by a binary classification algorithm. Previous work demonstrates that the VC dimension of WiSARD is very large, indicating a large capacity for discrimination at reduced resource cost.
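To make the RAM-lookup idea concrete, a WiSARD-style discriminator can be sketched in a few lines: each RAM node observes a randomly chosen tuple of input bits, training writes a 1 at the addressed location, and classification counts how many RAM nodes recognize the address presented to them. The sketch below is illustrative only; class and parameter names are assumptions, not taken from the publication.

```python
# Minimal sketch of one WiSARD-style weightless discriminator.
# Names (Discriminator, tuple_size, ...) are illustrative assumptions.
import random

class Discriminator:
    """A bank of RAM nodes, each addressed by a random tuple of input bits."""

    def __init__(self, input_bits, tuple_size, seed=0):
        rng = random.Random(seed)
        order = list(range(input_bits))
        rng.shuffle(order)  # random mapping of input bits to RAM nodes
        self.tuples = [order[i:i + tuple_size]
                       for i in range(0, input_bits, tuple_size)]
        # Each RAM is modeled as the set of addresses holding a 1.
        self.rams = [set() for _ in self.tuples]

    def _addresses(self, x):
        # Concatenate each tuple's bits of x into an integer RAM address.
        for bits in self.tuples:
            yield sum(x[b] << i for i, b in enumerate(bits))

    def train(self, x):
        # Write a 1 at the addressed location of every RAM node.
        for ram, addr in zip(self.rams, self._addresses(x)):
            ram.add(addr)

    def score(self, x):
        # Activation = number of RAM nodes that recognize their address;
        # no weights or dot products are involved.
        return sum(addr in ram
                   for ram, addr in zip(self.rams, self._addresses(x)))
```

In a full WiSARD classifier, one such discriminator is trained per class, and an input is assigned to the class whose discriminator produces the highest score.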
Acknowledgements
--
Keywords
neural networks, random access memory
  • Computer and Information Sciences - Natural Sciences