Weightless Neural Networks - a lightweight approach for efficient Machine Learning
Event Title
Seminar Series - CMM (Center for Mathematical Morphology), ParisTech
Year (definitive publication)
2022
Language
English
Country
France
Abstract
Weightless neural networks (WNNs) are a class of neural models that use random access memory (RAM) lookups to determine neuron activation, as opposed to the weights and dot products commonly used in modern deep learning approaches. This makes WNNs attractive for applications in which the computational load (and energy budget) is constrained by device size, cost, or battery lifetime. Furthermore, WNNs have a large Vapnik–Chervonenkis (VC) dimension, indicating that their lower cost does not imply a reduction in classification ability. The VC dimension measures the complexity of the knowledge representable by the set of functions that can be encoded by a binary classification algorithm. Previous work demonstrates that the VC dimension of WiSARD is very large, indicating a high capacity for discrimination at reduced resource cost.
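The RAM-based activation mechanism described above can be illustrated with a minimal WiSARD-style sketch: each class gets a discriminator made of RAM nodes, each node is addressed by a fixed random tuple of input bits, training writes 1s at the observed addresses, and classification picks the class whose discriminator matches the most addresses. This is a simplified illustration under assumed names (`Discriminator`, `WiSARD`), not the API of any particular WiSARD library.

```python
# Minimal WiSARD-style weightless neural network sketch.
# Assumes binary input vectors; RAM nodes are modeled as sets of seen addresses.
import random


class Discriminator:
    """One per class: a bank of RAM nodes addressed by tuples of input bits."""

    def __init__(self, input_size, tuple_size, mapping):
        self.tuple_size = tuple_size
        self.mapping = mapping  # fixed random permutation of input bit positions
        self.rams = [set() for _ in range(input_size // tuple_size)]

    def addresses(self, bits):
        # Permute the input bits, then slice them into tuple_size-bit addresses.
        permuted = [bits[i] for i in self.mapping]
        return [
            tuple(permuted[i * self.tuple_size:(i + 1) * self.tuple_size])
            for i in range(len(self.rams))
        ]

    def train(self, bits):
        # "Write a 1" at each RAM address produced by this training pattern.
        for ram, addr in zip(self.rams, self.addresses(bits)):
            ram.add(addr)

    def score(self, bits):
        # Count RAM nodes whose address was seen during training.
        return sum(addr in ram for ram, addr in zip(self.rams, self.addresses(bits)))


class WiSARD:
    def __init__(self, input_size, tuple_size, classes, seed=0):
        rng = random.Random(seed)
        self.discriminators = {}
        for c in classes:
            mapping = list(range(input_size))
            rng.shuffle(mapping)
            self.discriminators[c] = Discriminator(input_size, tuple_size, mapping)

    def train(self, bits, label):
        self.discriminators[label].train(bits)

    def classify(self, bits):
        # Predict the class whose discriminator fires the most RAM nodes.
        return max(self.discriminators, key=lambda c: self.discriminators[c].score(bits))
```

Note that training involves no gradient computation at all, only memory writes, which is the source of the efficiency claims in the abstract; a trained pattern such as `[1,1,1,1,0,0,0,0]` is recovered by the discriminator that stored its addresses.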
Acknowledgements
--
Keywords
neural networks, random access memory
Fields of Science and Technology Classification
- Computer and Information Sciences - Natural Sciences