Scientific journal paper Q1
Memory-efficient DRASiW models
Otávio Napoli (Napoli, O. O.); Ana de Almeida (Almeida, A. M. de.); Edson Borin (Borin, E.); Maurício Breternitz (Breternitz Jr., M.);
Journal Title
Neurocomputing
Year (definitive publication)
2024
Language
English
Country
United Kingdom
More Information
Web of Science®: Times Cited: 0 (Last checked: 2024-10-16 21:36)
Scopus: Times Cited: 0 (Last checked: 2024-10-14 03:32)
Google Scholar: Times Cited: 0 (Last checked: 2024-10-12 22:46)

Abstract
Weightless Neural Networks (WNNs) are well suited to Federated Learning thanks to their robustness and computational efficiency. Such scenarios require models with a small memory footprint and the ability to aggregate knowledge from multiple models. In this work, we demonstrate the effectiveness of using Bloom filter variations to implement DRASiW models (an adaptation of WNNs that records both the presence and the frequency of patterns) with minimal memory usage. Across various datasets, DRASiW models show performance competitive with Random Forest, k-Nearest Neighbors, Multi-layer Perceptron, and Support Vector Machine models, at an acceptable space trade-off. Furthermore, our findings indicate that Bloom filter variations such as the Count-Min Sketch can reduce the memory footprint of DRASiW models by up to 27% while maintaining performance and enabling distributed and federated learning strategies.
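As a rough illustration of the idea summarized in the abstract, a Count-Min Sketch stores approximate frequencies of observed patterns in a fixed-size table, which is what lets a DRASiW-style RAM node track pattern counts without a full lookup table. The class, parameters, and hashing scheme below are illustrative assumptions, not the paper's actual implementation:

```python
import hashlib


class CountMinSketch:
    """Fixed-memory frequency sketch: approximate counts, never underestimates."""

    def __init__(self, width=64, depth=4):
        self.width = width
        self.depth = depth
        # depth rows of width counters; total memory is fixed up front.
        self.table = [[0] * width for _ in range(depth)]

    def _indexes(self, item):
        # One independent hash per row (seeded with the row index).
        for row in range(self.depth):
            h = hashlib.sha256(f"{row}:{item}".encode()).hexdigest()
            yield row, int(h, 16) % self.width

    def add(self, item):
        for row, col in self._indexes(item):
            self.table[row][col] += 1

    def count(self, item):
        # Taking the minimum across rows limits overcounting from collisions.
        return min(self.table[row][col] for row, col in self._indexes(item))


# A DRASiW-style RAM node could record input tuples like this, keeping
# frequency information (not just presence) in bounded memory:
sketch = CountMinSketch()
for pattern in ["0101", "0101", "1100"]:
    sketch.add(pattern)
```

Because the sketch only ever overestimates, queries for stored patterns return at least their true frequency, while memory stays at `width * depth` counters regardless of how many patterns are recorded.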
Acknowledgements
This work was partially supported by Fundação para a Ciência e a Tecnologia, I.P. (FCT) [ISTAR Projects: UIDB/04466/2020 and UIDP/04466/2020] and DSAIPA/AI/0122/2020 Aim Health. The authors would also like to thank CNPq (grants 404087/2021-3 and 315399/2023-6).
Keywords
Bloom filters, DRASiW, Knowledge aggregation, Weightless neural network
  • Computer and Information Sciences - Natural Sciences
Funding Records
Funding Reference / Funding Entity
UIDP/04466/2020 — Fundação para a Ciência e a Tecnologia
2013/08293-7 — FAPESP
404087/2021-3 — CNPq
DSAIPA/AI/0122/2020 — Fundação para a Ciência e a Tecnologia
UIDB/04466/2020 — Fundação para a Ciência e a Tecnologia
315399/2023-6 — CNPq