Export Publication
The publication can be exported in the following formats: APA (American Psychological Association) reference, IEEE (Institute of Electrical and Electronics Engineers) reference, BibTeX, and RIS. The examples below show the same publication exported in each format.
APA:
Napoli, O. O., Almeida, A. M. de., Borin, E., & Breternitz Jr., M. (2024). Memory-efficient DRASiW models. Neurocomputing, 610.
IEEE:
O. O. Napoli et al., "Memory-efficient DRASiW models," Neurocomputing, vol. 610, 2024.
BibTeX:
@article{napoli2024_1731881898868,
  author  = "Napoli, O. O. and Almeida, A. M. de. and Borin, E. and Breternitz Jr., M.",
  title   = "Memory-efficient DRASiW models",
  journal = "Neurocomputing",
  year    = "2024",
  volume  = "610",
  number  = "",
  doi     = "10.1016/j.neucom.2024.128443",
  url     = "https://www.sciencedirect.com/journal/neurocomputing"
}
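For reference, here is a minimal sketch of how the exported BibTeX could be consumed programmatically. It assumes the export was saved under a hypothetical name (publication.bib) and that the third-party bibtexparser package (1.x) is installed (pip install bibtexparser):

import bibtexparser

# Parse the exported file into a BibDatabase (filename is an assumption).
with open("publication.bib", encoding="utf-8") as f:
    database = bibtexparser.load(f)

entry = database.entries[0]  # each entry is a dict of lowercase field names
print(entry["title"])        # Memory-efficient DRASiW models
print(entry["doi"])          # 10.1016/j.neucom.2024.128443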
RIS:
TY  - JOUR
TI  - Memory-efficient DRASiW models
T2  - Neurocomputing
VL  - 610
AU  - Napoli, O. O.
AU  - Almeida, A. M. de.
AU  - Borin, E.
AU  - Breternitz Jr., M.
PY  - 2024
SN  - 0925-2312
DO  - 10.1016/j.neucom.2024.128443
UR  - https://www.sciencedirect.com/journal/neurocomputing
AB  - Weightless Neural Networks (WNN) are ideal for Federated Learning due to their robustness and computational efficiency. These scenarios require models with a small memory footprint and the ability to aggregate knowledge from multiple models. In this work, we demonstrate the effectiveness of using Bloom filter variations to implement DRASiW models—an adaptation of WNN that records both the presence and frequency of patterns—with minimized memory usage. Across various datasets, DRASiW models show competitive performance compared to models like Random Forest, k-Nearest Neighbors, Multi-layer Perceptron, and Support Vector Machines, with an acceptable space trade-off. Furthermore, our findings indicate that Bloom filter variations, such as Count Min Sketch, can reduce the memory footprint of DRASiW models by up to 27% while maintaining performance and enabling distributed and federated learning strategies.
ER  -
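Similarly, the RIS export can be read with a few lines of standalone code. The sketch below parses the tag/value lines directly (each line has the form "TAG  - value", and repeated tags such as AU accumulate); the file name publication.ris is an assumption:

import re

# RIS tag lines: two uppercase characters, two spaces, a hyphen, then the value.
TAG_LINE = re.compile(r"^([A-Z][A-Z0-9])  - ?(.*)$")

def parse_ris(text: str) -> dict:
    record = {}
    for line in text.splitlines():
        match = TAG_LINE.match(line)
        if not match:
            continue  # this sketch skips wrapped continuation lines
        tag, value = match.group(1), match.group(2).strip()
        if tag == "ER":  # end-of-record marker
            break
        record.setdefault(tag, []).append(value)
    return record

with open("publication.ris", encoding="utf-8") as f:  # hypothetical filename
    record = parse_ris(f.read())

print(record["TI"][0])  # Memory-efficient DRASiW models
print(record["AU"])     # all four authors, in order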