Export Publication

The publication can be exported in the following formats: APA (American Psychological Association) reference, IEEE (Institute of Electrical and Electronics Engineers) reference, BibTeX, and RIS.

Export Reference (APA)
Napoli, O. O., Almeida, A. M. de., Dias, J. M. S., Rosário, L. B., Borin, E., & Breternitz Jr, M. (2023). Efficient knowledge aggregation methods for weightless neural networks. In Proceedings of the 31st European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2023) (pp. 369-374). Bruges, Belgium: ESANN.
Export Reference (IEEE)
O. O. Napoli et al., "Efficient knowledge aggregation methods for weightless neural networks," in Proc. of the 31st European Symp. on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2023), Bruges, Belgium, 2023, pp. 369-374.
Export BibTeX
@inproceedings{napoli2023_1715458410760,
	author = "Napoli, O. O. and Almeida, A. M. de. and Dias, J. M. S. and Rosário, L. B. and Borin, E. and Breternitz Jr, M.",
	title = "Efficient knowledge aggregation methods for weightless neural networks",
	booktitle = "Proceedings of the 31st European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2023)",
	year = "2023",
	doi = "10.14428/esann/2023.ES2023-123",
	pages = "369-374",
	publisher = "ESANN",
	address = "Bruges, Belgium",
	url = "https://www.esann.org/proceedings/2023"
}
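As an illustration, the exported BibTeX record can be read back programmatically. The following is a minimal, stdlib-only sketch (not a full BibTeX parser) that assumes double-quoted field values, as in the entry above; the embedded entry text is abridged:

```python
import re

# Abridged copy of the exported entry above (double-quoted values assumed).
BIBTEX_ENTRY = '''@inproceedings{napoli2023_1715458410760,
	author = "Napoli, O. O. and Almeida, A. M. de. and Borin, E.",
	title = "Efficient knowledge aggregation methods for weightless neural networks",
	year = "2023",
	doi = "10.14428/esann/2023.ES2023-123",
	pages = "369-374"
}'''

def parse_bibtex(entry: str) -> dict:
    """Extract entry type, citation key, and double-quoted fields."""
    fields = dict(re.findall(r'(\w+)\s*=\s*"([^"]*)"', entry))
    header = re.match(r'@(\w+)\{([^,]+),', entry)
    fields["_type"], fields["_key"] = header.group(1), header.group(2)
    return fields

record = parse_bibtex(BIBTEX_ENTRY)
```

After parsing, `record["doi"]` holds the DOI string and `record["_key"]` the citation key, which is convenient for deduplicating exports.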
Export RIS
TY  - CPAPER
TI  - Efficient knowledge aggregation methods for weightless neural networks
T2  - Proceedings of the 31st European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2023)
AU  - Napoli, O. O.
AU  - Almeida, A. M. de.
AU  - Dias, J. M. S.
AU  - Rosário, L. B.
AU  - Borin, E.
AU  - Breternitz Jr, M.
PY  - 2023
SP  - 369-374
DO  - 10.14428/esann/2023.ES2023-123
CY  - Bruges, Belgium
UR  - https://www.esann.org/proceedings/2023
AB  - Weightless Neural Networks (WNN) are good candidates for Federated Learning scenarios due to their robustness and computational lightness. In this work, we show that it is possible to aggregate the knowledge of multiple WNNs using more compact data structures, such as Bloom Filters, to reduce the amount of data transferred between devices. Finally, we explore variations of Bloom Filters and find that a particular data structure, the Count-Min Sketch (CMS), is a good candidate for aggregation. At a cost of at most 3% in accuracy, CMS can be up to 3x smaller than previous approaches, especially for large datasets.
ER  -
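The RIS record follows a simple line-oriented layout: each line is a two-letter tag, two spaces, a hyphen, a space, and a value, with `ER` closing the record. A minimal stdlib-only sketch of reading such a record (the embedded text below is an abridged copy of the export above; repeated tags like `AU` are collected into lists):

```python
# Abridged copy of the exported RIS record above.
RIS = """TY  - CPAPER
TI  - Efficient knowledge aggregation methods for weightless neural networks
AU  - Napoli, O. O.
AU  - Borin, E.
PY  - 2023
SP  - 369-374
ER  - """

def parse_ris(text: str) -> dict:
    """Parse 'TAG  - value' lines; repeated tags accumulate into lists."""
    record = {}
    for line in text.splitlines():
        # A valid RIS line has the separator '  - ' at positions 2..5.
        if len(line) >= 6 and line[2:6] == "  - ":
            tag, value = line[:2], line[6:].strip()
            record.setdefault(tag, []).append(value)
    return record

ris_record = parse_ris(RIS)
```

Collecting every tag into a list keeps multi-valued fields such as authors (`AU`) intact, at the cost of single-valued fields like `PY` also being one-element lists.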