Export Publication

The publication can be exported in the following formats: APA (American Psychological Association) reference, IEEE (Institute of Electrical and Electronics Engineers) reference, BibTeX, and RIS.

Export Reference (APA)
Susskind, Z., Arora, A., Bacellar, A., Dutra, D. L. C., Miranda, I. D. S., Breternitz Jr., M., ... John, L. K. (2023). An FPGA-based weightless neural network for edge network intrusion detection. In P. Ienne & Z. Zhang (Eds.), FPGA '23: Proceedings of the 2023 ACM/SIGDA International Symposium on Field Programmable Gate Arrays. Monterey, CA, USA: Association for Computing Machinery.
Export Reference (IEEE)
Z. Susskind et al., "An FPGA-based weightless neural network for edge network intrusion detection," in FPGA '23: Proc. of the 2023 ACM/SIGDA Int. Symp. on Field Programmable Gate Arrays, P. Ienne and Z. Zhang, Eds., Monterey, CA, USA: Association for Computing Machinery, 2023.
Export BibTeX
@inproceedings{susskind2023_1732209788600,
	author = "Susskind, Z. and Arora, A. and Bacellar, A. and Dutra, D. L. C. and Miranda, I. D. S. and Breternitz Jr., M. and Lima, P. M. V. and França, F. M. G. and John, L. K.",
	title = "An FPGA-based weightless neural network for edge network intrusion detection",
	booktitle = "FPGA '23: Proceedings of the 2023 ACM/SIGDA International Symposium on Field Programmable Gate Arrays",
	year = "2023",
	editor = "Ienne, P. and Zhang, Z.",
	doi = "10.1145/3543622.3573140",
	publisher = "Association for Computing Machinery",
	address = "Monterey, CA, USA",
	organization = "Association for Computing Machinery",
	url = "https://dl.acm.org/doi/proceedings/10.1145/3543622"
}
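If you want to consume the exported BibTeX entry programmatically, the quoted fields can be pulled out with a short standard-library sketch. This is a minimal illustration, not a full BibTeX parser: it assumes double-quoted field values with no nested quotes or braces, as this exporter emits, and the abridged entry string is a hypothetical excerpt of the record above.

```python
import re

# Abridged (hypothetical) excerpt of the exported entry above.
BIBTEX_ENTRY = '''@inproceedings{susskind2023_1732209788600,
	title = "An FPGA-based weightless neural network for edge network intrusion detection",
	year = "2023",
	doi = "10.1145/3543622.3573140",
	publisher = "Association for Computing Machinery"
}'''

def parse_bibtex(entry: str) -> dict:
    """Extract the entry type, citation key, and double-quoted fields."""
    # Entry type and citation key, e.g. @inproceedings{susskind2023_...,
    head = re.match(r'@(\w+)\{([^,]+),', entry)
    fields = {"type": head.group(1), "key": head.group(2)}
    # field = "value" pairs; assumes values contain no embedded quotes.
    for name, value in re.findall(r'(\w+)\s*=\s*"([^"]*)"', entry):
        fields[name] = value
    return fields

rec = parse_bibtex(BIBTEX_ENTRY)
print(rec["doi"])  # 10.1145/3543622.3573140
```

For real-world use, a dedicated BibTeX library handles brace-delimited values and name lists more robustly than this regex sketch.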
Export RIS
TY  - CPAPER
TI  - An FPGA-based weightless neural network for edge network intrusion detection
T2  - FPGA '23: Proceedings of the 2023 ACM/SIGDA International Symposium on Field Programmable Gate Arrays
AU  - Susskind, Z.
AU  - Arora, A.
AU  - Bacellar, A.
AU  - Dutra, D. L. C.
AU  - Miranda, I. D. S.
AU  - Breternitz Jr., M.
AU  - Lima, P. M. V.
AU  - França, F. M. G.
AU  - John, L. K.
PY  - 2023
DO  - 10.1145/3543622.3573140
CY  - Monterey, CA, USA
UR  - https://dl.acm.org/doi/proceedings/10.1145/3543622
AB  - The last decade has seen an explosion in the number of networked edge and Internet-of-Things (IoT) devices, a trend which shows no signs of slowing. Concurrently, networking is increasingly moving away from centralized cloud servers and towards base stations and the edge devices themselves, with the objective of decreasing latency and improving the user experience. ASICs typically lack the flexibility needed to update algorithms or adapt to specific user scenarios, which is becoming increasingly important with the emergence of 6G. While FPGAs have the potential to address these issues, their inferior energy and area efficiency and high unit cost mean that FPGA-based designs must be very aggressively optimized to be viable. In this paper, we propose FWIW, a novel FPGA-based solution for detecting anomalous or malicious network traffic on edge devices. While prior work in this domain is based on conventional deep neural networks (DNNs), FWIW incorporates a weightless neural network (WNN), a table lookup-based model which learns sophisticated nonlinear behaviors. This allows FWIW to achieve accuracy far superior to prior FPGA-based work at a very small fraction of the model footprint, enabling deployment on edge devices. FWIW achieves a prediction accuracy of 98.5% on the UNSW-NB15 dataset with a total model parameter size of just 192 bytes, reducing error by 7.9x and model size by 262x vs. the prior work. Implemented on a Xilinx Virtex UltraScale+ FPGA, FWIW demonstrates a 59x reduction in LUT usage with a 1.6x increase in throughput. The accuracy of FWIW comes within 0.6% of the best-reported result in literature, a model several orders of magnitude larger. Our results make it clear that WNNs are worth exploring in the emerging domain of edge networking, and suggest that FPGAs are capable of providing the extreme throughput needed.
ER  -
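The RIS record is line-oriented and easy to read without any third-party library. Below is a minimal sketch assuming the common "TAG  - value" layout (two-letter tag, two spaces, hyphen, space) used by this export; `RIS_SAMPLE` is a hypothetical abridged copy of the record above, and `parse_ris` is an illustrative helper, not part of any standard library API.

```python
# Abridged (hypothetical) excerpt of the exported RIS record above.
RIS_SAMPLE = """TY  - CPAPER
TI  - An FPGA-based weightless neural network for edge network intrusion detection
AU  - Susskind, Z.
AU  - Arora, A.
PY  - 2023
DO  - 10.1145/3543622.3573140
ER  - """

def parse_ris(text: str) -> dict:
    """Collect RIS tags into lists, since tags such as AU may repeat."""
    record = {}
    for line in text.splitlines():
        # Each RIS line is a two-letter tag, two spaces, a hyphen, a space.
        if len(line) < 6 or line[2:6] != "  - ":
            continue
        tag, value = line[:2], line[6:].strip()
        if value:  # skip the empty ER terminator value
            record.setdefault(tag, []).append(value)
    return record

rec = parse_ris(RIS_SAMPLE)
print(rec["AU"])  # ['Susskind, Z.', 'Arora, A.']
```

Reference managers such as Zotero and EndNote import this format directly, so the sketch is only needed when post-processing exports in your own scripts.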