Talk
Efficiency and Scalability of Multi-Lane Capsule Networks (MLCN)
Maurício Breternitz (M.Breternitz);
Event Title
CIÊNCIA 2019 Encontro com a Ciência e Tecnologia em Portugal
Year (definitive publication)
2019
Language
Portuguese
Country
Portugal
More Information
Web of Science®
This publication is not indexed in Web of Science®
Scopus
Times Cited: 4 (last checked: 2022-02-17 09:54)
Google Scholar
This publication is not indexed in Google Scholar
Overton
This publication is not indexed in Overton
Abstract
Some Deep Neural Networks (DNNs) have what we call lanes, or can be reorganized to have them. Lanes are data-independent paths through the network that typically learn different features or add resilience to the network. Given their data independence, lanes are amenable to parallel processing. The Multi-lane CapsNet (MLCN) is a proposed reorganization of the Capsule Network that has been shown to achieve better accuracy while introducing highly parallel lanes. However, the efficiency and scalability of MLCN had not been systematically examined. In this work, we study the MLCN network on multiple GPUs, finding that it is 2x more efficient than the original CapsNet when using model parallelism. Further, we present the load-balancing problem of distributing heterogeneous lanes across homogeneous or heterogeneous accelerators and show that a simple greedy heuristic can be almost 50% faster than a naïve random approach.
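As a rough illustration of the load-balancing step described in the abstract, the Python sketch below assigns lanes with heterogeneous estimated costs to accelerators of different speeds using a longest-lane-first greedy rule and compares the resulting makespan against a naïve random placement. The lane costs, relative device speeds, and function names are hypothetical assumptions for illustration; this is a minimal sketch of the general technique, not the authors' actual implementation.

import random

def greedy_assign(lane_costs, device_speeds):
    """Assign lanes (estimated compute cost) to devices (relative speed),
    placing the next-largest lane on the device that would finish earliest.
    This is a longest-processing-time style greedy heuristic (assumed here)."""
    loads = [0.0] * len(device_speeds)   # current finish time per device
    assignment = {}
    # Consider the most expensive lanes first.
    for lane, cost in sorted(enumerate(lane_costs), key=lambda x: -x[1]):
        # Pick the device whose finish time grows the least.
        best = min(range(len(device_speeds)),
                   key=lambda d: loads[d] + cost / device_speeds[d])
        loads[best] += cost / device_speeds[best]
        assignment[lane] = best
    return assignment, max(loads)        # makespan = slowest device

def random_assign(lane_costs, device_speeds, seed=0):
    """Naive baseline: place each lane on a uniformly random device."""
    rng = random.Random(seed)
    loads = [0.0] * len(device_speeds)
    for cost in lane_costs:
        d = rng.randrange(len(device_speeds))
        loads[d] += cost / device_speeds[d]
    return max(loads)

if __name__ == "__main__":
    # Hypothetical per-lane costs (e.g., relative FLOPs) and two unequal GPUs.
    lanes = [9.0, 7.5, 7.0, 4.0, 3.5, 2.0, 1.5, 1.0]
    gpus = [1.0, 0.6]                    # relative throughput
    _, greedy_makespan = greedy_assign(lanes, gpus)
    print("greedy makespan:", greedy_makespan)
    print("random makespan:", random_assign(lanes, gpus))

Running the example shows the greedy placement finishing noticeably earlier than the random one, which mirrors the kind of gap the abstract reports, though the actual numbers depend entirely on the assumed lane costs and device speeds.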
Acknowledgements
--
Keywords
  • Computer and Information Sciences - Natural Sciences